
PostgreSQL: Backup a large database to disk

I really like database dumps with complete column inserts. You can read them, browse them, search for specific data, and even manipulate them. The simplest way to create such a readable backup is:

pg_dump --column-inserts --inserts --if-exists --clean --create -f FILENAME.sql DATABASENAME
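Such a plain SQL dump can be restored with psql. A minimal sketch, assuming the dump was created with --clean and --create as above, so it drops and recreates DATABASENAME itself and you connect to a maintenance database such as postgres:

psql -f FILENAME.sql postgres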

There is one drawback: the files are large compared to the actual information stored, and both import and export are rather slow. But there is a backup format that is both compressed and fast. It’s called “directory format”:

pg_dump -Fd -f DIRECTORYNAME -d DATABASENAME
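For large databases the directory format has another advantage: it is the only archive format that supports parallel dumps. A minimal sketch, assuming four parallel jobs (adjust -j to the number of CPU cores and the load your server can take):

pg_dump -Fd -j 4 -f DIRECTORYNAME -d DATABASENAME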

This creates a directory called DIRECTORYNAME and dumps the content of DATABASENAME in a compressed format.
Getting the data back into the database is done with pg_restore:

pg_restore -C -d DATABASENAME DIRECTORYNAME

The -C option creates the database before importing the data. Note that with -C, the database given via -d is only used for the initial connection and the CREATE DATABASE statement; the restored database takes its name from the dump, so DATABASENAME here should be an existing database you can connect to, for example postgres.
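pg_restore can also run the time-consuming parts of the restore in parallel, which helps a lot with large dumps. A minimal sketch, assuming four jobs and postgres as the existing maintenance database to connect to for the initial CREATE DATABASE:

pg_restore -C -j 4 -d postgres DIRECTORYNAME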
