
PostgreSQL: Back up a large database to disk

I really like database dumps with complete column inserts. You can read them, browse them, search for specific data and even manipulate them. The simplest way to create such a readable backup is:
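For example, as a minimal sketch, with DATABASENAME and the output file backup.sql as placeholders:

  # plain-text dump with one INSERT per row, listing all column names
  pg_dump --column-inserts --file=backup.sql DATABASENAME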

There is one drawback: the files are large compared to the actual information stored, and both import and export are rather slow. But there is a backup format that is both compressed and fast. It’s called “directory format”:
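Something along these lines, again with DIRECTORYNAME and DATABASENAME as placeholders:

  # compressed dump in directory format
  pg_dump --format=directory --file=DIRECTORYNAME DATABASENAME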

This creates a directory called DIRECTORYNAME and dumps the content of DATABASENAME in a compressed format.
Getting the data back into the database is done with pg_restore:
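A sketch of the restore; it connects to the stock postgres database first so that the dumped DATABASENAME can be created fresh:

  # -d names an existing database to connect to, -C then creates DATABASENAME
  pg_restore -C -d postgres DIRECTORYNAME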

The -C option creates the database before importing.


PostgreSQL: Copy a table structure and data

Once again a short note about a nifty PostgreSQL feature you might not be aware of. It’s quite simple to copy the structure of a table with:
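A minimal sketch, with oldtable and newtable as placeholder names:

  -- copies the column definitions plus defaults, constraints and indexes
  CREATE TABLE newtable (LIKE oldtable INCLUDING ALL);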

Now you can copy the data, with or without a filter:
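For example, sticking with the placeholder names from above (created_at is just a made-up column for the filter):

  -- copy everything
  INSERT INTO newtable SELECT * FROM oldtable;

  -- or only a filtered subset (created_at is an example column)
  INSERT INTO newtable SELECT * FROM oldtable
    WHERE created_at > now() - interval '7 days';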

Hope that helps.


PostgreSQL: Convert an hstore column to jsonb

I’m involved in a tech scouting project for which I designed a PostgreSQL database schema that uses an hstore column to store additional, a priori unspecified data with every record in that table. For some historical reason I decided on hstore instead of one of the much more versatile JSON column types.

Historical priorities change, and so they did in that project: now I wanted to use that column as JSONB. As always in SQL, a column is changed with an ALTER TABLE … ALTER COLUMN statement. In this case the conversion of the content is a bit more involved, but giving PostgreSQL a hint about what we expect does the job.

First let’s create a table to play with and insert one row of data:
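A minimal example; the table things and its columns are made up for this post, and the hstore extension has to be available:

  CREATE EXTENSION IF NOT EXISTS hstore;

  CREATE TABLE things (
    id         serial PRIMARY KEY,
    name       text,
    attributes hstore
  );

  INSERT INTO things (name, attributes)
    VALUES ('widget', 'color => red, shape => round');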

Let’s see if it looks OK:
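Assuming the example table from above:

  SELECT id, name, attributes FROM things;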

Yields:
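Something like this (psql output for the example row; hstore quotes keys and values):

   id |  name  |            attributes
  ----+--------+----------------------------------
    1 | widget | "color"=>"red", "shape"=>"round"
  (1 row)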

Fine. Next step is the magic column conversion:
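With the example table, the hint takes the form of a USING clause that casts the old values; the hstore-to-jsonb cast ships with the hstore extension:

  ALTER TABLE things
    ALTER COLUMN attributes TYPE jsonb
    USING attributes::jsonb;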

Another look:
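Same query as before against the example table:

  SELECT id, name, attributes FROM things;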

And we get:
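Roughly this, now formatted as jsonb:

   id |  name  |             attributes
  ----+--------+------------------------------------
    1 | widget | {"color": "red", "shape": "round"}
  (1 row)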

Mission accomplished.