
Junk in, junk out

When I was at school – yes, this was some time ago – I had a classmate who was at constant war with his French teacher, and the teacher with him. Once, at the end of a written exam, my classmate returned a piece of paper to the teacher's desk containing a long monologue about why he absolutely didn't feel like writing an exam. The whole text was written in a large spiral on the paper.

When the exams were handed back, my classmate got his paper back, marked with a grade of “6”, which in Germany is the worst grade you can get. The teacher's explanation of why he gave a 6 was written around the figure “6” … in a spiral.


One story I read on the internets says that a student of English literature got back his very, very bad exam with a remark from the teacher:

I return this otherwise good piece of writing paper to you, because someone has written gibberish all over it and put your name at the top.

I like this sense of humour!


PostgreSQL: Backup a large database to disk

I really like database dumps with complete column inserts. You can read them, browse them, search for specific data and even manipulate them. The simplest way to create such a readable backup is:

pg_dump --column-inserts --inserts --if-exists --clean --create -f FILENAME.sql DATABASENAME
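
Getting such a plain SQL dump back in is simply a matter of feeding it to psql. Since the dump carries its own CREATE DATABASE statement (thanks to --create), connect to an existing maintenance database such as postgres:

psql -d postgres -f FILENAME.sql   # the dump creates and connects to DATABASENAME itself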

There is one drawback: the files are large compared to the actual information stored, and import and export are rather slow. But there is a backup format that is both compressed and fast. It's called the “directory format”:

pg_dump -Fd -f DIRECTORYNAME -d DATABASENAME

This creates a directory called DIRECTORYNAME and dumps the content of DATABASENAME in a compressed format.
Getting the data back into the database is done with pg_restore:

pg_restore -C -d postgres DIRECTORYNAME

The -C option creates the database prior to importing. Note that with -C the database given via -d is only used for the initial connection; the database named in the dump is then created and restored into, so connect to an existing maintenance database such as postgres.
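
Because the directory format stores each table in its own file, both dump and restore can run in parallel with the -j option. A minimal sketch, assuming four jobs fit your machine:

pg_dump -Fd -j 4 -f DIRECTORYNAME -d DATABASENAME   # 4 parallel dump jobs; tune to your CPU count
pg_restore -j 4 -C -d postgres DIRECTORYNAME        # parallel restore of the same directory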


PostgreSQL: Copy a table structure and data

Once again a short note about a nifty PostgreSQL feature you might not be aware of. It's quite simple to copy the structure of a table:

CREATE TABLE table2 (LIKE table1 INCLUDING DEFAULTS INCLUDING CONSTRAINTS INCLUDING INDEXES);

Now you can copy data with or without some sort of filter:

INSERT INTO table2 SELECT * FROM table1 LIMIT 100;
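
The filter is just an ordinary WHERE clause on the SELECT. A small sketch, assuming a hypothetical created_at timestamp column on table1:

INSERT INTO table2 SELECT * FROM table1 WHERE created_at > now() - interval '7 days'; -- created_at is a hypothetical example column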

Hope that helps.


PostgreSQL: Convert an hstore column to jsonb

I'm involved in a tech scouting project, for which I designed a PostgreSQL database schema using an hstore column to store additional, a priori unspecified data for every record in that table. For some historical reason I decided to use an hstore column instead of one of the much more versatile JSON column types.

Historical priorities change, and so they did in that project. Now I wanted to use that column as JSONB. As always in SQL, columns are changed with an ALTER TABLE … ALTER COLUMN statement. In this case the conversion of the content is a bit more involved, but giving PostgreSQL a hint about what we expect does the job.

First let's create a table to play with and insert one row of data. The hstore type lives in an extension, so make sure it is installed in the database first:

CREATE EXTENSION IF NOT EXISTS hstore; -- the hstore type is provided by this extension
CREATE TABLE hstore_data (data hstore);
INSERT INTO hstore_data (data) VALUES ('key1 => "value1", key2 => "value2"');

Let's see if it looks OK:

SELECT * FROM hstore_data;

Yields:

                data                
------------------------------------
 "key1"=>"value1", "key2"=>"value2"
(1 row)

Fine. The next step is the magic column conversion:

ALTER TABLE hstore_data ALTER COLUMN data TYPE JSONB USING CAST(data AS JSONB);

Another look:

SELECT * FROM hstore_data;

And we get:

                 data                 
--------------------------------------
 {"key1": "value1", "key2": "value2"}
(1 row)

Mission accomplished.
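
As a bonus, the converted column now understands the jsonb operators; extracting a single key as text, for example:

SELECT data->>'key1' FROM hstore_data; -- returns value1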


Learning by examination

I'm a chemist by training. This means, at least in Germany, that I survived an endless number of oral exams. Most of them were boring – some for me, and most of them for the examiner. But some of them were outstanding, because of the atmosphere and the way the examiner conducted the discussion. All of them were led by older professors, none by one of their assistants.

What made them stand out was that the examination itself was a sort of conversation. Certainly not between equals, but fairly unbiased. And I learned something. By that I don't mean I missed a question and was reminded that I didn't know something, but that in the course of the discussion my examiner steered the conversation in such a way that I, by my own thinking, arrived at something I unconsciously already knew but hadn't recognised.

One example: in my diploma exam in organic chemistry I was asked whether I knew the predominant industrial chemical process scientifically examined in the first half of the 20th century in Germany. I knew many inorganic industrial processes invented around that time but didn't realise that the main interest of Germany's chemical elite was coal liquefaction. Then I was asked whether I knew one of the main products of direct coal liquefaction. It was toluene. Now the examiner asked me about the inorganic processes I remembered: these were the Haber-Bosch process, yielding ammonia from gaseous nitrogen and hydrogen, and the Ostwald process, burning ammonia to nitrogen oxides over a platinum-rhodium catalyst and absorbing them in water to get nitric acid. Well, OK, what does that all add up to? From coal you get toluene; from simple nitrogen out of the air you get nitric acid. What could a chemist do with that? Right: nitration of toluene to 2,4,6-trinitrotoluene, better known as TNT. So, seen end to end, coal liquefaction had nothing to do with preparing a liquid fuel. Instead it was a strategic process in times of war. Wow, I hadn't thought that far by combining the facts I already knew …

I came out of those exams richer in knowledge and with more self-confidence. Well, and usually with a good mark. What I didn't know until recently is that this form of deep conversation on complex topics is something I really enjoy, which has something to do with my psychological predisposition. But that is another story, to be told separately.