Migrate tables from one PostgreSQL database to another using the INFORMATION_SCHEMA

“I know that I know nothing” — Socrates

The request

Recently, a client asked us to create a database migration process between two PostgreSQL databases. This would have been a trivial task, but for the following restriction:

We didn’t have access to the source database and the client could not provide us with the schema due to their strict data management policy.

So, in essence, we had to create a script that would migrate a list of tables, both schema and data, from one database to another without specifying the exact schema of each table.

The implementation

Fortunately, all SQL databases have the INFORMATION_SCHEMA schema. We read in the PostgreSQL documentation:

The information schema consists of a set of views that contain information about the objects defined in the current database. The information schema is defined in the SQL standard and can therefore be expected to be portable and remain stable — unlike the system catalogs, which are specific to PostgreSQL and are modelled after implementation concerns.

For our needs, the data contained in the information schema was sufficient. All we have to do is get the columns and their properties for a specific table, which can be done by running the following SQL query.
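The original query isn't reproduced here; a query along the following lines retrieves each column's definition (the exact fields selected are my assumption, shown as a Python string with a psycopg2-style placeholder):

```python
# Column-definition query against the information schema.
# The fields selected here are an assumption, not the original query.
COLUMNS_QUERY = """
SELECT column_name, data_type, character_maximum_length,
       is_nullable, column_default
FROM information_schema.columns
WHERE table_name = %s          -- parameter: the table to describe
ORDER BY ordinal_position;
"""
```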

You can run the same query with SELECT * to get the full list of information fields available.

Running the query for a simple table as an example, we get the following results.
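The original result listing didn't survive here; for a hypothetical users table, the returned rows might look like this (table and values are illustrative only):

```python
# Hypothetical information_schema.columns rows for a "users" table, shaped as
# (column_name, data_type, character_maximum_length, is_nullable, column_default):
sample_columns = [
    ("id",    "integer",           None, "NO",  "nextval('users_id_seq'::regclass)"),
    ("name",  "character varying", 120,  "YES", None),
    ("email", "character varying", 254,  "NO",  None),
]
```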

Using the information above we can construct the DROP TABLE...CREATE TABLE... query that we will run in the destination database.
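A minimal sketch of that construction, assuming rows shaped like (column_name, data_type, character_maximum_length, is_nullable, column_default); the function name and exact DDL details are mine, not necessarily the script's:

```python
def build_create_table(table, columns):
    """Build DROP TABLE ... / CREATE TABLE ... DDL from
    information_schema.columns rows."""
    defs = []
    for name, dtype, maxlen, nullable, default in columns:
        col = f"{name} {dtype}"
        if maxlen:                      # e.g. character varying(120)
            col += f"({maxlen})"
        if default:
            col += f" DEFAULT {default}"
        if nullable == "NO":
            col += " NOT NULL"
        defs.append(col)
    return (f"DROP TABLE IF EXISTS {table};\n"
            f"CREATE TABLE {table} ({', '.join(defs)});")
```

For example, a two-column table would produce `DROP TABLE IF EXISTS users;` followed by `CREATE TABLE users (id integer NOT NULL, name character varying(120));`.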

Putting it all together, we created a simple Python script that you can find at the following link:

https://gist.github.com/Havoc24k/d646eb54925e8777a01c45b33e4a0f68

The script:
  1. Connects to the source and destination databases.
  2. Reads a list of table names from an array.
  3. Gets the schema data for each table from the PostgreSQL INFORMATION_SCHEMA, and the table data using a simple SELECT *... query.
  4. Dynamically creates and executes the DROP TABLE...CREATE TABLE... query.
  5. Dynamically creates and executes the INSERT query that will insert the existing table data to the newly created table in the destination database.
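Step 5 boils down to building a parameterized INSERT from the column names; a sketch along these lines (the helper name and placeholder style are assumptions, matching psycopg2's %s parameters):

```python
def build_insert_query(table, column_names):
    """Build a parameterized INSERT suitable for cursor.executemany(),
    so row values are passed as parameters rather than interpolated."""
    cols = ", ".join(column_names)
    placeholders = ", ".join(["%s"] * len(column_names))
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders});"

print(build_insert_query("users", ["id", "name"]))
# → INSERT INTO users (id, name) VALUES (%s, %s);
```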

Before you execute the script, you have to configure your database credentials and add the list of tables you want migrated. If anything goes wrong, the script prints an error and rolls back all the queries that have run against the destination database. There is also a -f flag that skips the per-table prompt asking the user to confirm the import.
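The rollback behavior follows the standard Python DB-API transaction pattern. Here is a self-contained demo using sqlite3 purely for illustration (the script itself targets PostgreSQL through its own driver, but the commit/rollback semantics are the same):

```python
import sqlite3

# Demonstrates the commit/rollback pattern: if any statement in the
# transaction fails, rollback() undoes everything run since the last commit.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER)")
conn.commit()

try:
    cur.execute("INSERT INTO t VALUES (1)")
    cur.execute("INSERT INTO missing_table VALUES (2)")  # raises OperationalError
    conn.commit()
except Exception:
    conn.rollback()  # undoes every query run in the failed transaction

count = cur.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # → 0: the first INSERT was rolled back too
```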

Hope this post helps others with similar needs. I am pretty sure this can be done with other tools or methods, but I find value in exploring the INFORMATION_SCHEMA; I consider it a hidden store of information that can be used in many different ways.