Postgres and integration testing in Node.js apps

Stefanos Gkalgkouranas
Published in Geoblink Tech blog
8 min read · May 6, 2021

This article explains one of the testing approaches we use at Geoblink for our integration tests involving Postgres. It includes a walkthrough of:

  1. spinning up a Postgres container
  2. building a simple Node.js API server
  3. writing integration tests for our server

Testing

Testing is one of the most crucial parts of software engineering and has a huge impact on keeping a system reliable and easy to extend. Here at Geoblink we treat testing with the attention it deserves. When I first joined Geoblink as a Junior Backend Engineer I was relieved to find a codebase that was thoroughly tested. This boosted my confidence in making changes to the code and helped me get a better grasp of the app’s components and their expected functionality. It was a decisive factor in convincing me of the importance of testing. So, here I am, writing about it and attempting to explain one of the approaches we use for integration testing, with a simple example later on.

Unit, integration and end-to-end tests all play their role in our services, but in this article we are going to focus on integration tests, and more specifically on those involving Postgres. In the core application’s backend, where the interaction with the databases happens, we rely heavily on integration tests. In an API server, this type of testing means making actual requests to your endpoints and querying the database to make sure that everything works as expected.

Like all things in software engineering, integration testing can be done in more than one way. I encourage you to read our previous article “Testing Postgres applications: one simple trick” as it gives additional information about the testing approach that we are going to use here.

Our approach to integration testing with Postgres

The approach we are going to use for our tests is based on the following steps:

  1. Make sure that there is exactly one connection from the application to the database
  2. Create a temporary table with the same name as the table that we want to test
  3. Insert fake data (optionally) and test your cases

After creating the temporary table, all queries will hit that table instead of the original. This is because of how Postgres resolves table names: it uses a search path, an ordered list of schemas that it goes through to determine which table an unqualified name refers to. Temporary tables live in the session’s pg_temp schema, which is implicitly searched first, so our queries will hit the temporary table instead of the original. We can take advantage of this in our tests by creating temporary tables and inserting only the data we need to cover our cases.

Keep in mind that this method won’t work if you specify the schema in your queries, like this:

SELECT * FROM my_schema.my_table
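Here is a quick illustration of the shadowing behaviour in plain SQL; the table and column names are made up for the demonstration:

-- A hypothetical table in the public schema
CREATE TABLE my_table (id int);
INSERT INTO my_table VALUES (1);

-- A temporary table with the same name shadows it for the rest of this session
CREATE TEMPORARY TABLE my_table (LIKE my_table INCLUDING ALL);

SELECT count(*) FROM my_table;        -- 0: the unqualified name resolves to the temporary table
SELECT count(*) FROM public.my_table; -- 1: a schema-qualified name bypasses the shadowing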

The typical flow of our testing process would be the following:

  1. Load the application and mock interfaces
  2. Create temporary tables
  3. Load fake data into the temporary tables (optional)
  4. Call endpoint
  5. Assert
  6. Drop temporary tables and clean up

Now, let’s start by creating our server and database, and then we will move on to the tests.

Initializing our app

We are going to build a very minimal API server that handles personal notes. Actually, in order to keep it short, we will only include one endpoint, POST /note, just to demonstrate how this testing approach works. Keep in mind that error handling, project structure, etc. are kept to a bare minimum.

Here is the stack we are going to use: Node.js with Express for the API server, Postgres running in Docker via docker-compose, pg and pg-pool to talk to the database, and Mocha, Chai and Supertest for the tests.

The files and structure will end up looking as simple as this:

pg-test-example
├── scripts
│ └── init-db.sql
├── docker-compose.yml
├── index.js
├── note.test.js
├── package.json
└── poolClient.js

For the record, I developed this example using Node.js v14.16.1 on an Ubuntu 20.04 machine.

So, let’s start by opening up a terminal and setting up our project.

mkdir pg-test-example && cd pg-test-example
npm init -y
npm install express pg-pool pg
npm install --save-dev mocha chai supertest

This will initialize our project and install the dependencies we are going to need.

After a quick clean up of the package.json file it looks like this:
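Roughly along these lines; the version numbers and the exact mocha invocation are assumptions, pin whatever is current for you:

{
  "name": "pg-test-example",
  "version": "1.0.0",
  "scripts": {
    "local": "node index.js",
    "test": "mocha note.test.js"
  },
  "dependencies": {
    "express": "^4.17.1",
    "pg": "^8.6.0",
    "pg-pool": "^3.3.0"
  },
  "devDependencies": {
    "chai": "^4.3.4",
    "mocha": "^8.3.2",
    "supertest": "^6.1.3"
  }
}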

You might have noticed I added the scripts local and test. So, as you might have guessed, running

  • npm run local will run the server
  • npm run test will run the tests

But first, we need to make sure all our code for our API server is in place and the database is running, so let’s get to it.

Spinning up the database

We are going to use Docker and docker-compose to spin up a database. That database will be ephemeral and only for the purposes of this example. I’d like to make it clear that setting up the database like that is not part of our testing process, I only chose it as a shortcut to facilitate this example. Ideally, you would run the tests against a database that replicates your production database and is dedicated to testing.

We will use the official Postgres image provided on Docker Hub, where you will also find the image’s documentation. One thing to note there is the section about initialization scripts: any *.sql file that we put under the /docker-entrypoint-initdb.d folder at the root of the container will be run when the database is first initialized, before it starts accepting connections. We will take advantage of this to create our table.

First, let’s write our docker-compose.yml file:
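A minimal sketch of what it could contain; the image tag, port mapping and password are assumptions:

# docker-compose.yml
version: "3.8"
services:
  db:
    image: postgres:13
    ports:
      - "5432:5432"
    environment:
      POSTGRES_PASSWORD: password
    volumes:
      # Anything in ./scripts ends up in the container's init folder
      - ./scripts:/docker-entrypoint-initdb.d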

You can see that we are creating a volume to bind the scripts folder to /docker-entrypoint-initdb.d just as instructed in the documentation. Any script we put under scripts will be copied to the container and run when our database gets initialized.

Before we run docker-compose we need to make sure that we have created the scripts folder in the root of our project and we have put our init-db.sql script in there. That script should look like this:
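Something like the following would do; the exact column types are assumptions, but we do need a note table with a unique name column:

-- scripts/init-db.sql
CREATE TABLE note (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL UNIQUE,
  content TEXT
);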

Note the UNIQUE constraint on the name column. We will write a related test later on.

Finally, let’s run it:

docker-compose up --renew-anon-volumes

The --renew-anon-volumes flag is optional and it’s just to make sure that we reinitialize the database each time we rerun it with that command.

And that’s it for the database part. Remember to run docker-compose down -v when you are done with this example to stop and remove the containers and also remove the anonymous volume that was created.

Building our API server

Now, let’s write our very simple server which will consist of a single endpoint POST /note that will take some parameters and insert an entry into the database.

As mentioned before, to connect our server to the database we will use pg-pool. It’s a package that is straightforward and very easy to use. Let’s create the poolClient.js file and use this code:
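A sketch of that module could look like the following; the connection settings are assumptions, and the pool itself is exported alongside query so the tests can reconfigure it later:

// poolClient.js
const Pool = require('pg-pool')

// Connection settings for the local docker-compose database (assumptions)
const pool = new Pool({
  host: 'localhost',
  port: 5432,
  database: 'postgres',
  user: 'postgres',
  password: 'password',
})

// Acquire a client from the pool, run the query and give the client back
async function query (text, values) {
  const client = await pool.connect()
  try {
    const result = await client.query(text, values)
    client.release()
    return result
  } catch (error) {
    // Releasing with the error tells pg-pool to discard this client,
    // which also ends the session that owns any temporary tables
    client.release(error)
    throw error
  }
}

module.exports = { pool, query }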

This module exports the query function. Calling this function will acquire a client from the pool, query the database, and return the client back to the pool so it can be used again.
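With the pool client in place, the server itself in index.js could look roughly like this; the validation logic, status codes and response messages are assumptions made for this example:

// index.js
const express = require('express')
const { query } = require('./poolClient')

const app = express()
app.use(express.json())

app.post('/note', async (req, res) => {
  const { name, content } = req.body

  // Naive validation, just enough for this example
  if (!name || !content) {
    return res.status(400).send({ message: 'name and content are required' })
  }

  try {
    await query('INSERT INTO note (name, content) VALUES ($1, $2)', [name, content])
    console.log(`Note "${name}" created`)
    return res.status(200).send({ message: 'Note created successfully' })
  } catch (error) {
    return res.status(500).send({ message: 'Something went wrong' })
  }
})

// Only start listening when run directly (npm run local); the tests import the app instead
if (require.main === module) {
  app.listen(3000, () => console.log('Listening on port 3000'))
}

module.exports = app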

That’s all regarding the server. To check that it works we need to make sure our database is running and, then, we can run the server by opening a terminal and running npm run local in the root of our project. Now, to see that it’s working, we can open another terminal and make a request to the endpoint, like so:

curl http://localhost:3000/note \
-H 'Content-Type: application/json' \
--data "{\"name\":\"a_note\",\"content\":\"some_content\"}"

You should see the success message logged at your server’s terminal. Finally, let’s move on to the integration tests.

Writing the integration tests

Our integration tests are going to start our server and make actual HTTP requests to the endpoints which will, in turn, hit the database. Right after that we are going to make assertions to ensure that the endpoint is behaving as expected.

Let’s write a few tests for the following cases:

  • Successful request
  • Invalid parameters
  • Name already exists (remember we have a unique database constraint on the name column)

Take a moment to read and understand the following code and then we will go through the main highlights. Here is how our test file note.test.js is going to look:
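What follows is a sketch of such a test file rather than a verbatim listing; in particular, the single-connection “mock” is achieved here by tweaking the exported pool’s options before the app is required, which is just one way to satisfy that requirement:

// note.test.js
const { pool, query } = require('./poolClient')

// Reconfigure the pool *before* loading the app: exactly one connection and no
// idle timeout, so every query runs in the session that owns the temporary tables
pool.options.max = 1
pool.options.idleTimeoutMillis = 0

const { expect } = require('chai')
const request = require('supertest')
const app = require('./index')

describe('POST /note', () => {
  beforeEach(async () => {
    // Shadow the real table with a temporary one, constraints included
    await query('CREATE TEMPORARY TABLE note (LIKE note INCLUDING ALL)')
  })

  afterEach(async () => {
    // Target pg_temp explicitly so we can never drop the real table by mistake
    await query('DROP TABLE IF EXISTS pg_temp.note')
  })

  after(async () => {
    // Close the single connection so mocha can exit
    await pool.end()
  })

  it('inserts a note and responds with a success message', async () => {
    const response = await request(app)
      .post('/note')
      .send({ name: 'a_note', content: 'some_content' })

    expect(response.status).to.equal(200)

    const { rows } = await query('SELECT name, content FROM pg_temp.note')
    expect(rows).to.have.length(1)
    expect(rows[0]).to.deep.equal({ name: 'a_note', content: 'some_content' })
  })

  it('rejects a note whose name already exists', async () => {
    // Fake data: a note that already uses the name we are about to send
    await query("INSERT INTO pg_temp.note (name, content) VALUES ('a_note', 'older content')")

    const response = await request(app)
      .post('/note')
      .send({ name: 'a_note', content: 'some_content' })

    expect(response.status).to.equal(500)
  })

  it('rejects a request with missing parameters', async () => {
    const response = await request(app)
      .post('/note')
      .send({ name: 'a_note' }) // no content

    expect(response.status).to.equal(400)
  })
})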

Running the tests, we should see that they all pass. Remember that our database needs to be up and running. To run the tests, open a terminal at the root of your project and execute the test script:

npm run test

Here are a few comments on the key factors of these tests.

About the mocking:

  • As mentioned earlier, we need to make sure that our pool has only 1 connection, and it’s also best if it doesn’t have an idle timeout. That’s because temporary tables are only visible in the session that created them and only live for the duration of that session.
  • By mocking the pool before loading the app, we make sure that when the app loads it will import our mocked pool instead of the original one.

About the tables and data:

  • When inserting fake data and dropping the temporary tables, make sure you specify the pg_temp schema, to avoid corrupting the database in case of a mistake. That’s the schema where Postgres stores the temporary tables.
  • Creating the temporary table with LIKE … INCLUDING ALL also copies the original table’s constraints. In some tests we want them so that we can test them; in others it might be better to skip them and focus specifically on what we want to test. In that case we could, for example, replace it with CREATE TEMPORARY TABLE note AS SELECT * FROM note LIMIT 0, which only copies the table structure.
  • Notice the IF EXISTS in the drop table query. If we remove it, we’ll see our second test failing to clean up in the afterEach hook, throwing the error schema "pg_temp" does not exist. That’s because in this example we use the pg-pool library in a way that internally closes and removes the client from the pool as soon as a query throws an error (in our case, inserting a duplicate name value). Remember that temporary tables live for the duration of a session, so the error means the temporary table is already gone before we reach the afterEach hook. If that’s not the case for you, it’s best to remove the IF EXISTS so you have a more predictable flow.

Conclusions

Integration testing can be quite simple and it should not be overlooked. To be fair, it can initially be intimidating to set up a testing environment and find an approach that suits your needs, since there are many options and variations, and a lot of abstract talk around it. This is why I wanted to provide a concrete example and share one of the approaches we use at Geoblink that works well for us.

Give it a test!
