Testing database interactions using Go
When it comes to integration or end-to-end testing, there is no place left for mocks: everything should be tested against real components, or at least stubs. Sometimes this looks hard, especially when you have to deal with databases.
The simplest example of a test case that involves a database is a web server that returns a record by its id. To test this case, the record must be present in the test database before the actual test runs.
First of all, you should make sure the database is clean and ready to receive the required data. Each test should run in an isolated suite that contains only the data it needs and does not rely on anything left over from previous runs. If, for example, a test case covers pagination of records, its assertions may fail if the database contains leftover records that were never cleaned up.
The second step is to insert the data the test will actually work with before it runs, either by executing SQL statements or by loading a prepared schema.
As Effective Go puts it, "Go is a new language. Although it borrows ideas from existing languages, it has unusual properties that make effective Go programs different in character from programs written in its relatives."
As a former Ruby on Rails developer, I appreciate the practices and tools Rails provides for testing interactions with a database, such as FactoryBot for fixtures and DatabaseCleaner. As all good ideas that work tend to travel and get reused in other languages, these solutions have found their way into Go.
Here I will introduce some packages that share these ideas:
The first one is testfixtures, which describes itself like this: "Basically this package mimics the 'Rails way' of writing tests for database applications, where sample data is kept in fixture files. Before the execution of every test, the test database is cleaned and the fixture data is loaded into the database."
The testfixtures package works out of the box. It does both things: it cleans the database before each test and then loads the fixture data into it. There is no support for running tests in parallel, though.
Next, dbcleaner, whose README says: "Clean database for testing, inspired by database_cleaner for Ruby. It uses the flock syscall under the hood to make sure tests can run in parallel without racing issues."
The dbcleaner package clears the database before each test and allows running tests in parallel.
Then go-txdb, which introduces itself as follows: "Package txdb is a single transaction based database sql driver. When the connection is opened, it starts a transaction and all operations performed on this sql.DB will be within that transaction. If concurrent actions are performed, a lock is acquired and the connection is always released; the statements and rows do not hold the connection."
The go-txdb package runs each test inside a single SQL transaction, which means every test effectively works with an isolated database instance. This also makes it safe to run tests in parallel.
And finally polluter: "Mainly this package was created for testing purposes, to give the ability to seed a database with records from simple .yaml files. Polluter respects the order in files, so you can handle foreign_keys just by placing them in the right order."
The polluter package seeds a database with data from YAML files. It is meant to be combined with a database-cleaning tool or a transactional database driver.
Although it looks too simple, as if it could do more, it does everything it should. If you want random data, you can pre-generate the fixture files with some faker package. As for foreign_key constraint violations, polluter respects the order of records in the files, so if you place your data in the proper order, everything will work fine.
To keep your test data as easy to use as possible, ids in tests should be hard-coded. If a key is auto-incremented, this can be managed by changing its initial value; for MySQL, for example, by executing:
ALTER TABLE table_name AUTO_INCREMENT = 10;
The testfixtures package provides everything you need out of the box, but I prefer more flexible combinations:
- For acceptance testing — database cleaner with polluter
- For integration testing — go-txdb with polluter
The following code example shows polluter used together with go-txdb. With this setup, all of the goals above are met.
Each test runs in an isolated transaction and is seeded with the data it requires, so every test is independent and cannot fail because of the order of execution. This also allows running all tests in parallel without fear of race conditions.
This is the best approach I have found so far for testing database interactions. I hope you enjoyed it.