Database compatibility testing as part of a CI pipeline

Christoph Grotz
Google Cloud - Community
3 min read · Aug 22, 2022

I really love modern deployment strategies like blue/green deployments or, even better, canary releases, but I always stumble over database schema migrations. In the past, this often kept my team from fully committing to canary testing, since we were never sure whether a new version introduced breaking changes to the database schema. Sure, you could add a checklist item to your JIRA user stories, but I’m not confident it would be filled in correctly every time.

In this blog post, I will share an approach that you can run as part of your CI/CD pipeline. I created an example repository that shows the complete setup in action.

Database compatibility testing in 5 steps

The CI pipeline builds three container images: one for the app, one for the database migration scripts, and one for the test suite. Afterwards, the pipeline executes the following steps:
1. Use the Cloud Run Admin API to fetch the container image version running in production. We want to test whether the new database schema migrations are compatible with this app version.
2. Deploy a new test instance of the current production version to Cloud Run.
3. Clone the current production database and migrate it using the latest database migration container image.
4. Run the test suite container against the app test instance.
5. If the tests pass, trigger the canary CD pipeline to deploy the new version to production.
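The steps above can be sketched as a shell script. Service names, the region, the image repository, and the trigger name below are hypothetical placeholders, not the names used in the example repository; the database step is delegated to a hypothetical helper script:

```shell
#!/usr/bin/env bash
set -euo pipefail

REGION="europe-west1"   # placeholder
SERVICE="my-app"        # placeholder production service name

# 1. Fetch the container image currently running in production
PROD_IMAGE=$(gcloud run services describe "$SERVICE" \
  --region "$REGION" \
  --format 'value(spec.template.spec.containers[0].image)')

# 2. Deploy a test instance running that production image
gcloud run deploy "${SERVICE}-test" \
  --image "$PROD_IMAGE" \
  --region "$REGION"

# 3. Clone the production database and apply the new migrations
#    (hypothetical helper script)
./clone-and-migrate.sh

# 4. Run the test suite container against the test instance
TEST_URL=$(gcloud run services describe "${SERVICE}-test" \
  --region "$REGION" --format 'value(status.url)')
docker run --rm -e APP_URL="$TEST_URL" my-repo/test-suite:latest

# 5. On success, trigger the canary CD pipeline
gcloud builds triggers run deploy-canary --branch main
```
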

With database migration tools like Flyway and Liquibase, we nowadays have great options for managing database schemas, and they beat handing migration scripts to DB admins by a wide margin. But I’m a little hesitant to run them as part of the container startup: the permissions to change the DB schema and the increased cold start latency are not always acceptable. So I prefer to package the migrations in their own container and run them as a separate step of the continuous delivery pipeline. This setup makes database migration a much more versatile tool, for example allowing us to test database schema compatibility with different application versions. For this example, I’m using golang-migrate, which works great both as a library inside Golang applications and standalone.
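One way to build such a migration container is a thin image on top of the official golang-migrate CLI image. This is a sketch; the version tag and the `db/migrations` path are assumptions, not the layout of the example repository:

```dockerfile
# Bundle the SQL migration scripts with the golang-migrate CLI
FROM migrate/migrate:v4.15.2
COPY db/migrations /migrations
ENTRYPOINT ["migrate", "-path", "/migrations"]
```

The pipeline can then run it against any database URL, e.g. `docker run --rm my-repo/migrations:latest -database "postgres://user:pass@host:5432/appdb_test?sslmode=disable" up`.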

Packaging the integration tests into a separate container allows similarly versatile reuse. If you flag or group your tests cleverly, you can even run a subset of them periodically as smoke tests against your application. Packaged integration tests can also be reused with ZAProxy for security testing (but that’s another story).
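For example, assuming the test container's entrypoint is a compiled `go test` binary (image names and environment variables below are hypothetical), the standard `-test.run` filter can select the smoke subset:

```shell
# Full integration suite against the freshly deployed test instance
docker run --rm -e APP_URL="$TEST_URL" my-repo/test-suite:latest

# Periodic smoke check: the same image, restricted to tests whose
# names mark them as smoke tests
docker run --rm -e APP_URL="$PROD_URL" \
  my-repo/test-suite:latest -test.run 'TestSmoke'
```
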

The approach in this blog uses PostgreSQL, but it can be adapted to NoSQL databases as well; you will just need to swap out the cloning step. I opted for cloning the production database. In Postgres, a database cannot be cloned while other sessions are connected to it, so in a real setup you might want to use an integration database as the source, or clone the whole SQL instance instead.
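A minimal sketch of the clone-and-migrate step, assuming direct `psql` access and hypothetical database and credential names. Postgres template cloning (`CREATE DATABASE … TEMPLATE …`) is what imposes the no-other-connections restriction mentioned above:

```shell
# Clone the source database (fails if other sessions are connected
# to "appdb", which is why an integration DB is a safer source)
psql "host=$DB_HOST user=$DB_USER dbname=postgres" \
  -c 'CREATE DATABASE appdb_test TEMPLATE appdb;'

# Apply the new schema migrations to the clone
migrate -path ./db/migrations \
  -database "postgres://$DB_USER:$DB_PASS@$DB_HOST:5432/appdb_test?sslmode=disable" \
  up
```
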

After the tests, you could flag the release bundle as compatible in your CD tool or software repository, for example using approvals or Binary Authorization from your CI/CD pipeline.
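With Binary Authorization, that flag can take the form of a signed attestation on the tested image digest. All project, attestor, and key names below are placeholders, and the attestor with its KMS signing key must already exist:

```shell
# Attest the image digest that passed the compatibility tests, so
# that only verified builds can be deployed to production
gcloud container binauthz attestations sign-and-create \
  --artifact-url="europe-docker.pkg.dev/my-project/apps/my-app@${IMAGE_DIGEST}" \
  --attestor="db-compat-attestor" \
  --attestor-project="my-project" \
  --keyversion-project="my-project" \
  --keyversion-location="europe-west1" \
  --keyversion-keyring="attestor-keys" \
  --keyversion-key="db-compat-signer" \
  --keyversion="1"
```
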

There is nothing really special about the setup, just the steps to be executed. You can follow the instructions in the README.md to try it out yourself.


I’m a technology enthusiast focusing on Digital Transformation, the Internet of Things, and Cloud.