Using docker-compose in Bitbucket Pipelines

Iván Perdomo
magnet.coop · Mar 15, 2018

Cover photo: https://www.flickr.com/photos/beigephotos/5334612/

At Magnet we use Bitbucket to host our code, and we recently switched to Bitbucket Pipelines to build, check, test and deploy it.

Another recent addition to our toolbox is a reproducible and fully automated development environment with Docker and docker-compose.

The most common approach to building and testing your code in a Continuous Integration (CI) environment is to follow the conventions of the CI service. In most of the hosted solutions (Travis CI, CircleCI, GitLab and Bitbucket) this means declaring your dependencies and build steps in a YAML file.

Why would you consider using docker-compose for CI builds?

Having one set of build steps in CI and a different one on the developer's machine can lead to the well-known statement: "Works on my machine". It also means maintaining at least two sets of instructions, and with a small team that translates into less time for solving customers' problems. Ideally, the CI build follows exactly the same steps a developer would run on their computer.

The other main concern when automating the build process is the increasing complexity of the systems we build. Even if you keep developing monoliths, you have at least two systems: your single multi-tier application (e.g. a JVM process) and a database. This is where docker-compose can help you, by describing the whole running system in a single file.
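To make that concrete, a single docker-compose.yml can describe both tiers. The sketch below is only an illustration (the image names, port and environment variable are assumptions on our side, not taken from one of our projects):

version: '3'

services:
  app:
    image: openjdk:8-jre          # assumption: some JVM-based application image
    depends_on:
      - db
    environment:
      DATABASE_URL: postgres://db:5432/app   # hypothetical connection string
    ports:
      - "8080:8080"

  db:
    image: postgres:10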

Making sense of Docker support in Bitbucket Pipelines

Bitbucket Pipelines has support for building and running Docker containers as part of the build process. The same blog post also mentions support for docker-compose files.

When we tried a simple docker-compose up in our build, it failed because the program was not available in the default build image. Remember that our goal was to execute the same steps a developer runs on their computer.

After some reading of the documentation, we found that you can define a custom image to build your code, and that there is a Docker in Docker image. We just needed a way to install docker-compose as part of the build dependencies. Some searching led us to the fact that docker-compose is a Python application available via pip.

The full recipe

  • Use a Docker in Docker image to run our build
  • Install docker-compose via pip
  • Use any docker-compose command directly as part of your build steps (see the sketch below)
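Putting those steps together, a bitbucket-pipelines.yml could look roughly like the sketch below. It is only an illustration: we use a plain Python base image plus Bitbucket's docker service here instead of a Docker in Docker image, and the image tag and compose commands are our assumptions; the repository linked in the next section contains the real configuration.

image: python:3.6                       # assumption: any build image that ships with pip

pipelines:
  default:
    - step:
        services:
          - docker                      # Bitbucket's Docker service provides the daemon
        script:
          - pip install docker-compose  # docker-compose is distributed on PyPI
          - docker-compose up -d        # the same commands a developer runs locally
          - docker-compose ps
          - docker-compose down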

An example of this setup

The bitbucket-pipelines-docker-compose repository is a running example of how to use docker-compose as part of your build pipeline.

It is a minimal setup with a server service and a client service, both based on the same postgis image. We declare a dependency between the two services, and the client waits until the server is ready.
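Roughly, the docker-compose.yml of that example looks like the sketch below. The image tag, credentials and the wait command are illustrative assumptions on our side, not a copy of the repository's file:

version: '3'

services:
  server:
    image: mdillon/postgis:9.6    # assumption: any postgis image would do
    environment:
      POSTGRES_PASSWORD: example

  client:
    image: mdillon/postgis:9.6    # same image, reused only for its client tools
    depends_on:
      - server
    environment:
      PGPASSWORD: example
    # poll until the server accepts connections, then report success
    command: sh -c 'until pg_isready -h server -U postgres; do sleep 1; done; echo server is ready'

Note that depends_on only controls start order, which is why the client still polls with pg_isready before declaring the stack ready.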

Happy building!
