Using docker-compose in Bitbucket Pipelines

Iván Perdomo
Mar 15, 2018 · 3 min read
Header photo: https://www.flickr.com/photos/beigephotos/5334612/

At Magnet we use Bitbucket to host our code, and we recently switched to Bitbucket Pipelines for building, checking, testing, and deploying it.

Another recent addition to our toolbox is a reproducible and fully automated development environment with Docker and docker-compose.

The most common approach to building and testing your code in a Continuous Integration (CI) environment is to describe the build with a set of instructions that follow the CI tool's conventions. In most of the hosted solutions (Travis CI, CircleCI, GitLab, Bitbucket) this means declaring your dependencies and build steps in a YAML file.

Why would you consider using docker-compose for CI builds?

Having one set of build steps in CI and a different set on the developer’s machine leads to the well-known developer statement: “it works on my machine”. The other issue is that you need to maintain at least two sets of instructions, and with a small team that means less time available to work on solving customers’ problems. Ideally, our CI build process follows the same steps a developer would run on their computer.

Another main concern when automating the build process is the increasing complexity of the systems we build. Even if you keep developing monoliths, you have at least two systems: your single multi-tier running process (e.g. a JVM process) and a database. This is where docker-compose can help, by letting you describe the whole running system in a single file.
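As a rough illustration, a docker-compose.yml for that monolith-plus-database setup might look like the sketch below. The service names, image tags, ports, and environment variables are made up for the example, not taken from our actual projects:

```yaml
# docker-compose.yml — hypothetical monolith + database
version: '3'

services:
  app:
    build: .                # build the application image from the local Dockerfile
    ports:
      - "8080:8080"         # expose the JVM service
    depends_on:
      - db                  # start the database before the application
    environment:
      DATABASE_URL: "jdbc:postgresql://db:5432/app"

  db:
    image: postgres:10      # any database image works; postgres is just an example
    environment:
      POSTGRES_DB: app
      POSTGRES_PASSWORD: secret
```

With a file like this, `docker-compose up` brings up the same system on a developer’s laptop and in CI.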

Making sense of Docker support in Bitbucket Pipelines

Bitbucket Pipelines supports building and running Docker containers as part of the build process, and the blog post announcing that feature also mentions support for docker-compose files.

When we tried a simple docker-compose up in our build, it failed because that program was not available in the default container image. Remember that our goal was to execute the same steps a developer would run on their computer.

After some documentation reading, we found that there is a way to define a custom image to build our code, and that there is a Docker in Docker image. We just needed to install docker-compose as one of the build dependencies. Some searching led us to the fact that docker-compose is a Python application and is available via pip.

The full recipe

  • Use a Docker in Docker image to run our build
  • Install docker-compose via pip
  • Use any docker-compose command directly as part of your build steps
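Put together, a bitbucket-pipelines.yml following this recipe could look roughly like the sketch below. The image tag, the Alpine package name, and the service names in the script are assumptions for illustration, not the exact file from the repository linked in the next section:

```yaml
# bitbucket-pipelines.yml — sketch of the recipe above
image: docker:stable          # assumption: an Alpine-based Docker image for the build

pipelines:
  default:
    - step:
        services:
          - docker            # enable the Docker service for this step
        script:
          - apk add --no-cache py-pip     # pip is needed to install docker-compose
          - pip install docker-compose    # docker-compose is a Python application on PyPI
          - docker-compose up -d          # same command a developer runs locally
          - docker-compose run --rm tests # hypothetical service that runs the checks/tests
          - docker-compose down
```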

An example of this setup

The bitbucket-pipelines-docker-compose repository is a running example of how to use docker-compose as part of your build pipeline.

It is a minimal example with a server and a client service, both using the same postgis image. We declare a dependency between the services, and the client waits for the server to be ready.
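A compose file along those lines might look like the following sketch; the postgis image tag and the wait command are assumptions, not necessarily what the repository uses:

```yaml
# docker-compose.yml — sketch of the server/client example
version: '3'

services:
  server:
    image: mdillon/postgis:10   # assumption: any postgis image works here
    environment:
      POSTGRES_PASSWORD: secret

  client:
    image: mdillon/postgis:10   # same image, used only for its psql/pg_isready tools
    depends_on:
      - server                  # declare the dependency between the services
    environment:
      PGPASSWORD: secret
    # wait until the server accepts connections, then run a trivial query
    command: >
      sh -c "until pg_isready -h server -U postgres; do sleep 1; done
             && psql -h server -U postgres -c 'SELECT 1'"
```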

Happy building!

