Continuous Integration and Continuous Delivery with BitBucket Pipelines — Part 2

Adam Taylor
Adzuna Engineering
Oct 4, 2018
“two white flying rockets during daytime” by SpaceX on Unsplash

In Part 1 we covered the basics of continuous integration and delivery using BitBucket Pipelines.

We had the pipeline up and running so that every commit to every branch was built and tested, and we could see the results in the Pipelines UI.

We had also configured the pipeline to deploy any changes to the master branch to our staging server, provided the build passed its tests.

BitBucket Pipelines Build Screen

We ended the article by mentioning some of the things we could do to advance our setup:

  • Generating build artifacts
  • Making use of the “Deployment” functionality
  • Using the pipeline to deploy to production
  • Creating a custom image to speed up the build and test cycle

Today we’ll talk about the next steps we took to improve the pipeline.

Using Deployments to Visualise Delivery of the Application

BitBucket Deployments

BitBucket Deployments are a way of helping teams visualise the state of their application across three environments: test, staging and production.

Track the deployments you execute through Bitbucket Pipelines and give your team insight into the status of your deployment environments and visibility over what code changes land in each environment.

The above image shows one of our services and its state across the staging and production environments.

To configure deployments, you tag a step with the name of the environment it deploys to:

deployment: test

So our pipelines configuration became:

image: ubuntu:16.04

pipelines:
  default:
    - step:
        name: Run the tests
        caches:
          - pip
        script:
          - export LANG=en_US.UTF-8 && export LANGUAGE=en_US:en && export LC_ALL=en_US.UTF-8
          - bash scripts/pipeline/000-configure-container.sh
          - bash scripts/pipeline/001-configure-environment.sh
          - bash scripts/pipeline/002-run-tests.sh
        services:
          - mysql
  branches:
    master:
      - step:
          name: Run the tests and deploy to staging
          deployment: staging
          caches:
            - pip
          script:
            - export LANG=en_US.UTF-8 && export LANGUAGE=en_US:en && export LC_ALL=en_US.UTF-8
            - bash scripts/pipeline/000-configure-container.sh
            - bash scripts/pipeline/001-configure-environment.sh
            - bash scripts/pipeline/002-run-tests.sh
            - bash scripts/pipeline/003-deploy-to-development.sh
          services:
            - mysql

definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: pipelines
        MYSQL_ROOT_PASSWORD: let_me_in

Now every successful build of the master branch would be deployed to an environment as before, but it would be tagged as the “staging” environment and therefore be visualised in the deployments screen.

This was a nice improvement, but we also wanted to be able to deploy to production using the pipeline…

Using our Pipeline to Deploy to Production

We decided that we wanted to adopt a continuous delivery approach to deploying software into production.

Recall the difference between continuous delivery and continuous deployment:

Continuous Delivery is sometimes confused with Continuous Deployment. Continuous Deployment means that every change goes through the pipeline and automatically gets put into production, resulting in many production deployments every day. Continuous Delivery just means that you are able to do frequent deployments but may choose not to do it, usually due to businesses preferring a slower rate of deployment. In order to do Continuous Deployment you must be doing Continuous Delivery.

Source: https://martinfowler.com/bliki/ContinuousDelivery.html

We have unit tests, run by the pipeline, for this service, but we don’t have tests of the services and functionality that consume and integrate with this service.

For now, we do the integration testing ourselves, manually. We want to perform this first, before triggering a production deployment.

This is where manually triggered steps come into play. If you configure a step as manual, it runs only when someone triggers it from the BitBucket UI, which is exactly the behaviour we wanted.

A manual trigger is configured by adding trigger: manual to a step.
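For example, a step like the following (abridged from our full configuration below) would only run when triggered from the UI:

```yaml
- step:
    name: PROD test and deploy
    deployment: production
    trigger: manual
    script:
      - bash scripts/pipeline/003-deploy-to-production.sh
```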

So we added a manually triggered step to our pipelines configuration. It was very similar to our master step for testing and deploying to staging, but this step deployed our application to production:

image: ubuntu:16.04

pipelines:
  default:
    - step:
        name: Run the tests
        caches:
          - pip
        script:
          - export LANG=en_US.UTF-8 && export LANGUAGE=en_US:en && export LC_ALL=en_US.UTF-8
          - bash scripts/pipeline/000-configure-container.sh
          - bash scripts/pipeline/001-configure-environment.sh
          - bash scripts/pipeline/002-run-tests.sh
        services:
          - mysql
  branches:
    master:
      - step:
          name: STAGE test and deploy
          deployment: staging
          caches:
            - pip
          script:
            - export LANG=en_US.UTF-8 && export LANGUAGE=en_US:en && export LC_ALL=en_US.UTF-8
            - bash scripts/pipeline/000-configure-container.sh
            - bash scripts/pipeline/001-configure-environment.sh
            - bash scripts/pipeline/002-run-tests.sh
            - bash scripts/pipeline/003-deploy-to-staging.sh
          services:
            - mysql
      - step:
          trigger: manual
          name: PROD test and deploy
          deployment: production
          caches:
            - pip
          script:
            - export LANG=en_US.UTF-8 && export LANGUAGE=en_US:en && export LC_ALL=en_US.UTF-8
            - bash scripts/pipeline/000-configure-container.sh
            - bash scripts/pipeline/001-configure-environment.sh
            - bash scripts/pipeline/002-run-tests.sh
            - bash scripts/pipeline/003-deploy-to-production.sh
          services:
            - mysql

definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: pipelines
        MYSQL_ROOT_PASSWORD: let_me_in
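The deploy scripts themselves aren't shown in this article. Purely as an illustration, a script like `003-deploy-to-production.sh` could be a thin wrapper around rsync; the variable names, host and target path below are hypothetical, and in a real pipeline the values would come from secured variables defined on the "production" deployment environment in the Pipelines settings:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of scripts/pipeline/003-deploy-to-production.sh.
# DEPLOY_USER, DEPLOY_HOST and APP_DIR are illustrative names only; in
# practice they would be secured Pipelines deployment variables.
set -euo pipefail

DEPLOY_USER="${DEPLOY_USER:-deploy}"
DEPLOY_HOST="${DEPLOY_HOST:-prod.example.com}"
APP_DIR="${APP_DIR:-/srv/app}"

# Build the rsync command in one place so it can be printed and inspected
# before it is actually executed.
build_deploy_cmd() {
  printf 'rsync -az --delete ./ %s@%s:%s/' "$DEPLOY_USER" "$DEPLOY_HOST" "$APP_DIR"
}

echo "Deploying with: $(build_deploy_cmd)"
# eval "$(build_deploy_cmd)"   # the real deploy, disabled in this sketch
```

Keeping the command construction in one function makes it easy to log exactly what the pipeline ran for a given deployment.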

Now with this configuration we can manually trigger production deployments:

BitBucket Production Deployment Preview

And we can watch the application being rebuilt, retested and deployed into production in an automated, repeatable manner:

BitBucket Pipelines Production Deployment

We haven’t addressed all our areas for improvement, but we’ve improved the pipeline such that we now:

  • build and test every commit to every branch automatically
  • build and test every commit to the master branch and deploy it to staging automatically, for further manual testing
  • can manually trigger a build, test and deployment of the master branch to production
  • are able to visualise the state of our deployments across environments with the “Deployments” UI

This has been working very well for us, and I would recommend that any other BitBucket users interested in CI/CD have a look at Pipelines and Deployments.
