How we automated deployments and testing with BitBucket Pipelines

Eric Jiang
Published in MonPlan
6 min read · Mar 17, 2018

Hi there! We recently moved our Git repositories over from GitHub to BitBucket to take advantage of the entire Atlassian suite and improve our Git and development workflows: documentation, Git, testing, JIRA and continuous integration.

First of all, what is BitBucket Pipelines?

Integrated CI/CD for Bitbucket Cloud that’s trivial to set up, automating your code from test to production.
- BitBucket Site (Atlassian)

Credit: Atlassian (BitBucket)

BitBucket Pipelines is broadly similar to Bamboo, Travis CI and Jenkins, but it is built straight into BitBucket (though you pay for additional build minutes).

You can do many things such as:

  • running tests for Pull Requests
  • deploying code
  • scheduling tasks to run

Initial Setup

Note: We use Node.js as part of our frontend.

image: node:8

pipelines:
  default:
    - step:
        caches:
          - node
        script: # Modify the commands below to build your repository.
          - yarn
          - yarn test:ci

What this does is run yarn (to install dependencies) and then yarn test:ci (our non-interactive test suite) every time the pipeline runs.

Now, what if I want to run tests for branches that follow a pattern, for example feature/new-material-button or fix/ui-issue-with-form? Well, as per the Pipelines documentation, we can automate that part of our build too! All we need to do is match the pattern using ‘**’.

# This is a sample build configuration for JavaScript.
# Check our guides at https://confluence.atlassian.com/x/14UWN for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: node:8

pipelines:
  default:
    - step:
        caches:
          - node
        script: # Modify the commands below to build your repository.
          - yarn
          - yarn test:ci
  branches:
    master: # test and deploy
      - step:
          name: Running Automated Tests
          caches:
            - node
          script:
            - yarn
            - yarn test:ci
    '**': # matching type/name (i.e. feature/sso-login)
      - step:
          caches:
            - node
          script:
            - yarn
            - yarn test:ci

This runs tests on our master branch as well as on every type/name branch, which silly-developer-proofs our branches. Now all we need to do is protect those branches, requiring that:

  • Pipelines has run, and at least 1 successful build has occurred
  • The changes have been reviewed and approved by at least 1 person
  • No one has direct write access to the branch
The protection on our deploy and master branches is set so that even admins don’t have write access.

This allows for more rigorous testing: even an ESLint warning or a failing unit test will automatically block a PR from being merged.

Even administrators can’t push to master now. 😈

Scheduling Pipelines

Something that I have always loved about Pipelines is scheduling builds and tests. As part of our current Pipelines configuration, we have completely automated our builds: at 5pm each night, Pipelines runs on our master branch to ensure that there are no broken builds in master.

But wait, tests aren’t the only thing we schedule as part of our CI methodology.

Automating our Builds to Dev and Staging

Key note: Our instances are currently served on Google App Engine.

When someone asks me to deploy every half-hour

I really hate it every time someone asks me to push something up to dev, staging or prod, sometimes every half an hour. And I’ve always had trouble deploying to these instances, especially with our old CI providers, CircleCI and Travis.

Now here comes the awesome part. In case you didn’t realise where this is going, I’m about to 🤯 blow some minds (mine included: this worked on the first try).

So first off, we create a service account linked to our Google Cloud Platform project and generate a private key for it in JSON format. We then base64-encode the key and store it in BitBucket Cloud as an environment variable. (Note that this account may need certain elevated permissions.)
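For reference, this is roughly how you could create and encode the key from your own machine (a minimal sketch; the service account name, project ID and file names below are placeholders, not our actual setup):

# Create a JSON key for an existing service account (names are hypothetical)
gcloud iam service-accounts keys create ./gcloud-api-key.json \
  --iam-account deploy-bot@my-gcp-project.iam.gserviceaccount.com

# Base64-encode it so it can be pasted into a BitBucket environment variable
base64 ./gcloud-api-key.json

# Paste the output into a secured repository variable (e.g. GCLOUD_API_KEYFILE_DEV)
# under Repository settings → Pipelines → Environment variables, then remove the local copy
rm ./gcloud-api-key.json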

What our environment variables looked like!

All we need to do now is automate the deployment itself. Luckily for us, Pipelines provides something called ‘deployments’, useful huh?

deploy/nightly: # test and deploy
  - step:
      caches:
        - node
      script:
        - yarn
        - yarn test:ci
  - step:
      name: Deploying to dev
      deployment: test
      trigger: automatic
      script:
        # Install Google App Engine SDK
        - curl -o /tmp/google-cloud-sdk.tar.gz https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
        - tar -xvf /tmp/google-cloud-sdk.tar.gz -C /tmp/
        - /tmp/google-cloud-sdk/install.sh -q
        - source /tmp/google-cloud-sdk/path.bash.inc
        # Authenticating with the service account key file
        - echo $GCLOUD_API_KEYFILE_DEV | base64 --decode --ignore-garbage > ./gcloud-api-key.json
        - gcloud auth activate-service-account --key-file gcloud-api-key.json
        # Linking to the Google Cloud project
        - gcloud config set project $GCLOUD_PROJECT_DEV
        - yarn # reinstall packages in pre-build
        - sh ./scripts/gae_dev.sh

So with the configuration above, every time a PR is merged into the deploy/nightly branch, we run two steps:

  1. Test the code
  2. Deploy, which involves installing the Google Cloud SDK, building and deploying; our shell script once again tests, builds, wraps in our server code and then deploys (see the sketch below).

Note that $GCLOUD_API_KEYFILE_DEV is the environment variable holding the base64-encoded JSON private key for our service account, and $GCLOUD_PROJECT_DEV is the project ID for the target GCP project.
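The contents of gae_dev.sh aren’t shown in this post, but a minimal sketch of such a script could look something like this (the build command, server paths and app.yaml location are assumptions, not our actual setup):

#!/bin/bash
# Hypothetical sketch of what a deploy script like scripts/gae_dev.sh might do
set -e                       # stop on the first failing command

yarn test:ci                 # run the non-interactive tests once more
yarn build                   # build the frontend (assumes a "build" script exists)
cp -r ./server/. ./build/    # wrap the server code in with the build output (paths are assumptions)

# Deploy to App Engine without prompting; assumes an app.yaml in the build directory
gcloud app deploy ./build/app.yaml --quiet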

This is what you should see

What else can we do?

Some of you may have realised where this is going. For those of you watching at home, in case you didn’t: I’m going to schedule some of these deployments. Within our team, we have set ‘protected’ dates on which we can deploy.

So once again, go into Pipelines and schedule some builds. Our staging build runs every Wednesday at 1pm, and our nightly build runs every day at 3pm, allowing us to deploy to dev and staging automatically.

A key point is that you can also manually deploy to prod (something that I haven’t yet set up, as it may be slightly dangerous 🚒). All you need to do is set the trigger to manual.

deploy/prod: # test and deploy
  - step:
      caches:
        - node
      script:
        - yarn
        - yarn test:ci
  - step:
      name: Deploying to prod
      deployment: test
      trigger: manual
      script:
        # Install Google App Engine SDK
        - curl -o /tmp/google-cloud-sdk.tar.gz https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
        - tar -xvf /tmp/google-cloud-sdk.tar.gz -C /tmp/
        - /tmp/google-cloud-sdk/install.sh -q
        - source /tmp/google-cloud-sdk/path.bash.inc
        # Authenticating with the service account key file
        - echo $GCLOUD_API_KEYFILE_DEV | base64 --decode --ignore-garbage > ./gcloud-api-key.json
        - gcloud auth activate-service-account --key-file gcloud-api-key.json
        # Linking to the Google Cloud project
        - gcloud config set project $GCLOUD_PROJECT_DEV
        - yarn # reinstall packages in pre-build
        - sh ./scripts/gae_dev.sh

You can read more about Deployments in the BitBucket Pipelines documentation.

That’s all folks! Feel free to give some 👏 and some 👍 to this.

monPlan is the Monash Course Planner for Students, by Students. It is made with ❤️ in Melbourne, Australia (The best ☕ in the 🌏) .
