Using Bitbucket Pipeline for AWS ECS deployments

Sahaj Software
inspiringbrilliance
Aug 8, 2018

by Nick Benton

Background

Recently, I was tasked with building an application consisting of both a dashboard and an API, deployed on a single EC2 instance. I was then told that the code base was to be stored in Bitbucket. I am a huge proponent of CI/CD and automated deployments to a Blue/Green environment (read my next article to see our Blue/Green design), so I decided to tackle the deployment pipeline first. Since the requirement (and my preference) was for this to be deployed on AWS, our first thought for the deployment pipeline was to use AWS CodeCommit and AWS CodeDeploy.

After an initial build using these tools, we realized that there was no easy way to get webhooks from Bitbucket into the AWS pipeline. Bitbucket could easily fire off Lambdas, but it had no way to webhook straight to CodeDeploy (this may have been by design, thanks to their own Pipelines addition). Now I had a decision to make: create a Lambda that fires off CodeDeploy, or create a Bitbucket Pipeline that fires off CodeDeploy… Well, none of this is any good. We were starting to look at using software we already had in place to fire off a FaaS just to fire off another deployment service. The initial reasoning for using AWS CodePipeline was to keep all cost allocation for the development in the cloud so that the client would get one clean bill each month. But the client was already using, and already paying for, Bitbucket. And, as mentioned, at this point I was considering using Bitbucket Pipelines to fire off another deployment pipeline… Time to simplify the whole process and just use Bitbucket Pipelines.

Simplicity

Time to remove whatever complexity I can from the process. We decided to build our application, both dashboard and API, as two Docker images built from the same code base with two different entry points, deployed onto the same EC2 instance. This minimizes cost to the client while keeping a logical separation of functionality. We could use a single repository, and each push would build both images individually and deploy them to the same instance.

Basic AWS Architecture

The Pipeline

“Great, so you made some decisions and designed a project. I’m here to copy your Bitbucket Pipeline!!!”

I get it, I would have skimmed straight to here.

The great thing about Bitbucket Pipelines is that it is ultimately just a bash script. Thanks to this, you can simply run AWS CLI commands just as you would outside the pipeline. Before we build our pipeline, we have some more decisions to make. First, how many branches are you going to have? I decided to base this on how many AWS accounts I had, and settled on two branches: dev and master. Dev pushes directly to my dev AWS account and master to my production AWS account. When I decide the code is ready for production, I only need to create a pull request and it will deploy to production. Next, how many steps? Two: one for the dashboard and one for the API, based on my architecture above. Below is an example of the skeleton for this design (and a link for easier copying).

https://gitlab.com/NickBenton/BitbucketPipelines-ECS/blob/master/examples/steps-skeleton.yml
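In case you just want the shape of the file, here is a minimal sketch of what such a skeleton looks like. The build image, step names, and placeholder commands are my own assumptions; only the two-branch, two-step structure comes from the design above.

```yaml
image: atlassian/default-image:2

pipelines:
  branches:
    dev:
      - step:
          name: Deploy dashboard to dev
          services:
            - docker
          script:
            - echo "dashboard deployment commands go here"
      - step:
          name: Deploy API to dev
          services:
            - docker
          script:
            - echo "API deployment commands go here"
    master:
      - step:
          name: Deploy dashboard to production
          services:
            - docker
          script:
            - echo "dashboard deployment commands go here"
      - step:
          name: Deploy API to production
          services:
            - docker
          script:
            - echo "API deployment commands go here"
```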

Setting up AWS access

Bitbucket Pipelines has a setup that makes working with AWS rather straightforward. All you have to do is set up three environment variables and the AWS CLI commands in your pipeline will automatically run against your account. Go to Settings -> Pipelines -> Environment variables. You'll have to set AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION, and AWS_SECRET_ACCESS_KEY. We will additionally set up AWS_REGISTRY_URL since we are using ECS. This covers everything needed to start deploying to our main account (for us this is dev, since most deployments go to dev). For our purposes, we had to create these variables for the production account as well.

AWS Environment Variables
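For the second account, one common approach (and an assumption on my part about naming) is to store the production credentials under prefixed repository variables and export them at the top of the master-branch steps, roughly like this:

```yaml
# Master-branch steps only: point the AWS CLI at the production account.
# The PRODUCTION_* names are placeholders; use whatever names you chose
# when creating the repository variables.
script:
  - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
  - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
  - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
  - export AWS_REGISTRY_URL=$PRODUCTION_AWS_REGISTRY_URL
```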

“Get to the good stuff already!” Okay, okay…

The Script

Fine, here’s the script for the dashboard step. We’ll go through it below.

https://github.com/NickBenton/BitbucketPipelines-ECS/blob/master/examples/script-example.yml

The main parts that are done in this script are as follows:

AWS ECR login: This will retrieve the login command to use to authenticate your Docker client to your registry.
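With the AWS CLI v1 that was current when this was written, that step looks roughly like the line below; newer CLI v2 images use get-login-password instead. This is a sketch, not the exact line from the linked script.

```yaml
# Authenticate the pipeline's Docker client against ECR (AWS CLI v1).
- eval $(aws ecr get-login --no-include-email --region ${AWS_DEFAULT_REGION})
# AWS CLI v2 equivalent:
# - aws ecr get-login-password --region ${AWS_DEFAULT_REGION} | docker login --username AWS --password-stdin ${AWS_REGISTRY_URL}
```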

Export Environment Variables: Here you will set the service, cluster, task, execution role, and build ID to be used throughout the pipeline.
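A sketch of what those exports might look like; every name and ARN here is a placeholder, and BITBUCKET_BUILD_NUMBER is the built-in Pipelines variable I use as the build ID.

```yaml
# Values are illustrative; substitute your own resource names and role ARN.
- export SERVICE_NAME="dashboard-service"
- export CLUSTER_NAME="dashboard-cluster"
- export TASK_FAMILY="dashboard-task"
- export EXEC_ROLE_ARN="arn:aws:iam::<account-id>:role/ecsTaskExecutionRole"
- export IMAGE_TAG=${BITBUCKET_BUILD_NUMBER}
```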

Docker Build, Push: These are your standard Docker commands to build the image and push it to the ECR repository you've logged into.
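For example, assuming an ECR repository named dashboard and the variables exported above:

```yaml
# Build the image, tag it with the build number, and push it to ECR.
- docker build -t dashboard:${IMAGE_TAG} .
- docker tag dashboard:${IMAGE_TAG} ${AWS_REGISTRY_URL}/dashboard:${IMAGE_TAG}
- docker push ${AWS_REGISTRY_URL}/dashboard:${IMAGE_TAG}
```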

Stop Current Task: We export the currently running task and then stop it so that we can run a new task. This is done in my build specifically because I only allow one task for my service. This part can be changed based on your needs.
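Something along these lines, assuming a single-task service as described:

```yaml
# Look up the one running task for the service and stop it so the new
# revision can take its place (assumes a desired count of 1).
- export RUNNING_TASK=$(aws ecs list-tasks --cluster ${CLUSTER_NAME} --service-name ${SERVICE_NAME} --query 'taskArns[0]' --output text)
- if [ "${RUNNING_TASK}" != "None" ]; then aws ecs stop-task --cluster ${CLUSTER_NAME} --task ${RUNNING_TASK}; fi
```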

Register New Task: Register the new task you will run. My task definition follows many of the standards, including port 5000 for Python Flask, and it ships its Docker logs to CloudWatch. Before it can deploy this way, you must have gone to CloudWatch and created a new log group of the same name.
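A trimmed sketch of such a registration, covering only the details called out above (port 5000 and awslogs); the container name, memory value, and log group name are placeholders, not the exact values from the linked script.

```yaml
# Register a new task definition revision pointing at the image just pushed.
# The awslogs group named here must already exist in CloudWatch.
- >-
  aws ecs register-task-definition
  --family ${TASK_FAMILY}
  --execution-role-arn ${EXEC_ROLE_ARN}
  --container-definitions "[{\"name\":\"dashboard\",
  \"image\":\"${AWS_REGISTRY_URL}/dashboard:${IMAGE_TAG}\",
  \"memory\":512,
  \"portMappings\":[{\"containerPort\":5000,\"hostPort\":0}],
  \"logConfiguration\":{\"logDriver\":\"awslogs\",\"options\":{
  \"awslogs-group\":\"dashboard-logs\",
  \"awslogs-region\":\"${AWS_DEFAULT_REGION}\",
  \"awslogs-stream-prefix\":\"dashboard\"}}}]"
```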

Update Service: This deploys the new task to the service you’ve listed above and begins the deployment of a new image to the EC2 instance.
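In its simplest form this is a single call; passing only the task family makes ECS use the latest registered revision:

```yaml
# Point the service at the newest task definition revision; ECS then replaces
# the stopped task with one running the new image.
- aws ecs update-service --cluster ${CLUSTER_NAME} --service ${SERVICE_NAME} --task-definition ${TASK_FAMILY}
```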

Once these steps have finished running through Bitbucket Pipelines, the Target Group associated with the ECS service will drain the old task (ECS draining), run initial health checks on the new one, and then be fully accessible.

Final Product

“That’s just one script! You told me you had two branches to two environments, deploying two containers each!”

Right again, nonexistent reader I’ve made up in my mind!

Here’s the final pipeline!

The final step, after you’ve figured out how to properly run this for one script, is to repeat it four times. In my case, I additionally added a Blue/Green strategy for my development environment. That code is too large to paste in full here, so I’ve linked it above. Note that once you do complete this, you will want to save the code as bitbucket-pipelines.yml at the root level of the repository so that Pipelines will pick it up. Then, enable Pipelines as seen below.

Settings -> Pipelines -> Settings

Now every time you push your code to dev it will automatically build and deploy both your dashboard and your API. Every time you merge a pull request to master it will build and deploy your code to your production environment.

Disclaimer Way Too Late

This is the result of my first time building code using Bitbucket Pipelines. I looked far and wide for the exact guide I have given here and could not find it, so I figured I’d publish my findings so that the next person has it more easily available. I’m sure there are many additional changes that could make this guide and/or the code more complete. Please do respond with any advice or additions. This code was used on a project for over six months and served us very well in speeding up our development process.

Enjoy!

It’s an open source project. If you add improvements, contribute! If you really did enjoy this, clap!
