How to automatically deploy new Docker image updates to an ECS service, with the Dockerfiles in Bitbucket and the Docker images in ECR.

Pablo Perez

5 min read · Jan 3, 2019


Part 0. What we already have.

We have a use case where the developers update Dockerfile versions all the time, and we need to automatically update our ECS service tasks with the corresponding new image versions. To achieve this, we will need to deploy the following.

In detail:

We have a Bitbucket repository where the Dockerfile lives and gets updated from time to time.

We have an ECS cluster where a service runs tasks from a task definition referencing the image built from that Dockerfile, which has been pushed to ECR beforehand.
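As a rough sketch, the relevant part of such a task definition could look like the following. The family, container name, account ID, region and repository name are all hypothetical placeholders, not values from this walkthrough:

{
  "family": "testimage-task",
  "containerDefinitions": [
    {
      "name": "container_name",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/testimage:latest",
      "memory": 512,
      "essential": true
    }
  ]
}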

We’ll start by creating the AWS CodePipeline.

However, you can see that CodePipeline doesn’t support Bitbucket as a source provider…

So, in order to integrate Bitbucket with CodePipeline, you will need to do the following:

Part 1. Preparing the webhooks.

1.- Get the OAuth key from Bitbucket. Click on “Settings” under your avatar and click on “OAuth”, then click on “Add consumer”.

Note down the resulting Key and Secret.

2.- This Quick Start implements the required pieces, namely a Lambda function triggered by a webhook that pulls your new Bitbucket commits, zips up the code and places it in S3. In this way you bridge Git services with AWS services such as AWS CodePipeline.

Click on “Launch Quick Start” at the following link:

https://aws.amazon.com/quickstart/architecture/git-to-s3-using-webhooks/

This will open the CloudFormation console. Please be aware that by default the stack is launched in the Ohio region, and it has to be launched in the region where you have your pipeline!

Leave the parameter fields at their defaults; you only need to add the OAuth key and secret you created before:

I have set 0.0.0.0/0 in Allowed IPs because, as I could see in the CloudWatch logs of the Lambda function (created by the Quick Start CloudFormation stack) that pulls the content and uploads it to S3, the requests came from varying IP ranges such as 18.234.32.227, 18.234.32.226 and 18.234.32.225. For further hardening, check Atlassian’s published IP ranges.

Once the CloudFormation stack completes, note down the following output values; we will need GitPullWebHookApi, OutputBucketName and the public SSH key later:

3.- Now create the webhook in your Bitbucket account.

Click “Settings” under your repo and click “Add webhook”.

Then fill in only the URL field, using the GitPullWebHookApi output value, which is the webhook endpoint.

4.- Copy the SSH public key shown in the Outputs section of the CloudFormation stack and add it to your Bitbucket repository’s access keys.

Part 2. Creating the pipeline.

1.- Now we’ll create the AWS CodePipeline from scratch. In the “Source” stage we’ll select S3 as the source provider, with the bucket name generated in the CloudFormation stack outputs (“OutputBucketName”). The object key points at the zipped repository contents, which include the Dockerfile the developers commit.

The S3 object key is composed like this:

yourbitbucketaccountuser/yourbitbucketreponame/branch/master/yourbitbucketaccountuser_yourbitbucketreponame_branch_master.zip

In my case:

pabloperfer/testimage/branch/master/pabloperfer_testimage_branch_master.zip

2.- In the next stage we’ll select CodeBuild as the build provider, create a project, and generate the output artifact containing the container name and image URI that ECS needs in order to deploy.

Click on “Insert build commands” and then “Switch to editor”.

Be very careful at this point when you create imagedefinitions.json: it needs the exact container name, which has to match the container name running in the task:

[{"name":"container_name","imageUri":"image_URI"}]
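For illustration, here is a minimal buildspec sketch you could paste into the editor. The account ID, region, repository name (testimage) and container name (container_name) are placeholders to replace with your own values, and the ECR login command assumes the AWS CLI v1 that CodeBuild images shipped at the time of writing:

version: 0.2
phases:
  pre_build:
    commands:
      # Placeholder ECR repository URI; replace with your own
      - REPOSITORY_URI=123456789012.dkr.ecr.us-east-1.amazonaws.com/testimage
      # Authenticate the Docker client against ECR (AWS CLI v1 syntax)
      - $(aws ecr get-login --no-include-email --region us-east-1)
  build:
    commands:
      # Build the image from the Dockerfile the developer committed
      - docker build -t $REPOSITORY_URI:latest .
  post_build:
    commands:
      - docker push $REPOSITORI_URI:latest
      # "name" must match the container name in the running task definition
      - printf '[{"name":"container_name","imageUri":"%s"}]' $REPOSITORY_URI:latest > imagedefinitions.json
artifacts:
  files:
    # The deploy stage reads this file to know which image to roll out
    - imagedefinitions.json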

3.- Add ECR permissions to the CodeBuild role so the build can log in to ECR and push the new image.
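As a sketch, a policy along these lines attached to the CodeBuild service role is enough for the login and push to succeed (in production you may want to scope the Resource down to your repository’s ARN instead of *):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ecr:GetAuthorizationToken",
        "ecr:BatchCheckLayerAvailability",
        "ecr:InitiateLayerUpload",
        "ecr:UploadLayerPart",
        "ecr:CompleteLayerUpload",
        "ecr:PutImage"
      ],
      "Resource": "*"
    }
  ]
}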

4.- The last stage is the deploy stage, where we’ll select the ECS cluster and service and provide the image definitions file (imagedefinitions.json), which tells ECS which image to deploy.

Part 3. From the developer’s push in Bitbucket to the deployment in ECS.

Putting it all together: the developer pushes a new Dockerfile version to Bitbucket, the webhook triggers the Quick Start’s Lambda function, which zips the repository and uploads it to S3; CodePipeline picks up the new object, CodeBuild builds the image, pushes it to ECR and emits imagedefinitions.json, and the deploy stage updates the ECS service with the new image.
