Dev Story: Deploy to GCP Cloud Run with Bitbucket Pipelines

An awesome serverless stack

Siwawes Wongcharoen
Jun 2 · 5 min read

Hello there!

In this post I will talk about how to use Bitbucket Pipelines to build a Docker image and deploy a Docker container on GCP Cloud Run.

Take a look at that: I mentioned Bitbucket Pipelines, Docker images, Docker containers, and GCP Cloud Run all in one sentence! I will start by giving a brief intro to each one, so if you are already familiar with them, jump to the next section.


Bitbucket Pipelines

Integrated CI/CD for Bitbucket Cloud that’s trivial to set up, automating your code from test to production.


Pipelines is a CI tool from Atlassian that comes fully integrated with Bitbucket, one of the most popular source control services. To create a pipeline, you just write a yml file that controls the worker. You can use it to run tests, deploy servers, or whatever else you can think of. If you want something more advanced, you can use a base Docker image from AWS or Gcloud to run your pipeline commands. In the simple use case, you can just run a bash script to perform some work.
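For example, a minimal bitbucket-pipelines.yml that runs a script on every push might look like this (the image and commands here are just placeholders for illustration):

```yaml
# A minimal pipeline: one step that runs on every push to any branch.
pipelines:
  default:
    - step:
        image: node:10
        script:
          - npm install
          - npm test
```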


Docker image and Docker container

Package Software into Standardized Units for Development, Shipment and Deployment

In short, a Docker image is a packaged, immutable snapshot of your application and everything it needs to run, and a Docker container is a running instance of that image.

GCP Cloud Run


Cloud Run: allowing you to run any stateless HTTP container in a fully managed environment, paying only for the exact resources you use.
@steren

GCP Cloud Run was announced at Google I/O 2019 and, ah, this is a game-changing product from Google (if it's not too expensive)! You just build a Docker image and send it to Google's infrastructure, and Google takes care of the rest. Admittedly, AWS has Fargate, and I think they are very similar products, but Fargate is a bit more complex than Cloud Run: you have to consider many things, such as VPCs, target groups, and load balancers. With Cloud Run, you just build a stateless container that listens on the port provided by the Cloud Run environment. Yeah, it's as easy as that. :-)

So the Cloud Run concept is very good. If you have used Firebase Cloud Functions before, you will love Cloud Run.

Cloud Run Overview

OK Google, let's roll.

First, we need a few things before we start:

  1. A Google Cloud account and project
  2. A Bitbucket account
  3. Node.js and Express skills
  4. Docker skills
  5. The gcloud CLI
  6. Some snacks :-)

I'm going to take you through the following steps, one by one:

  1. Example Application
  2. Build Docker image
  3. GCP key
  4. Bitbucket Pipelines and the gcloud CLI

1. Example Application

I will use a Node application. If you prefer another language, see the examples here.

index.js

The important part is at line 11.

Line 11: use the PORT from the environment, or default to 8080.

When this code runs on Cloud Run, Cloud Run will inject a port through the PORT environment variable and expose your service to the public automatically.


2. Build Docker image

Dockerfile

OK, you will see there is nothing special here; it is just a normal Dockerfile. But don't forget to build the image and try to run it locally.
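The Dockerfile gist is not embedded here either; a typical Node Dockerfile for this kind of app looks something like this (file names and the Node version are assumptions):

```Dockerfile
FROM node:10

WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm install --only=production

# Copy the rest of the source code.
COPY . .

EXPOSE 8080
CMD ["node", "index.js"]
```

To test it locally: `docker build -t my-app .` and then `docker run -p 8080:8080 my-app`.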


3. GCP Key

Now take a rest from coding for a bit. Put on your DevOps hat and head over to the GCP console for your project.

Gcloud menu

Find Service accounts under IAM & admin, then create a new service account.

You need to grant these IAM roles to the newly created service account:

  1. Storage Admin: for pushing Docker images to GCR.
  2. Cloud Run Admin: for deploying a service to Cloud Run.
  3. Service Account User: for letting the deployer act as the Cloud Run runtime service account.
Required IAM roles

As the last step, you need to download the JSON key file. Again, keep it private.
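If you prefer the CLI over the console, the same setup can be sketched with gcloud. The account name bitbucket-deployer and the PROJECT_ID variable are placeholders of my own choosing:

```shell
# Create the service account (the name is just an example).
gcloud iam service-accounts create bitbucket-deployer --project "$PROJECT_ID"

SA="bitbucket-deployer@${PROJECT_ID}.iam.gserviceaccount.com"

# Grant the three roles from the list above.
for ROLE in roles/storage.admin roles/run.admin roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "serviceAccount:$SA" --role "$ROLE"
done

# Download the JSON key file. Again, keep it private!
gcloud iam service-accounts keys create gcloud-api-key.json --iam-account "$SA"
```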


4. Bitbucket Pipelines and the gcloud CLI

Finally, we can put it all together. You should already have the Gcloud service account key that you created in step 3 and a Bitbucket repo.

bitbucket-pipelines.yml
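The original gist is not embedded here, so the following is a hypothetical reconstruction consistent with the line-by-line explanation below; the exact line numbers refer to the original file and will not match this sketch. GCLOUD_PROJECT_ID and GCLOUD_API_KEYFILE are assumed Bitbucket repository variables, and the service name, region, and tag scheme are placeholders:

```yaml
options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          image: node:10
          script:
            - npm install
            - npm test
  tags:
    release-production-*:
      - step:
          image: google/cloud-sdk:latest
          deployment: production
          script:
            # GCLOUD_API_KEYFILE is the base64-encoded JSON key from step 3.
            - export IMAGE_NAME="gcr.io/$GCLOUD_PROJECT_ID/my-app:$BITBUCKET_TAG"
            - docker build -t $IMAGE_NAME .
            - echo $GCLOUD_API_KEYFILE | base64 -d > ./gcloud-api-key.json
            - gcloud auth activate-service-account --key-file gcloud-api-key.json
            - gcloud auth configure-docker --quiet
            - docker push $IMAGE_NAME
            - gcloud run deploy my-app --image $IMAGE_NAME --platform managed --region asia-southeast1 --allow-unauthenticated --project $GCLOUD_PROJECT_ID
```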

Let me explain:

Lines 2–3: tell Pipelines that we will use the docker command.

Lines 6–13: listen for changes on the master branch; when a change occurs, run a container based on the Node version 10 image, then run two commands.

Note on line 12: you can use npm ci instead of npm install.

Lines 15–16: listen for changes on tags matching release-production-*.

Line 19: run a container based on the Google Cloud SDK image, so we can use the gcloud command from this image.

Line 22: tell the agent that this is the production deployment workflow.

Lines 29–30: set variables on the fly, so you can read this yml file more easily.
For IMAGE_NAME, I recommend the format [HOSTNAME]/[PROJECT-ID]/[IMAGE]:[TAG].

Docker Image on GCR

Line 33: build the Docker image from the source code.

Lines 36–37: authenticate gcloud with the service account.

Line 40: register GCR with the Docker command.

Line 43: push the image to GCR.

Line 46: use the gcloud command to deploy the image from GCR to Cloud Run.

Within a minute you will see your new revision in the Gcloud console.

Cloud Run service revisions

Note: if you get stuck with bitbucket-pipelines.yml, the pipelines validator will help you.


Conclusion

Flexible environment and production grade => Cloud Run
Simple CI tool => Bitbucket Pipelines
Magic => Docker

In conclusion, if you are looking for a flexible place where you can host your applications with no server administration skills, get automatic scaling, and not pay too much, then Cloud Run is the answer. You can easily pack your application into a Docker image; just make sure that your application is stateless. And if you keep your source code on Bitbucket, Pipelines should be the first option for your CI tool.

Thank you

Please note that this is my first blog post in English. If you find many mistakes, please don't be too hard on me.


Thanks to Antony Harfield

