Creating Build Triggers in Google Cloud Container Engine

We use Google Cloud Platform to deploy ML models to the cloud as containers and serve them with Kubernetes Engine. For developers, pushing code changes to GitHub or BitBucket and then manually running through the docker build and gcloud steps to deploy is far from ideal. Thankfully, with Build Triggers, after the first deployment you can automate rebuilding the image in the cloud whenever changes are pushed to GitHub or BitBucket.
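For context, the manual flow that a trigger replaces looks roughly like this. This is a sketch: the project ID and image name are placeholders, and `gcloud auth configure-docker` assumes a reasonably recent gcloud install (older versions used `gcloud docker -- push` instead).

```shell
# Build the container image locally, tagging it for Google Container Registry.
# PROJECT_ID and myimage are placeholders for your own project and image name.
docker build -t gcr.io/PROJECT_ID/myimage:new .

# Register gcloud as a Docker credential helper for gcr.io, then push.
gcloud auth configure-docker
docker push gcr.io/PROJECT_ID/myimage:new
```

Every code change means repeating these steps by hand, which is exactly the friction a build trigger removes.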

The process is reasonably straightforward. Head into the Google Cloud Console and open Container Engine, then select Build Triggers and Add Trigger. Choose the flavour of remote repository you want to synchronise; we choose BitBucket here. You will need to run through some authorisation steps to link the account. Select the repository you want to link, and Google Cloud will begin mirroring it. Once this finishes, you will be presented with some additional configuration. Give the trigger a memorable name, and in the image name section, we chose to go with:

gcr.io/alixir-199021/myimage:new

The new tag makes it easy for us to tell which image is the latest build. To finish, simply complete the prompts and create the trigger.
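Because the same tag is reused on every build, it can be worth confirming which image digest it currently points at. Assuming the image name above, something like this works:

```shell
# List tags and digests for the image; the row tagged `new` is whatever
# the most recent successful build produced.
gcloud container images list-tags gcr.io/alixir-199021/myimage
```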

Now, make a change to your local code and push it up to your remote repo as you usually would. With any luck, in the ‘Build History’ section of Container Registry you will see your container image being rebuilt in response to the remote code change. Awesome!
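You can also watch the same build history from the command line. On current gcloud versions the commands below should work (older installs used `gcloud container builds list`); the build ID is a placeholder taken from the list output:

```shell
# Show the most recent builds for the project, newest first.
gcloud builds list --limit=5

# Stream the logs of a specific build (BUILD_ID is a placeholder).
gcloud builds log BUILD_ID
```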

Once it has built, you can then perform a rolling update on your Kubernetes deployment and send it live. Of course, there are many other ways you can utilise build triggers, such as pushing to a staging server, but for our workflow this keeps it simple and manageable.
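As a sketch of that rolling update, kubectl can point the deployment at the freshly built image. The deployment and container names here are assumptions for illustration; note that because the `new` tag is reused, Kubernetes only triggers a rollout when the image reference actually changes, so pinning to the build's digest (taken from `list-tags` output, DIGEST being a placeholder) is the reliable approach:

```shell
# Point the deployment's container at the freshly built image by digest,
# which guarantees the new build is pulled even though the tag is reused.
kubectl set image deployment/my-model-deployment \
    my-model=gcr.io/alixir-199021/myimage@sha256:DIGEST

# Watch the rollout complete.
kubectl rollout status deployment/my-model-deployment
```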