Deploying to Google Kubernetes Engine from Gitlab CI

Matt Dowds
Sep 7, 2018 · 5 min read

Intro

Google Kubernetes Engine (GKE) and Gitlab CI are two tools we are making more and more use of at John Lewis, but getting them to talk to each other is more complicated than it first appears. Gitlab has a built-in Kubernetes integration, but for it to work the cluster needs an older form of authorisation, ABAC, to be turned on. ABAC is disabled by default and Google recommends against using it.

This post will walk through an alternative way of deploying to GKE from Gitlab CI. It assumes a basic familiarity with both tools, and will take you through setting up the authorisation, pushing a Docker image to the Google Container Registry (GCR), and then deploying that image to your cluster.

Prereqs

  • A Kubernetes cluster set up in Google Kubernetes Engine
  • A project to deploy with a repo and CI pipeline set up in Gitlab
  • A Dockerfile for that application

Create service account & key

The first step is to create a service account within Google Cloud. This will be used by Gitlab CI to authenticate against the container registry and Kubernetes cluster.

To do this, go to the service accounts section of the Google Cloud Console and click the button to create a service account. Make sure it has the Storage Object Admin and Kubernetes Engine Developer roles (for the container registry and kubernetes deployment respectively).

Check the option to “Furnish a new private key” and choose the JSON option. Clicking ‘Save’ will create the service account and download the key file. Then you can add the contents of the key file as a variable within the CI/CD section of your repo settings.
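If you prefer the command line, the same account can be created with the gcloud CLI. This is a sketch only; the account name gitlab-ci and project ID my-project are placeholders you should replace with your own:

```shell
# Create the service account (names here are examples)
gcloud iam service-accounts create gitlab-ci \
  --display-name "GitLab CI deployer"

# Grant the two roles the CI jobs need:
# Storage Object Admin (push to GCR) and Kubernetes Engine Developer (deploy)
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:gitlab-ci@my-project.iam.gserviceaccount.com" \
  --role "roles/storage.objectAdmin"
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:gitlab-ci@my-project.iam.gserviceaccount.com" \
  --role "roles/container.developer"

# Create and download the JSON key file
gcloud iam service-accounts keys create key.json \
  --iam-account "gitlab-ci@my-project.iam.gserviceaccount.com"
```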

Pushing to Google Container Registry (GCR)

With the service account set up, we can now move on to setting up the Gitlab CI config itself. If you don’t have one already, create a .gitlab-ci.yml file in the root directory of your project. Add the stages needed to build and test your application (for more info on this, see the Gitlab CI documentation). Once these stages are run, we’ll want two new ones: one to build and push the Docker image, and one to run the actual deployment.
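The two new stages need declaring at the top level of the config alongside your existing ones. A minimal stages list for this setup (the build and test stage names are just examples) might look like:

```yaml
stages:
  - build
  - test
  - docker_image
  - deploy
```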

The job to build and push the docker image will look something like this:

docker_image:
  image: 'docker:latest'
  stage: docker_image
  variables:
    DOCKER_IMAGE_TAG: 'eu.gcr.io/my-project/my-application'
  script:
    # Build the image
    - docker build --cache-from "${DOCKER_IMAGE_TAG}" -t "${DOCKER_IMAGE_TAG}" .
    # Log in to Google Container Registry
    - echo "$SERVICE_ACCOUNT_KEY" > key.json
    - docker login -u _json_key --password-stdin https://eu.gcr.io < key.json
    # Push the image
    - docker push "${DOCKER_IMAGE_TAG}"
  only:
    - master

The format of the image name is similar to that required by the Gitlab registry: eu.gcr.io (or whatever your region’s registry is called; check in the Google Cloud Console), followed by the name of your Google Cloud project, followed by what you actually want to call the image.

In this example we’re not specifying a version for the image, so it will default to latest and overwrite the previous one. If you want to version your images, you could use the ${CI_PIPELINE_IID} variable, which refers to the project-level ID of the currently running pipeline.
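For example, swapping the variable declaration in the job above for something like the following would give each pipeline its own image tag:

```yaml
variables:
  DOCKER_IMAGE_TAG: 'eu.gcr.io/my-project/my-application:${CI_PIPELINE_IID}'
```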

Performing the deployment

To actually deploy to the Kubernetes cluster, you’ll need another job to run after the docker image pushing has completed. The first step is to authenticate against the cluster, and as the built-in integration in Gitlab has the issues described above, you need to do this manually using the Google Cloud SDK.

Luckily, Google has created a Docker image that contains the SDK and kubectl (google/cloud-sdk), which you can use as the image to run this job in. However, it is a slimmed-down image, and I found it necessary to create a custom one based on it that installs the envsubst program (done by installing the gettext package), which is used to substitute environment variables into the various Kubernetes YAML payload files.
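A sketch of such a custom Dockerfile, assuming the Debian-based google/cloud-sdk image (on Debian, envsubst lives in the gettext-base package):

```dockerfile
FROM google/cloud-sdk:latest

# envsubst ships in the gettext package (gettext-base on Debian)
RUN apt-get update && \
    apt-get install -y --no-install-recommends gettext-base && \
    rm -rf /var/lib/apt/lists/*
```

Build and push this to a registry your runners can pull from, then reference it as the job’s image (gcloud-with-envsubst in the example below is just an illustrative name).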

The deployment job should look something like this:

deploy:
  image: 'gcloud-with-envsubst:latest'
  stage: deploy
  variables:
    DOCKER_IMAGE_TAG: 'eu.gcr.io/my-project/my-application:latest'
  script:
    # Authenticate with GKE
    - echo "$SERVICE_ACCOUNT_KEY" > key.json
    - gcloud auth activate-service-account --key-file=key.json
    - gcloud config set project my-project
    - gcloud config set container/cluster my-cluster
    - gcloud config set compute/zone europe-west2-a
    - gcloud container clusters get-credentials my-cluster --zone europe-west2-a
    # Do the deployment
    - cat kubernetes/my-application-deployment.yaml | envsubst | kubectl apply -f -
  environment:
    name: test
  only:
    - master

There are a few things going on here. The image property is set to the custom Cloud SDK image, but you can use the base one if you don’t need any additional programs installed.

We again declare an environment variable containing the image tag, and as before we’re using latest here, but you could specify a more precise version if you tagged one in the previous job.

The first few lines of the script do the authentication, using the service account key, and set the project, cluster and zone we want to deploy to. Finally it uses get-credentials to configure kubectl to point to that cluster.

The second part of the script grabs a Kubernetes deployment payload stored in the /kubernetes directory of the repo, uses envsubst to pass the DOCKER_IMAGE_TAG variable in and then tells kubectl to apply it. (For more information on Kubernetes deployments, see the documentation).
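For reference, the deployment payload itself is a standard Kubernetes manifest with the image field left as a placeholder for envsubst to fill in. The manifest below is only a sketch; all names, replica counts and ports are illustrative:

```yaml
# kubernetes/my-application-deployment.yaml (illustrative)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-application
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-application
  template:
    metadata:
      labels:
        app: my-application
    spec:
      containers:
        - name: my-application
          # envsubst replaces this placeholder with the value from the CI job
          image: ${DOCKER_IMAGE_TAG}
          ports:
            - containerPort: 8080
```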

Environment Section

One other thing to note is the environment section. Within your Gitlab repo you can name your environments, then refer to the one you’re deploying to in your CI config. This populates a nice interface within Gitlab that shows you when deployments happened in each environment, as well as which commit was deployed.

Summary

And that’s it! Once you’ve got a service account key, a Gitlab CI config with these two jobs configured, and a Kubernetes deployment for kubectl to apply, every push to master will deploy your application to your cluster in Google Cloud.

Hopefully Gitlab will update their native integration soon so it will work with the newer RBAC authentication in GKE. Until then this is a fairly simple way to get these two powerful tools to talk to one another.

Matt Dowds

Full-stack Software Engineer at John Lewis & Partners

John Lewis & Partners Software Engineering

Stories from engineers within John Lewis
