Creating a CI/CD pipeline for GKE using CloudBuild and Spinnaker

In this article, I will describe how to set up a basic CI/CD pipeline using Google CloudBuild and Spinnaker that deploys a new image to an existing Kubernetes Deployment on a GKE cluster.

Constantine Yurevich · SegmentStream · May 20, 2019

Update January 2020: We have completely moved our CI/CD to GitHub Actions, as Spinnaker + CloudBuild felt like overkill for a startup team.

At SegmentStream we believe that DevOps is not a profession but rather a part of the company’s culture. Therefore we try to automate all the processes as much as possible so that developers focus on development instead of managing infrastructure.

Note: this tutorial is not production-ready; there is no testing, no canary deployments, etc. But it should give a good overview of what Spinnaker is capable of.

Prerequisites:

  1. A GKE cluster is up and running
  2. An app Kubernetes Deployment is up and running on the GKE cluster (a minimal example is sketched after this list)
  3. Spinnaker is set up with proper permissions and is integrated with the GKE cluster
  4. A Spinnaker fiat service account with proper permissions is set up to enable automatic pipeline triggering (not required if your pipelines don’t have restrictions for specific roles)
  5. Spinnaker is configured to listen for Google CloudBuild Pub/Sub notifications
  6. Slack notifications for Spinnaker are enabled and set up
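
Prerequisite 2 assumes an existing Deployment in the cluster. A minimal sketch of what it might look like (the sample-app name, labels, and port are assumptions used throughout this tutorial):

```yaml
# Minimal sketch of the existing Deployment that the pipeline will patch.
# The "sample-app" name and the image path are assumptions for this tutorial.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sample-app
  template:
    metadata:
      labels:
        app: sample-app
    spec:
      containers:
        - name: sample-app
          image: gcr.io/<GCP_PROJECT>/sample-app:latest
          ports:
            - containerPort: 8080
```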

Note: I understand that setting up GKE and Spinnaker might be a challenge in itself. Let me know in the comments section if you would like a detailed guide about setting up Spinnaker on GKE.

What we want to achieve:

  1. New code is pushed to the GitHub master branch
  2. An automated CloudBuild trigger starts a build, which builds a Docker image and pushes it to the Google Container Registry (gcr.io)
  3. Once the build is successful, the Spinnaker pipeline is triggered and deploys the new Docker image to the staging server. As a bonus, a message is sent to the specified Slack channel.
  4. A manual judgment step is invoked, where developers can decide whether or not to promote the new changes to the production server. A message about either decision is sent to the specified Slack channel.
  5. If approved, the pipeline continues and deploys the image to the production server. A message is sent to the specified Slack channel once the new image is deployed to the production server.

Git repository file structure:

Note: I didn’t put any specific source files here to keep this tutorial as language-agnostic as possible.
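
A minimal repository layout that matches this setup might look like the following (everything except cloudbuild.spinnaker.yaml is an assumption about your app):

```
.
├── Dockerfile                  # how the app image is built
├── cloudbuild.spinnaker.yaml   # CloudBuild config used by the Spinnaker trigger
└── src/                        # application source code
```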

We usually don’t use the default cloudbuild.yaml name for the CloudBuild config because that one is used for the automated GitHub checks.

cloudbuild.spinnaker.yaml
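
Here is a minimal sketch of what this config might contain, assuming the image is called sample-app:

```yaml
steps:
  # Build the Docker image and tag it with the short commit SHA
  - name: gcr.io/cloud-builders/docker
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/sample-app:$SHORT_SHA', '.']
  # Add the "latest" tag to the same image
  - name: gcr.io/cloud-builders/docker
    args: ['tag', 'gcr.io/$PROJECT_ID/sample-app:$SHORT_SHA', 'gcr.io/$PROJECT_ID/sample-app:latest']
# Push both tags to the Google Container Registry and report them as build artifacts
images:
  - 'gcr.io/$PROJECT_ID/sample-app:$SHORT_SHA'
  - 'gcr.io/$PROJECT_ID/sample-app:latest'
```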

The file cloudbuild.spinnaker.yaml is responsible for:

  1. Building a Docker image
  2. Tagging this image with SHORT_SHA and latest
  3. Exporting the image artifacts to the Google Container Registry

Step 1: Creating a Google CloudBuild trigger

This trigger will listen for all changes in the master branch and run your build.

Note: once the trigger is invoked, CloudBuild will automatically create a Pub/Sub topic where all your build status updates will be published.
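
For reference, the trigger created in the console corresponds roughly to the following Cloud Build trigger resource (the trigger name and the GitHub owner/repository are assumptions):

```yaml
# Rough sketch of the Cloud Build trigger resource; owner/repo are assumptions.
name: sample-app-master
description: Build sample-app on every push to master
github:
  owner: <GITHUB_OWNER>
  name: sample-app
  push:
    branch: ^master$
# Use the non-default config file name discussed above
filename: cloudbuild.spinnaker.yaml
```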

Step 2: Creating a new Spinnaker pipeline and configuring its expected artifacts

In this step, we will configure a Spinnaker pipeline to listen for the Pub/Sub messages published by CloudBuild.

First of all, you need to set up the Expected Artifacts. These artifacts should be provided for your pipeline to run.

From the cloudbuild.spinnaker.yaml we know that our build will provide the following images:

  • gcr.io/<GCP_PROJECT>/sample-app:<SHORT_SHA>
  • gcr.io/<GCP_PROJECT>/sample-app:latest

This is exactly what needs to be specified in the Expected Artifacts section of the pipeline configuration. The artifact definition should not include image tags; Spinnaker will detect them automatically.

For the Display name just use docker_image for now. This is just a reference name that will be used later on.
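
In the pipeline JSON, this corresponds to an expectedArtifacts entry roughly like the one below (shown as YAML for readability; in real pipelines the id is a generated UUID):

```yaml
# Rough sketch of the expected artifact in the pipeline configuration.
expectedArtifacts:
  - id: docker_image            # in real pipelines this is a generated UUID
    displayName: docker_image
    matchArtifact:
      type: docker/image
      # No tag here: Spinnaker binds the tag from the incoming build artifacts
      name: gcr.io/<GCP_PROJECT>/sample-app
    useDefaultArtifact: false
    usePriorArtifact: false
```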

Step 3: Configuring pipeline automatic trigger

Go to the Automatic Triggers section, click Add Trigger, and configure your trigger with the following settings (a sketch of the resulting trigger configuration follows this list):

  • Type: Pub/Sub
  • Pub/Sub System Type: google
  • Subscription Name: select the CloudBuild subscription name you defined during the Spinnaker installation when configuring the Google CloudBuild Pub/Sub notifications
  • Payload Constraints: we might have other CloudBuild triggers that build feature or other branches besides master. In our company, only builds from the master branch are considered stable and may have the latest tag, so we add a constraint that only builds from the master branch can trigger this pipeline
  • Attribute Constraints: by default, CloudBuild publishes several kinds of messages to Pub/Sub, while our pipeline is only interested in successful builds
  • Run As User: select the fiat service account that will trigger your pipeline. More info can be found here
  • Artifact Constraints: the pipeline will be triggered only if the Pub/Sub message contains the artifacts defined in this section; just select the docker_image artifact we defined in Step 2.
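
Put together, the automatic trigger in the pipeline JSON looks roughly like this (shown as YAML for readability; the subscription name, payload-constraint key, and service account name are assumptions based on the settings above):

```yaml
# Rough sketch of the automatic trigger in the pipeline configuration.
triggers:
  - type: pubsub
    enabled: true
    pubsubSystem: google
    subscriptionName: cloud-builds          # subscription configured during Spinnaker setup (assumption)
    payloadConstraints:
      substitutions.BRANCH_NAME: master     # only builds of the master branch (key is an assumption)
    attributeConstraints:
      status: SUCCESS                       # only successful builds
    runAsUser: spinnaker-pipeline-trigger   # fiat service account (assumption)
    expectedArtifactIds:
      - docker_image                        # the expected artifact defined in Step 2
```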

Step 4: Creating a pipeline stage which deploys a new image to the staging server

Click Add Stage to add the first stage of the pipeline. Define its name and type:

  • Type: Patch (Manifest)
  • Stage Name: Update Staging Docker Image

In this stage, we will just patch our existing Kubernetes Deployment manifest and update the image to the new one.

Set up the Patch (Manifest) Configuration:

  • Account: select your staging GKE cluster
  • Namespace: select your staging Kubernetes Deployment namespace
  • Kind: deployment
  • Selector: Choose a static target
  • Name: select the name of the Kubernetes Deployment you would like to patch
  • Manifest Source: Text
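
With Text selected, the patch body is pasted directly into the stage. A minimal patch body for our assumed sample-app Deployment could look like this:

```yaml
# Minimal patch body: only the container image is updated.
# The container name "sample-app" is an assumption; note there is no image tag,
# as Spinnaker substitutes the tag from the artifact bound at trigger time.
spec:
  template:
    spec:
      containers:
        - name: sample-app
          image: gcr.io/<GCP_PROJECT>/sample-app
```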

In this patch body, we tell Spinnaker to patch only a specific part of our Kubernetes Deployment manifest. It is important not to specify any image tags, as Spinnaker will extract them automatically from the artifacts detected when the pipeline was triggered.

Note: completing this step is only possible if your Spinnaker is properly installed and integrated with the GKE cluster.

The last bonus thing we would like to set up for this stage is sending a Slack message once the deployment to the staging server is completed:

  • Notify via: Slack
  • Slack Channel: #notify-builds (or whatever channel your Slack bot has access to)
  • Notify when: This stage is complete

Step 5: Creating a pipeline stage for manual approval before the deployment to the production server

Click Add Stage to add the next stage of the pipeline. Define its name, type, and configuration:

  • Type: Manual Judgment
  • Stage Name: Manual Approval

Set up the Manual Judgment Configuration:

  • Instructions: Would you like to deploy Sample App to the Production server?
  • Propagate Authentication: checked
  • Send Notifications: checked

Click Add Notification Preference to set up Slack notifications about the Manual Judgment process:

  • Notify via: Slack
  • Slack Channel: #notify-builds (or whatever channel your Slack bot has access to)
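
In the pipeline JSON, this whole stage boils down to something like the following (shown as YAML for readability; the Slack channel is whatever you configured above):

```yaml
# Rough sketch of the Manual Judgment stage in the pipeline configuration.
- type: manualJudgment
  name: Manual Approval
  instructions: Would you like to deploy Sample App to the Production server?
  propagateAuthenticationContext: true
  sendNotifications: true
  judgmentInputs: []
  notifications:
    - type: slack
      address: notify-builds
      level: stage
      when:
        - manualJudgment            # judgment is required
        - manualJudgmentContinue    # someone approved
        - manualJudgmentStop        # someone rejected
```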

Step 6: Creating a pipeline stage which deploys a new image to the production server

Click Add Stage to add the last stage of the pipeline. Define its name and type:

  • Type: Patch (Manifest)
  • Stage Name: Update Prod Docker Image

In this stage, we will patch our existing production Kubernetes Deployment manifest and update the image to the new one.

Set up the Patch (Manifest) Configuration the same way as for the staging server.

The only difference is that the production cluster and/or namespace should be specified instead of the staging ones.

Set up the Slack notification just as for the staging deployment and click Save.

Done! Now your basic CI/CD pipeline is set up, and it will be triggered the next time new changes are pushed to the master branch.

