How to Set Up a Deployment Pipeline on Google Cloud with Cloud Build and Cloud Functions

Automatically deploying Cloud Function instances when changes get pushed to your Git repositories.

Ivam Luz
CI&T
9 min read · Jul 28, 2020



Introduction

In my last two articles, I showed How to Set Up a Deployment Pipeline on Google Cloud with Cloud Build, Container Registry and Cloud Run and also How to Develop, Debug and Test your Python Google Cloud Functions on Your Local Dev Environment.

In this article, we’ll build on top of the previous two, showing how to deploy Cloud Functions with Cloud Build and how to check the behavior of our Cloud Functions after they are deployed, to make sure the behavior we saw while developing with the Functions Framework for Python matches that of the deployed version.

Disclaimer: all opinions expressed are my own, and represent no one but myself.

Cloud Build

Cloud Build is a service that executes your builds on Google Cloud Platform infrastructure. Cloud Build can import source code from Google Cloud Storage, Cloud Source Repositories, GitHub, or Bitbucket, execute a build to your specifications, and produce artifacts such as Docker containers or Java archives.

Cloud Build executes your build as a series of build steps, where each build step is run in a Docker container. A build step can do anything that can be done from a container irrespective of the environment. To perform your tasks, you can either use the supported build steps provided by Cloud Build or write your own build steps.

Reference: https://cloud.google.com/cloud-build/docs

Cloud Functions

Google Cloud Functions is a serverless execution environment for building and connecting cloud services. With Cloud Functions you write simple, single-purpose functions that are attached to events emitted from your cloud infrastructure and services. Your function is triggered when an event being watched is fired. Your code executes in a fully managed environment. There is no need to provision any infrastructure or worry about managing any servers.

Cloud Functions can be written using JavaScript, Python 3, Go, or Java runtimes on Google Cloud Platform. You can take your function and run it in any standard Node.js (Node.js 6, 8 or 10), Python 3 (Python 3.7), Go (Go 1.11 or Go 1.13) or Java (Java 11) environment, which makes both portability and local testing a breeze.

Reference: https://cloud.google.com/functions/docs/concepts/overview

Set Up the GCP Project

Create the Project

To follow this tutorial, you’ll need to have access to a GCP Project. If you don’t have one already, follow these steps to create it:

  1. Access the GCP Console, enter a name for your new project and click the CREATE button;
  2. Once your project is created, make sure it’s selected on the top-left corner, right beside the Google Cloud Platform logo.
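
If you prefer the command line, the project can also be created with the gcloud CLI. A minimal sketch, assuming the Cloud SDK is installed and you are authenticated (the project ID below is a placeholder):

  # Create a new project; replace my-sample-project with a globally unique ID.
  gcloud projects create my-sample-project --name="My Sample Project"

  # Point gcloud at the new project for the commands that follow.
  gcloud config set project my-sample-project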

Enable the Required APIs

  1. From the top-left menu, select APIs & Services, then click the ENABLE APIS AND SERVICES button;
  2. Enable Cloud Build API, Cloud Functions API and Cloud Resource Manager API.
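
The same APIs can also be enabled from the command line. A sketch, assuming gcloud is already pointed at your project:

  # Enable the three APIs the pipeline depends on.
  gcloud services enable \
    cloudbuild.googleapis.com \
    cloudfunctions.googleapis.com \
    cloudresourcemanager.googleapis.com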

The Cloud Resource Manager API “Creates, reads, and updates metadata for Google Cloud Platform resource containers.” As far as I can tell, this API is needed because the deploy step checks the caller’s permissions on the project (the testIamPermissions call visible in the error below), and that check goes through the Cloud Resource Manager API. If we try to run our pipeline without this API enabled, it will fail with a message similar to this:

ERROR: (gcloud.functions.deploy) User [<project-number>@cloudbuild.gserviceaccount.com] does not have permission to access project [<project-id>:testIamPermissions] (or it may not exist): Cloud Resource Manager API has not been used in project <project-number> before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudresourcemanager.googleapis.com/overview?project=<project-number> then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.

What’s weird here is that this API seems to be needed only for the first deployment to succeed. After that, deployments keep working even if the API is disabled again. I’m not sure why this happens, nor why it isn’t clearly stated in the docs.

Configure the Required IAM Permissions

In order for our Cloud Build pipeline to work properly, we need to update the default service account used by the service, identified by the address <project-number>@cloudbuild.gserviceaccount.com, with some new permissions. To do so:

  • From the top-left menu, select IAM & Admin;
  • Find the service account identified by <project-number>@cloudbuild.gserviceaccount.com;
  • Edit the service account and add the Cloud Functions Admin and Service Account User roles.

Cloud Functions Admin is needed so Cloud Build has the permissions necessary to deploy the Cloud Function instances. It’s also necessary, along with the Service Account User role, so the Cloud Function service may be configured to allow access from unauthenticated users with the --allow-unauthenticated flag.
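
If you prefer the command line, the same grants can be applied with gcloud. A sketch, assuming gcloud is authenticated and pointed at your project:

  # Look up the project number used in the Cloud Build service account address.
  PROJECT_ID=$(gcloud config get-value project)
  PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')

  # Grant the Cloud Functions Admin role to the default Cloud Build service account.
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
    --role="roles/cloudfunctions.admin"

  # Grant the Service Account User role as well.
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
    --role="roles/iam.serviceAccountUser"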

Without the Cloud Functions Admin role (I first tried with Cloud Functions Developer instead), the following error happens, though it is presented as a warning:

WARNING: Setting IAM policy failed, try "gcloud alpha functions add-iam-policy-binding sample_http --member=allUsers --role=roles/cloudfunctions.invoker"

While without the Service Account User role, an error similar to the following happens:

ERROR: (gcloud.functions.deploy) ResponseError: status=[403], code=[Forbidden], message=[Missing necessary permission iam.serviceAccounts.actAs for $MEMBER on the service account <project-id>@appspot.gserviceaccount.com.
Ensure that service account <project-id>@appspot.gserviceaccount.com is a member of the project <project-id>, and then grant $MEMBER the role 'roles/iam.serviceAccountUser'.
You can do that by running 'gcloud iam service-accounts add-iam-policy-binding <project-id>@appspot.gserviceaccount.com --member=$MEMBER --role=roles/iam.serviceAccountUser'
In case the member is a service account please use the prefix 'serviceAccount:' instead of 'user:'.]

The Sample Repository

The sample repository we’ll use for this tutorial is the same used on the previous one, where we talked about How to Develop, Debug and Test your Python Google Cloud Functions on Your Local Dev Environment. It provides two very basic functions:

  • sample_http listens for HTTP events and returns a hello message;
  • sample_pubsub receives Pub/Sub messages and logs their contents.
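
For reference, here is a minimal sketch of what these two functions look like. The signatures follow the Python 3.7 runtime conventions; the bodies are illustrative assumptions, not the exact code from the repository:

  import base64

  def sample_http(request):
      # Respond to an HTTP request with a hello message; the 'subject'
      # query-string parameter, if present, is echoed back.
      subject = request.args.get('subject', 'World')
      return f'Hello {subject}!'

  def sample_pubsub(event, context):
      # Log the contents of a Pub/Sub message; the payload arrives
      # base64-encoded in event['data'].
      data = base64.b64decode(event['data']).decode('utf-8') if 'data' in event else ''
      print(f'Message: {data}')
      print(f'Attributes: {event.get("attributes")}')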

Configuring our Cloud Build Pipeline

The steps of our pipeline are defined in a YAML file called cloudbuild.yaml. Our pipeline is composed of two steps:

  1. The first step is responsible for deploying the HTTP-triggered function;
  2. The second step is responsible for deploying the Pub/Sub-triggered function.
cloudbuild.yaml — Our Cloud Build pipeline file
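
Here is a sketch of such a file, reflecting the structure described in this section. The step IDs and builder image are assumptions; the helper scripts are the ones discussed below:

  steps:
    # Deploy the HTTP-triggered function.
    - id: 'deploy-http'
      name: 'gcr.io/cloud-builders/gcloud'
      entrypoint: 'bash'
      args: ['./deploy-http.sh']
      waitFor: ['-']

    # Deploy the Pub/Sub-triggered function.
    - id: 'deploy-pubsub'
      name: 'gcr.io/cloud-builders/gcloud'
      entrypoint: 'bash'
      args: ['./deploy-pubsub.sh']
      waitFor: ['-']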

Some things worth noticing in the file above:

  • Each step in the pipeline makes use of a helper script that hides the complexity of the commands used to deploy each function: deploy-http.sh deploys the HTTP-triggered cloud function, while deploy-pubsub.sh deploys the Pub/Sub-triggered one (see the sketches right after this list);
  • The steps are configured to run in parallel. This is achieved with the waitFor: ["-"] parameter. For more information about this behavior, check Configuring the order of build steps in the official docs.
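
Here is a minimal sketch of what the two helper scripts might look like. The function names follow the article; the runtime, topic name, and remaining flags are assumptions:

  #!/usr/bin/env bash
  # deploy-http.sh: deploy the HTTP-triggered function and allow
  # unauthenticated access, as discussed in the IAM section above.
  set -euo pipefail

  gcloud functions deploy sample_http \
    --runtime python37 \
    --trigger-http \
    --allow-unauthenticated

  #!/usr/bin/env bash
  # deploy-pubsub.sh: deploy the Pub/Sub-triggered function; with
  # --trigger-topic, the topic is created if it does not exist yet.
  set -euo pipefail

  gcloud functions deploy sample_pubsub \
    --runtime python37 \
    --trigger-topic sample_topic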

Set Up the Cloud Build Trigger

With everything in place, it’s now time to set up our Cloud Build Trigger. To do so, follow these steps:

  • From the top-left menu, select Cloud Build, select Triggers in the left menu and then click the Connect repository button:
The Cloud Build Triggers page

  • Select GitHub (Cloud Build GitHub App), and click Continue:

Selecting the source to configure the Cloud Builder trigger
  • Authorize Google Cloud Build access to GitHub:
GitHub authorization page
  • Install the Google Cloud Build GitHub App:
Cloud Build GitHub App installation prompt
  • Select the GitHub account to install the Google Cloud Build GitHub App:
Selecting the account to install the Google Cloud Build app for GitHub
  • Then select the repositories you want Cloud Build to have access to:
Selecting the repository to install the Google Cloud Build app for GitHub
  • And finally connect the GitHub repository to Cloud Build (if you don’t see your repository in the list, make sure to refresh the Google Cloud Console page):
Connecting the GitHub repository to Cloud Build
  • Select the GitHub repository and click Create push trigger:
Creating the GitHub repository push trigger
  • Notice you aren’t able to configure the trigger parameters at creation time, but you can edit them after the trigger is created:
Editing the Cloud Build trigger
  • Configure the trigger as shown in the image below:
Cloud Build trigger configuration

Here, we specify:

  • The Name and Description of the trigger;
  • That the build should be triggered whenever changes are pushed to the master branch of the repository;
  • That the build configuration is provided by the cloudbuild.yaml file from our repository.
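
As a side note, once the repository is connected, an equivalent trigger can also be created from the command line. A sketch, with the repository owner, repository name, and trigger name as placeholders:

  # Create a push trigger on master that uses cloudbuild.yaml from the repo.
  gcloud beta builds triggers create github \
    --name="deploy-cloud-functions" \
    --repo-owner="your-github-user" \
    --repo-name="your-repository" \
    --branch-pattern="^master$" \
    --build-config="cloudbuild.yaml"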

Triggering builds

To test the configuration done so far, you have two options:

  1. Commit and push any changes to the master branch of your repository;
  2. Run the trigger manually by clicking the Run trigger button:
Option to run the Cloud Build trigger manually

To see your build in action, select Dashboard in the left side menu:

Cloud Build dashboard

For each configured build, the Dashboard shows:

  • The date and time of the latest build;
  • The build duration;
  • A description of the trigger;
  • A link to the source repository;
  • The hash of the commit for which the build was triggered;
  • A small chart with the Success/Failure build history;
  • The average duration of the builds;
  • The percentage of success and failures.

To view details about a build, click the link shown under Latest Build. You should see something like this:

Cloud Build — Build details

Notice you are able to see the output for each of the build steps defined in our cloudbuild.yaml file.

Testing the deployed functions

Testing the HTTP-triggered function

To test the HTTP-triggered function, we can make use of the test-deployed-http.sh script:

test-deployed-http.sh helper script
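
A minimal sketch of what this script does, assuming the function was deployed under the name sample_http:

  #!/usr/bin/env bash
  # test-deployed-http.sh: call the deployed HTTP-triggered function.
  set -euo pipefail

  # Ask gcloud for the URL of the deployed function, then hit it with curl.
  URL=$(gcloud functions describe sample_http --format='value(httpsTrigger.url)')
  curl "${URL}?subject=Foobar"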

This script uses the gcloud command-line tool to fetch the URL of the deployed HTTP-triggered function and then performs a GET request to it, passing ?subject=Foobar as a query string parameter. Once we run it, we should see the function respond with its hello message.

Testing the Pub/Sub-triggered function

To test the Pub/Sub-triggered function, we can make use of the test-deployed-pubsub.sh script:

test-deployed-pubsub.sh helper script
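
Again, a minimal sketch; the topic name and argument handling are assumptions:

  #!/usr/bin/env bash
  # test-deployed-pubsub.sh: publish a message (and optional attributes)
  # to the topic the sample_pubsub function is subscribed to.
  set -euo pipefail

  MESSAGE=${1:?usage: $0 MESSAGE [KEY=VALUE,...]}
  ATTRIBUTES=${2:-}

  if [ -n "$ATTRIBUTES" ]; then
    gcloud pubsub topics publish sample_topic \
      --message="$MESSAGE" --attribute="$ATTRIBUTES"
  else
    gcloud pubsub topics publish sample_topic --message="$MESSAGE"
  fi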

This script receives two parameters (a message and a comma-separated list of key=value attributes) and publishes them to the Pub/Sub topic our function is configured to be triggered by. Once we run it, gcloud should print the ID of the published message.

To make sure the message was successfully received and processed, we can check the logs in the Google Cloud console. To do so, from the top-left menu, select Cloud Functions, then click sample_pubsub, and click the VIEW LOGS button:

The log output from the Pub/Sub-triggered function

As we can see in the image above, the sample_pubsub function was successfully triggered by the Pub/Sub message published by our script to the topic the function was configured to listen to when it was deployed.

Clean-up

To undo the changes done while following this tutorial, make sure to:

  • Delete the deployed Cloud Functions;
  • Delete the generated Pub/Sub topic;
  • Delete the configured Cloud Build triggers.
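
A sketch of the equivalent cleanup from the command line; the function, topic, and trigger names match the sketches used earlier in this article:

  # Delete the deployed functions.
  gcloud functions delete sample_http --quiet
  gcloud functions delete sample_pubsub --quiet

  # Delete the Pub/Sub topic generated by the deployment.
  gcloud pubsub topics delete sample_topic

  # List the triggers, then delete by the name (or ID) shown.
  gcloud beta builds triggers list
  gcloud beta builds triggers delete deploy-cloud-functions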

Final Thoughts

In this tutorial, we have gone through the process of setting up a deployment pipeline powered by GitHub, Cloud Build, and Cloud Functions.

The pipeline was configured to be triggered every time new code is pushed into the master branch of the connected repository. Once that happens, the pipeline deploys the HTTP- and the Pub/Sub-triggered functions implemented for this tutorial.

Even though a GitHub repository was used here, the process for configuring a Bitbucket repository is very similar. And though GitLab is not available as an option to be connected, it’s also possible to connect it by making use of webhooks, as described by my friend in this repository.

As we could see, it’s possible to automate most of the tasks involved in deploying Cloud Functions instances and to use Cloud Build to configure a CI/CD pipeline. We also created helper scripts to keep our cloudbuild.yaml file straightforward, as well as scripts to test our functions both locally and against their deployed versions.

I hope you had a good time reading this article and learned some new stuff along the way.

Happy coding!

Thanks to everyone who reviewed the article.
