How to Set Up a Deployment Pipeline on Google Cloud with Cloud Build and Cloud Functions
Automatically deploying Cloud Function instances when changes get pushed to your Git repositories.
In my last two articles, I showed How to Set Up a Deployment Pipeline on Google Cloud with Cloud Build, Container Registry and Cloud Run and also How to Develop, Debug and Test your Python Google Cloud Functions on Your Local Dev Environment.
In this article, we’ll build on top of the previous two, showing how to deploy Cloud Functions with Cloud Build and how to check the behavior of our Cloud Functions after they are deployed, to make sure the behavior we saw while developing with the Functions Framework for Python matches that of the deployed version.
Disclaimer: all opinions expressed are my own, and represent no one but myself.
Cloud Build is a service that executes your builds on Google Cloud Platform infrastructure. Cloud Build can import source code from Google Cloud Storage, Cloud Source Repositories, GitHub, or Bitbucket, execute a build to your specifications, and produce artifacts such as Docker containers or Java archives.
Cloud Build executes your build as a series of build steps, where each build step is run in a Docker container. A build step can do anything that can be done from a container irrespective of the environment. To perform your tasks, you can either use the supported build steps provided by Cloud Build or write your own build steps.
Google Cloud Functions is a serverless execution environment for building and connecting cloud services. With Cloud Functions you write simple, single-purpose functions that are attached to events emitted from your cloud infrastructure and services. Your function is triggered when an event being watched is fired. Your code executes in a fully managed environment. There is no need to provision any infrastructure or worry about managing any servers.
Set Up the GCP Project
Create the Project
To follow this tutorial, you’ll need to have access to a GCP Project. If you don’t have one already, follow these steps to create it:
- Access the GCP Console, enter a name for your new project and click the CREATE button;
- Once your project is created, make sure it’s selected on the top-left corner, right beside the Google Cloud Platform logo.
Enable the Required APIs
- From the top-left menu, select APIs & Services, then click the ENABLE APIS AND SERVICES button;
- Enable Cloud Build API, Cloud Functions API and Cloud Resource Manager API.
The Cloud Resource Manager API “Creates, reads, and updates metadata for Google Cloud Platform resource containers.” — Under the hood, the deploy command relies on it to check IAM permissions on the project (notice the testIamPermissions reference in the error message below). If we try to run our pipeline without this API enabled, it will fail with a message similar to this:
ERROR: (gcloud.functions.deploy) User [<project-number>@cloudbuild.gserviceaccount.com] does not have permission to access project [<project-id>:testIamPermissions] (or it may not exist): Cloud Resource Manager API has not been used in project <project-number> before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudresourcemanager.googleapis.com/overview?project=<project-number> then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
What’s weird here is that this API apparently only needs to be enabled for the first deployment to succeed. After that, deployments keep working even if the API is disabled. I’m not sure why this happens, nor why it isn’t clearly stated in the docs.
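If you prefer the command line, the same three APIs can be enabled in one shot with the Cloud SDK. This is a sketch, assuming gcloud is installed and authenticated; the project ID is a placeholder:

```shell
#!/usr/bin/env bash
# Enable the APIs required by the pipeline.
# PROJECT_ID is a placeholder: replace it with your own project ID.
set -euo pipefail

PROJECT_ID="my-gcp-project"

enable_apis() {
  gcloud services enable \
    cloudbuild.googleapis.com \
    cloudfunctions.googleapis.com \
    cloudresourcemanager.googleapis.com \
    --project "$PROJECT_ID"
}

# Only run if gcloud is actually available on this machine.
if command -v gcloud >/dev/null 2>&1; then
  enable_apis
fi
```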
Configure the Required IAM Permissions
In order for our Cloud Build pipeline to work properly, we need to update the default service account used by the service, identified by the address <project-number>@cloudbuild.gserviceaccount.com, with some new permissions. To do so:
- From the top-left menu, select IAM & Admin;
- Find the service account identified by <project-number>@cloudbuild.gserviceaccount.com;
- Edit the service account and add the Cloud Functions Admin and Service Account User roles.
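The same grants can be applied from the command line. This is a sketch; PROJECT_ID and PROJECT_NUMBER are placeholders for your own project’s values:

```shell
#!/usr/bin/env bash
# Grant the roles the Cloud Build service account needs to deploy Cloud Functions.
# PROJECT_ID and PROJECT_NUMBER are placeholders.
set -euo pipefail

PROJECT_ID="my-gcp-project"
PROJECT_NUMBER="123456789012"
MEMBER="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com"

grant_roles() {
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "$MEMBER" --role roles/cloudfunctions.admin
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "$MEMBER" --role roles/iam.serviceAccountUser
}

if command -v gcloud >/dev/null 2>&1; then
  grant_roles
fi
```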
Cloud Functions Admin is needed so Cloud Build has the permissions necessary to deploy the Cloud Function instances. It’s also necessary, along with the Service Account User role, so the Cloud Function service may be configured to allow access from unauthenticated users with the --allow-unauthenticated flag.
Without the Cloud Functions Admin role (I first tried with Cloud Functions Developer instead), the following error happens, though it is presented as a warning:
WARNING: Setting IAM policy failed, try "gcloud alpha functions add-iam-policy-binding sample_http --member=allUsers --role=roles/cloudfunctions.invoker"
While without the Service Account User role, an error similar to the following happens:
ERROR: (gcloud.functions.deploy) ResponseError: status=, code=[Forbidden], message=[Missing necessary permission iam.serviceAccounts.actAs for $MEMBER on the service account <project-id>@appspot.gserviceaccount.com.
Ensure that service account <project-id>@appspot.gserviceaccount.com is a member of the project <project-id>, and then grant $MEMBER the role ‘roles/iam.serviceAccountUser’.
You can do that by running 'gcloud iam service-accounts add-iam-policy-binding <project-id>@appspot.gserviceaccount.com --member=$MEMBER --role=roles/iam.serviceAccountUser'
In case the member is a service account please use the prefix ‘serviceAccount:’ instead of ‘user:’.]
The Sample Repository
The sample repository we’ll use for this tutorial is the same used on the previous one, where we talked about How to Develop, Debug and Test your Python Google Cloud Functions on Your Local Dev Environment. It provides two very basic functions:
- sample_http listens for HTTP events and returns a hello message;
- sample_pubsub receives Pub/Sub messages and logs their contents.
Configuring our Cloud Build Pipeline
The steps of our pipeline are defined in a YAML file called cloudbuild.yaml. As you can see, our pipeline is composed of two steps:
- The first step is responsible for deploying the HTTP-triggered function;
- The second step is responsible for deploying the Pub/Sub-triggered function.
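The file would look something like the following sketch. The builder image, step ids, and helper script paths are assumptions, since the actual file lives in the sample repository:

```yaml
steps:
  # Deploy the HTTP-triggered function
  - id: deploy-sample-http
    name: gcr.io/cloud-builders/gcloud
    entrypoint: bash
    args: ["./deploy-http.sh"]    # hypothetical helper script
    waitFor: ["-"]
  # Deploy the Pub/Sub-triggered function
  - id: deploy-sample-pubsub
    name: gcr.io/cloud-builders/gcloud
    entrypoint: bash
    args: ["./deploy-pubsub.sh"]  # hypothetical helper script
    waitFor: ["-"]
```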
Some things worth noticing in the file above:
- Each step in the pipeline makes use of a helper script that hides the complexity of the commands used to deploy each function:
- The steps are configured to run in parallel. This is achieved with the waitFor: ["-"] parameter. For more information about this behavior, check Configuring the order of build steps in the official docs.
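One such helper script might look like the following sketch. The function name, Python runtime, and region are assumptions; the real scripts live in the sample repository:

```shell
#!/usr/bin/env bash
# deploy-http.sh — hypothetical helper invoked by the first build step.
# Runtime and region are assumptions; adjust to your project.
set -euo pipefail

deploy_sample_http() {
  gcloud functions deploy sample_http \
    --runtime python39 \
    --trigger-http \
    --allow-unauthenticated \
    --region us-central1
}

if command -v gcloud >/dev/null 2>&1; then
  deploy_sample_http
fi
```

The Pub/Sub helper would be analogous, with --trigger-topic pointing at the topic instead of --trigger-http.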
Set Up the Cloud Build Trigger
With everything in place, it’s now time to set up our Cloud Build Trigger. To do so, follow these steps:
- From the top-left menu, select Cloud Build, select Triggers in the left menu and then click the Connect repository button:
- Select GitHub (Cloud Build GitHub App), and click Continue:
- Authorize Google Cloud Build access to GitHub:
- Install the Google Cloud Build GitHub App:
- Select the GitHub account to install the Google Cloud Build GitHub App:
- Then select the repositories you want Cloud Build to have access to:
- And finally connect the GitHub repository to Cloud Build (if you don’t see your repository in the list, make sure to refresh the Google Cloud Console page):
- Select the GitHub repository and click Create push trigger:
- Notice you aren’t able to configure the trigger parameters at the time of its creation, but it’s possible to do so after it’s created:
- Configure the trigger as shown in the image below:
Here, we specify:
- The Name and Description of the trigger;
- That the build should be triggered whenever code is pushed to the master branch of the repository;
- That the build configuration is provided by the cloudbuild.yaml file from our repository.
To test the configuration done so far, you have two options:
- Commit and push any changes to the master branch of your repository;
- Run the trigger manually by clicking the Run trigger button:
To see your build in action, select Dashboard in the left side menu:
For each configured build, the Dashboard shows:
- The date and time of the latest build;
- The build duration;
- A description of the trigger;
- A link to the source repository;
- The hash of the commit for which the build was triggered;
- A small chart with the Success/Failure build history;
- The average duration of the builds;
- The percentage of success and failures.
To view details about it, click the link shown under Latest Build. You should see something like this:
Notice you are able to see the output for each of the build steps defined in our cloudbuild.yaml file.
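The same build history can also be inspected from the command line, which is handy when debugging a failing pipeline:

```shell
#!/usr/bin/env bash
# Inspect recent Cloud Build runs from the CLI.
set -euo pipefail

list_recent_builds() {
  gcloud builds list --limit 5
}

show_build_log() {
  gcloud builds log "$1"  # pass a build ID taken from the list above
}

if command -v gcloud >/dev/null 2>&1; then
  list_recent_builds
fi
```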
Testing the deployed functions
Testing the HTTP-triggered function
To test the HTTP-triggered function, we can make use of the test-deployed-http.sh script:
This script uses the gcloud command-line tool to fetch the URL of the deployed HTTP-triggered function and then performs a GET request to it, passing ?subject=Foobar as a query string parameter. Once we run it, we should expect an output similar to the following:
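Under those assumptions, such a script could be sketched like this; the region is an assumption, and the actual script is in the sample repository:

```shell
#!/usr/bin/env bash
# test-deployed-http.sh — hypothetical sketch: fetch the deployed function's
# URL with gcloud and call it with a query string parameter.
set -euo pipefail

call_deployed_function() {
  local url
  url="$(gcloud functions describe sample_http \
    --region us-central1 --format 'value(httpsTrigger.url)')"
  curl -s "${url}?subject=Foobar"
}

if command -v gcloud >/dev/null 2>&1; then
  call_deployed_function
fi
```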
Testing the Pub/Sub-triggered function
To test the Pub/Sub-triggered function, we can make use of the test-deployed-pubsub.sh script:
This script receives two parameters (a message and a comma-separated list of key=value attributes) and publishes them to the Pub/Sub topic our function is configured to be triggered by. Once we run it, we should expect an output similar to this:
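A sketch of such a script follows; the topic name is an assumption, and the actual script is in the sample repository:

```shell
#!/usr/bin/env bash
# test-deployed-pubsub.sh — hypothetical sketch: publish a message with optional
# attributes to the topic the function listens to.
set -euo pipefail

publish() {
  local message="$1" attributes="${2:-}"  # attributes: comma-separated key=value pairs
  if [ -n "$attributes" ]; then
    gcloud pubsub topics publish sample-topic \
      --message "$message" --attribute "$attributes"
  else
    gcloud pubsub topics publish sample-topic --message "$message"
  fi
}

if command -v gcloud >/dev/null 2>&1; then
  publish "Hello from the pipeline" "origin=tutorial,user=tester"
fi
```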
To make sure the message was successfully received and processed, we can check the logs in the Google Cloud console. To do so, from the top-left menu, select Cloud Functions, then click sample_pubsub, and click the VIEW LOGS button:
As we can see in the image above, the sample_pubsub function was successfully triggered by the Pub/Sub message published by our script to the topic the function was configured to listen to when it was deployed.
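The logs can also be read from the command line; the region is an assumption:

```shell
#!/usr/bin/env bash
# Read the most recent log entries of the deployed sample_pubsub function.
set -euo pipefail

read_pubsub_logs() {
  gcloud functions logs read sample_pubsub --region us-central1 --limit 20
}

if command -v gcloud >/dev/null 2>&1; then
  read_pubsub_logs
fi
```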
To undo the changes done while following this tutorial, make sure to:
- Delete the deployed Cloud Functions;
- Delete the generated Pub/Sub topic;
- Delete the Cloud Build triggers you configured.
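The functions and the topic can be deleted from the command line as well. This is a sketch; the function, topic, and region names are assumptions, and the triggers are easiest to remove from the Cloud Build > Triggers page in the console:

```shell
#!/usr/bin/env bash
# cleanup.sh — hypothetical sketch: remove the resources created in this tutorial.
set -euo pipefail

cleanup() {
  gcloud functions delete sample_http --region us-central1 --quiet
  gcloud functions delete sample_pubsub --region us-central1 --quiet
  gcloud pubsub topics delete sample-topic
}

if command -v gcloud >/dev/null 2>&1; then
  cleanup
fi
```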
In this tutorial, we have gone through the process of setting up a deployment pipeline powered by GitHub, Cloud Build, and Cloud Functions.
The pipeline was configured to be triggered every time new code is pushed into the master branch of the connected repository. Once that happens, the pipeline deploys the HTTP- and the Pub/Sub-triggered functions implemented for this tutorial.
Even though a GitHub repository was used here, the process for configuring a Bitbucket repository is very similar. And though GitLab is not available as an option to be connected, it’s also possible to connect it by making use of webhooks, as described by my friend Ricardo Mendes in this repository.
As we could see, it’s possible to automate most of the tasks involved in deploying Cloud Functions instances, using Cloud Build to configure a CI/CD pipeline. We also created helper scripts to keep our cloudbuild.yaml file straightforward, as well as scripts to test our functions both locally and against their deployed versions.
I hope you had a good time reading this article and learned some new stuff along the way.
Thanks to Ricardo Mendes for reviewing the article.