Automating GCP Cloud Function testing using Jenkins

Alekhya Chowdhury
May 22, 2022


Introduction

When following a test-driven development process, we need to track progress through repeated testing. Without a robust, efficient and cost-effective test automation framework, this comes down to manually triggering payloads and verifying the results. Manual testing not only takes more time but, as we all know, is prone to human error.

In this post I document an approach that uses Jenkins as an orchestration tool to test GCP components that process events, such as Cloud Functions, Cloud Run and Dataflow.

For our use case I am going to consider a simple cloud function responsible for consuming a JSON message from an input Pub/Sub topic, transforming it into another JSON structure and publishing it to an output Pub/Sub topic. We will use Jenkins to create a framework that tests the cloud function as soon as any new changes are deployed to the cloud function code.

In the above diagram, Jenkins is the central component, orchestrating the various actions required to execute a test case. Jenkins interacts with the Pub/Sub topics and GCS buckets via Python scripts.
We are going to trigger the Jenkins pipeline via a call to a Jenkins webhook from a Cloud Build script. The same Cloud Build script will also deploy the cloud function. The flow of events is as follows:
Code changes committed in Git >> Cloud Build is triggered >> Jenkins pipeline is executed >> cloud function is invoked >> test case is evaluated

Jenkinsfile

Once the cloud function has been deployed and the Jenkins job has been triggered, the following actions are performed in sequence. These actions need to be configured in a Jenkinsfile, whose contents are explained below.

1. In the first stage of the Jenkinsfile we install the Python libraries required for interacting with Pub/Sub topics and GCS buckets.
Jenkinsfile stage 1 — Install dependencies
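A minimal sketch of what this stage might look like; the package names assume the standard Google Cloud Python clients and an agent that already has Python and pip installed:

```groovy
stage('Install dependencies') {
    steps {
        // Python clients used by the helper scripts in the later stages
        sh 'pip3 install --user google-cloud-pubsub google-cloud-storage'
    }
}
```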

2. A Git repository containing the Python scripts required for interacting with the various GCP components will be cloned into the Jenkins work directory (each Jenkins pipeline has its own work directory).
We also need to clean up the artefacts belonging to any previous execution, so we first remove the repository cloned during earlier runs.

Jenkinsfile stage 2 — remove existing parent git folder from previous run and clone from git
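A sketch of this stage, assuming a hypothetical repository URL:

```groovy
stage('Clean up and clone') {
    steps {
        // Remove the clone left over from the previous run, then clone afresh
        sh '''
            rm -rf gcp-test-scripts
            git clone https://github.com/<your-org>/gcp-test-scripts.git
        '''
    }
}
```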

3. Jenkins will run a Python script (Get_Payload.py) to fetch the input and expected output payloads from the payload repository (GCS) and store these files in the Jenkins work directory.

Jenkinsfile stages 3 and 4 — fetch the input and expected output payloads from the payload repo. We need to pass the GCP project ID (gcp_project_name) hosting the GCS bucket that acts as the payload repository, the folder location (gcs_folder) inside the bucket, and the payload name along with its file extension (input/expected_output.json)
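The original post shows Get_Payload.py only as a screenshot; a minimal sketch of how such a script might look, assuming the bucket name is passed as an extra argument alongside the parameters named in the caption:

```python
# Get_Payload.py -- minimal sketch; the bucket-name argument is an assumption
import sys
from google.cloud import storage

def fetch_payload(project_id, bucket_name, gcs_folder, file_name):
    client = storage.Client(project=project_id)
    blob = client.bucket(bucket_name).blob(f"{gcs_folder}/{file_name}")
    blob.download_to_filename(file_name)  # lands in the Jenkins work directory

if __name__ == "__main__":
    # e.g. python3 Get_Payload.py <gcp_project_name> <bucket> <gcs_folder> input.json
    fetch_payload(sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4])
```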

4. PubSub_Publisher.py will be called to read the input payload stored locally in the previous step and push it to the input Pub/Sub topic.
We also store and print the message ID that Pub/Sub returns as an acknowledgement after the input message has been published.

Jenkinsfile stage 5 — publish the input payload to the input Pub/Sub topic (input_pubsub_topic)
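A minimal sketch of what PubSub_Publisher.py might look like; in the google-cloud-pubsub client, publish() returns a future whose result is the message ID mentioned above:

```python
# PubSub_Publisher.py -- minimal sketch; argument order is illustrative
import sys
from google.cloud import pubsub_v1

def publish_payload(project_id, topic_id, payload_file):
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    with open(payload_file, "rb") as f:
        data = f.read()  # the input payload fetched in the previous stage
    future = publisher.publish(topic_path, data)
    print(f"Published message id: {future.result()}")  # blocks until accepted

if __name__ == "__main__":
    publish_payload(sys.argv[1], sys.argv[2], sys.argv[3])
```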

5. PubSub_Listener.py will be triggered. This script listens for the output message on the output topic via a synchronous pull subscription, which has been configured to consume only one message and then stop. The received message is also persisted in the Jenkins work directory.
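A sketch of PubSub_Listener.py under the same assumptions, using the synchronous pull API of google-cloud-pubsub with max_messages set to 1:

```python
# PubSub_Listener.py -- minimal sketch; assumes the cloud function publishes
# its output within the pull timeout
import sys
from google.cloud import pubsub_v1

def pull_one_message(project_id, subscription_id, output_file):
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(project_id, subscription_id)
    response = subscriber.pull(
        request={"subscription": sub_path, "max_messages": 1}, timeout=60
    )
    msg = response.received_messages[0]
    with open(output_file, "wb") as f:
        f.write(msg.message.data)  # persist the actual output for comparison
    subscriber.acknowledge(
        request={"subscription": sub_path, "ack_ids": [msg.ack_id]}
    )

if __name__ == "__main__":
    pull_one_message(sys.argv[1], sys.argv[2], sys.argv[3])
```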

6. Verify_Payload.py will be triggered to compare the expected output with the received (actual) output. If the payloads match we have a successful validation and the test case passes; otherwise the test case fails.
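Verify_Payload.py can be a simple structural JSON comparison; in a sketch like the one below, the non-zero exit code is what marks the Jenkins stage, and therefore the test case, as failed:

```python
# Verify_Payload.py -- minimal sketch
import json
import sys

def payloads_match(expected_file, actual_file):
    with open(expected_file) as e, open(actual_file) as a:
        return json.load(e) == json.load(a)  # key order is ignored

if __name__ == "__main__":
    if payloads_match(sys.argv[1], sys.argv[2]):
        print("Validation successful: payloads match")
    else:
        print("Validation failed: payloads differ")
        sys.exit(1)  # non-zero exit fails the Jenkins build
```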

Configuring Jenkins

After creating the Jenkinsfile we need to configure the Jenkins pipeline.

We can start by creating a “Pipeline” project in Jenkins.

The build trigger needs to be selected as below and a relevant authentication token needs to be given. This will generate an endpoint (webhook) which we will later use to trigger the Jenkins pipeline remotely.

The pipeline script and the Python code will be placed in a source control management (SCM) repository such as Git, from where they will be fetched each time we invoke the pipeline.
The details required for configuring connectivity with the SCM (Git) are the repository URL, the branch name and the Jenkinsfile path.

Triggering Jenkins from Cloud Build

After creating the Jenkins pipeline, our next objective is to integrate the pipeline with Cloud Build. This will enable us to trigger Jenkins pipeline jobs after the cloud function has been deployed, once we push the changes to Git.

The step “deploy cf” is responsible for the cloud function deployment and the second step “trigger test” is responsible for triggering the Jenkins pipeline.

We are going to invoke the Jenkins pipeline by calling the endpoint generated during the pipeline configuration.

Here I have used a curl command along with the username:password parameters to call the Jenkins pipeline endpoint.
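A sketch of what such a cloudbuild file might look like; the function name, runtime, region, topic and Jenkins URL are placeholders, and in practice the Jenkins credentials should come from Secret Manager rather than being inlined:

```yaml
steps:
  # Step "deploy cf": deploy the cloud function (all names are illustrative)
  - id: 'deploy cf'
    name: 'gcr.io/cloud-builders/gcloud'
    args: ['functions', 'deploy', 'payload-transformer',
           '--runtime', 'python39', '--region', 'europe-west1',
           '--trigger-topic', 'input_pubsub_topic']
  # Step "trigger test": call the Jenkins webhook generated earlier
  - id: 'trigger test'
    name: 'gcr.io/cloud-builders/curl'
    args: ['-X', 'POST',
           'https://<jenkins-host>/job/<pipeline-name>/build?token=<auth-token>',
           '--user', '<username>:<api-token>']
```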

We also need to create a corresponding Cloud Build trigger in our GCP project. The trigger should point to the cloudbuild file location in Git. We should also give the Cloud Build service account the permissions necessary to deploy the cloud function (the Service Account User role needs to be granted).

Demo
Upload the input and expected output payloads to the payload repository.

input.json
expected_output.json

The mapping present in the cloud function is shown below. We are going to take the two fields present in the input and map them to two corresponding fields in the output.

Transformation/mapping present in the cloud function
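The actual payloads and mapping appear as screenshots in the original post; the following is a purely illustrative sketch of such a cloud function, in which the input field names and environment variables are assumptions (only ItemQty is taken from the demo):

```python
# main.py -- illustrative sketch; field and environment variable names are
# assumptions, configured at deploy time via --set-env-vars
import base64
import json
import os

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()

def transform(event, context):
    # Pub/Sub delivers the payload base64-encoded in event["data"]
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # Map the two input fields to the two corresponding output fields
    output = {"ItemName": payload["name"], "ItemQty": payload["qty"]}
    topic_path = publisher.topic_path(
        os.environ["OUTPUT_PROJECT"], os.environ["OUTPUT_TOPIC"]
    )
    publisher.publish(topic_path, json.dumps(output).encode("utf-8"))
```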

Create the input and output Pub/Sub topics and trigger the Cloud Build by pushing some changes to Git.

After the cloud function is successfully deployed, the Jenkins pipeline is triggered.

Each stage defined in the Jenkinsfile shows up in the Jenkins UI, and the logs of each stage can be checked individually.

We have a successful validation since the expected output matches the output of the cloud function.

If we change the mapping (ItemQty to Qty), the cloud function’s output no longer matches the expected output and we get a validation/test case failure.

We can use the same approach to test more complex patterns involving multiple GCP resources, since almost all GCP resources support interaction via Python clients. We can also configure Jenkins to send a success or failure notification once a test case has been executed.
