Testing Cloud Functions locally using the functions-framework package

Khureltulga Dashdavaa
Oct 16, 2022


Here I have summarized how to test Cloud Functions on our laptops without using Docker.


Cloud Functions

Google Cloud Platform (GCP) provides several different computing services, and one of the easiest to use is Cloud Functions. It handles all of the infrastructure, so all we need to do is write our code as a stand-alone program and deploy it to Cloud Functions. Invoking, stopping, and scaling are all handled by GCP. It is also a cheap service, and a fair amount of Cloud Functions usage is free (https://cloud.google.com/functions/pricing). If you are familiar with AWS's Lambda, Cloud Functions is GCP's equivalent service.

Like Lambda, Cloud Functions is suitable for bursty, short-term, light tasks. For example, if there is a task that processes small pieces of data, constantly fed in one after another, and puts them into storage like GCS or BigQuery, Cloud Functions would be perfect.

Developing Cloud Functions

Of course, we can use GCP's console to develop and deploy Cloud Functions, which makes things really easy for small pieces of code. However, a development workflow in the browser is not ideal: the web IDE suffers from development/deployment slowness (lag) and lacks the full feature set of a local IDE.

So, here I have summarized a develop/run/test/deploy workflow on our laptop.

Develop

I have created a Python virtual environment and a main.py script with the following code:

A simple Cloud Functions script that listens for GCS file upload/update events.
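The original gist is not reproduced here. Reconstructed from the console output shown later in the Testing section, main.py presumably looks roughly like this minimal sketch; the exact body is my assumption, not the original code.

import json

import functions_framework

# Entry point. Triggered by a Cloud Storage event (file upload/update)
# delivered to the framework as a CloudEvent.
@functions_framework.cloud_event
def function(cloud_event):
    cloud_event_data = cloud_event.data
    # Echo the raw event payload, then print a couple of debug lines.
    print(json.dumps(cloud_event_data, indent=4))
    print("Hello, cloud function local testing!")
    print(f"{cloud_event_data.get('name')=} is file uploaded to the Cloud Storage.")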

Run

To run Cloud Functions locally, GCP provides two options: the Functions Framework and buildpacks (see the documentation). The Functions Framework lets us run Cloud Functions directly on our laptop by emulating the HTTP/cloud events that get sent to our Cloud Functions. Buildpacks, on the other hand, provide the full Cloud Functions runtime through a Docker environment. So if portability is necessary, we should use buildpacks; otherwise, GCP recommends the Functions Framework.

Here, I have used the Functions Framework because it is easy to use. All I need to do is install the framework with the following command in the Python virtual environment I created:

pip install functions-framework

Then run the command as follows. --port is the listening port, --target is the name of the function inside the main.py script, and --signature-type tells the framework the trigger type. GCP Cloud Functions mainly has two trigger types:
HTTP type: the Cloud Function is invoked through a URL call
Event-driven type: the Cloud Function is invoked by other GCP services (for example, uploading/editing a file on GCS triggers the Cloud Function)

functions-framework --port=8080 --target=function --signature-type=cloudevent

If the command does not show any errors, the framework started correctly.

Testing

After we run the Cloud Function, we need to send an HTTP request to invoke it. I have used Postman for this stage, building the request from GCP's document on calling locally-running functions.

Postman request header settings
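The screenshot of the header settings boils down to passing the CloudEvent attributes as ce-* request headers. For reference, an equivalent curl call would look roughly like the sketch below; the header values are illustrative (following the pattern in GCP's local-testing document), and body.json is a hypothetical file holding the JSON body shown next.

# Send a fake Cloud Storage "object finalized" CloudEvent to the local function.
curl -X POST localhost:8080 \
  -H "Content-Type: application/json" \
  -H "ce-id: 123451234512345" \
  -H "ce-specversion: 1.0" \
  -H "ce-time: 2020-01-02T12:34:56.789Z" \
  -H "ce-type: google.cloud.storage.object.v1.finalized" \
  -H "ce-source: //storage.googleapis.com/projects/_/buckets/MY_BUCKET" \
  -H "ce-subject: objects/MY_FILE.txt" \
  -d @body.json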

And here is the body:

{
    "bucket": "MY_BUCKET",
    "contentType": "text/plain",
    "kind": "storage#object",
    "md5Hash": "...",
    "metageneration": "1",
    "name": "MY_FILE.txt",
    "size": "352",
    "storageClass": "MULTI_REGIONAL",
    "timeCreated": "2020-04-23T07:38:57.230Z",
    "timeStorageClassUpdated": "2020-04-23T07:38:57.230Z",
    "updated": "2020-04-23T07:38:57.230Z"
}

If the Postman request comes back with a successful response, the function ran correctly.

In the console where I ran the Cloud Function, the following output was printed:

(venv) blog-cloud-functions % functions-framework --port=8080 --target=function --signature-type=cloudevent
{
    "bucket": "MY_BUCKET",
    "contentType": "text/plain",
    "kind": "storage#object",
    "md5Hash": "...",
    "metageneration": "1",
    "name": "MY_FILE.txt",
    "size": "352",
    "storageClass": "MULTI_REGIONAL",
    "timeCreated": "2020-04-23T07:38:57.230Z",
    "timeStorageClassUpdated": "2020-04-23T07:38:57.230Z",
    "updated": "2020-04-23T07:38:57.230Z"
}
Hello, cloud function local testing!
cloud_event_data.get('name')='MY_FILE.txt' is file uploaded to the Cloud Storage.

Deploy

After installing the gcloud CLI and authorizing it by logging into GCP, run the following command to deploy the Cloud Function to GCP.

(venv) blog-cloud-functions % gcloud functions deploy cloud-storage-trigger-function-test-01 \
  --gen2 \
  --region=asia-northeast1 \
  --runtime=python39 \
  --source=./ \
  --entry-point=function \
  --trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
  --trigger-event-filters="bucket=cloud-functions-trigger-test-bucket-01"

There are 1st- and 2nd-generation Cloud Functions to choose from when deploying, and GCP recommends the 2nd generation (gen2).

Because I am deploying a Cloud Function that is invoked through Cloud Storage events, I am using the --trigger-event-filters flags shown above.
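For reference, a 1st-generation deployment would express the same trigger with the older --trigger-event/--trigger-resource flags. This sketch is only for comparison; this post deploys with gen2 as shown above.

# 1st-gen equivalent (comparison only): the trigger is specified with
# --trigger-event/--trigger-resource instead of --trigger-event-filters.
gcloud functions deploy cloud-storage-trigger-function-test-01 \
  --region=asia-northeast1 \
  --runtime=python39 \
  --source=./ \
  --entry-point=function \
  --trigger-event=google.storage.object.finalize \
  --trigger-resource=cloud-functions-trigger-test-bucket-01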

Issues I faced while creating the workflow

The develop, run, and test steps went fine without any big trouble. However, during deployment I got an error like this:

ERROR: (gcloud.functions.deploy) OperationError: code=7, message=Creating trigger failed for projects/PROJECT_ID/locations/asia-northeast1/triggers/cloud-storage-trigger-function-test-01-971683: The Cloud Storage service account for your bucket is unable to publish to Cloud Pub/Sub topics in the specified project.
To use GCS CloudEvent triggers, the GCS service account requires the Pub/Sub Publisher (roles/pubsub.publisher) IAM role in the specified project. (See https://cloud.google.com/eventarc/docs/run/quickstart-storage#before-you-begin)

It looks like the gcloud functions deploy command does not add the necessary permission to GCS's service account. If the Cloud Function were invoked only through HTTP, the deployment would probably have gone fine. So I had to add the necessary permission to the service account with the following commands:

SERVICE_ACCOUNT="$(gsutil kms serviceaccount -p PROJECT_ID)"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:${SERVICE_ACCOUNT}" \
  --role='roles/pubsub.publisher'
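To double-check that the binding took effect, one option (a suggestion of mine, not part of the original workflow) is to filter the project's IAM policy for the Pub/Sub Publisher role:

# List the members that hold roles/pubsub.publisher on the project.
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.role:roles/pubsub.publisher" \
  --format="value(bindings.members)"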

Summary

I have introduced a local development workflow for Cloud Functions. There was no complete guide for this in GCP's official documentation; to develop, run, test, and deploy fully from my laptop, I had to piece together several GCP documents, which took me several hours. That is why I have summarized the full development workflow here.

Of course, if the code is very small and you need to deploy it fast, GCP's web console in the browser would be the better choice. However, when the code becomes a little more complicated and you need to work on the script for a considerable amount of time, developing and testing as you go, I think the local workflow beats the web IDE.
