GCP – Cloud Functions – Develop it the right way

Murli Krishnan
Google Cloud - Community
6 min read · Nov 6, 2022

The general tendency when developing Cloud Functions is hit-and-trial: deploying the function multiple times until the deployment works. This is not a good approach from a developer-productivity perspective.

Cloud Functions is the "Function as a Service" offering of GCP, which allows event-driven or HTTP functions to be deployed for use by other services or users.

This blog focuses on development techniques that help get Cloud Functions deployments fast and right the first time.

The image below shows the typical downfall: developing locally but testing by repeatedly deploying to Cloud Functions.

Development Approach Switch

This is not the right approach for development: deploying a Cloud Function takes time (1–2 minutes), and every small error (syntax, request parsing, logic issues, etc.) forces another deployment. This has a compounding effect, and the developer spends a lot of time getting the Cloud Function deployed correctly for the first time.

How do we simplify the development methodology for Cloud Functions?

Enter the Functions Framework!

Introduction

The Functions Framework helps in setting up an environment for testing Cloud Functions code locally.
Here, "environment" means a quick local web server on which the function is deployed, and which can then be tested with simple curl requests or any local test program.

Getting Started (HTTP Function)

For the purposes of this demo I will be using the Python runtime, but the same principles apply to the other runtimes.

Let’s start with a simple HTTP function

The function multiply below takes in 2 numbers and returns their product.

This is a very simplistic version for the demo; in real code, handle the response codes and errors properly.
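A minimal sketch of such a multiply function is shown below. The field names num_1 and num_2 are assumptions chosen to match the curl request used later; the error handling is intentionally basic.

```python
# main.py -- a minimal sketch of the multiply HTTP function.

def multiply(request):
    """HTTP Cloud Function: returns the product of num_1 and num_2."""
    data = request.get_json(silent=True)  # parse JSON body; None if missing/invalid
    if not data or "num_1" not in data or "num_2" not in data:
        return {"error": "num_1 and num_2 are required"}, 400
    return {"result": data["num_1"] * data["num_2"]}, 200
```

Returning a (body, status) tuple lets Flask set the HTTP response code for you.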

But why the request argument, and why get_json()?

Cloud Functions passes the input as a Flask Request object, and get_json() is a method on that object which extracts the JSON content of the request body.

To test this function, let's install the functions-framework package from PyPI:

pip install functions-framework

Running the functions framework is very simple and does not require any code changes.

functions-framework --port 8080 --target multiply --signature-type http --source main.py --debug

--port indicates the port on the local machine where the web server listens.
--target indicates the entrypoint function.
--signature-type can be http, event, or cloudevent.
--source indicates the code file containing the entry point for the function.

The output of the above command looks like this:

Output of Functions Framework command

Run the tests on the local server using curl or any other means.

curl -X POST \
-H "Content-type:application/json" \
-d '{"num_1":20, "num_2": 30}' \
-w '\n' \
http://localhost:8080

The output is as below

Simple Curl Request to test the cloud function

Voila! The testing is done. Now we can deploy to Cloud Functions and invoke the function without any syntax or logic issues.

gcloud functions deploy multiply_function \
--source cloud_functions/ \
--entry-point multiply \
--runtime python39 \
--region us-central1 \
--allow-unauthenticated \
--trigger-http
Deployed Cloud Function

Once deployed, the Cloud Function can be invoked in a similar way.

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-type:application/json" \
-d '{"num_1":20, "num_2": 30}' \
-w '\n' \
https://redacted.cloudfunctions.net/multiply_function

Event Driven: Cloud Pub/Sub

For Cloud Pub/Sub, we will use the Pub/Sub emulator to run Pub/Sub locally and publish messages to a local endpoint.

The summary of the setup looks something like below

Overall Flow

The boxes shown in grey above can be combined into a bash script for simplicity.

The command below runs the Pub/Sub emulator locally on port 8081:

gcloud beta emulators pubsub start --project=<redacted> --host-port='localhost:8081'
Cloud Pub-sub Emulator output

Set the environment variables needed to redirect messages to the emulator:

export PUBSUB_EMULATOR_HOST=localhost:8081
export PUBSUB_EMULATOR_PROJECT=<redacted>

Now let's set up a Pub/Sub schema, topic, and subscription.

The reference for the commands below is the Pub/Sub REST API: https://cloud.google.com/pubsub/docs/reference/rest
The commands point to localhost:8081, where the emulator is running, instead of pubsub.googleapis.com.

Client libraries for Pub/Sub can also be used instead of the REST API.

Creation of schema
We expect the published messages to have a single field, repo_name, of type string.

curl -X POST \
-H "Content-type: application/json" \
-d @pubsub_schema.json \
-w '\n' \
"http://localhost:8081/v1/projects/<redacted>/schemas?schemaId=test-json-schema"
@pubsub_schema.json
{
"name": "projects/<redacted>/schemas/test-json-schema",
"type": "AVRO",
"definition": "{\"type\": \"record\",\"name\": \"sample\",\"fields\": [{\"name\": \"repo_name\",\"type\": \"string\"}]}"
}
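The escaped definition string inside pubsub_schema.json is fiddly to write by hand. A small stdlib-only Python snippet can generate it; the project ID below is a placeholder.

```python
import json

# The AVRO schema as a plain Python dict; json.dumps produces the
# escaped "definition" string that the Pub/Sub REST API expects.
avro_schema = {
    "type": "record",
    "name": "sample",
    "fields": [{"name": "repo_name", "type": "string"}],
}

schema_body = {
    "name": "projects/my-project/schemas/test-json-schema",  # placeholder project
    "type": "AVRO",
    "definition": json.dumps(avro_schema),
}

# Write the request body used by the curl command above.
with open("pubsub_schema.json", "w") as f:
    json.dump(schema_body, f, indent=2)
```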

Creation of Pubsub Topic

curl -X PUT \
-H "Content-type: application/json" \
-d @pubsub_request.json \
http://localhost:8081/v1/projects/<redacted>/topics/test-topic
@pubsub_request.json
{
"schemaSettings": {
"schema": "projects/<redacted>/schemas/test-json-schema",
"encoding": "JSON"
}
}

Creation of Pubsub Subscription

The subscription's push endpoint is set to "http://localhost:8080", where the Functions Framework web server will be running, ready to accept messages.

curl -X PUT \
-H "Content-type: application/json" \
-d @pubsub_sub_request.json \
http://localhost:8081/v1/projects/<redacted>/subscriptions/test-topic-sub
@pubsub_sub_request.json
{
"topic": "projects/<redacted>/topics/test-topic",
"pushConfig": {
"pushEndpoint": "http://localhost:8080"
}
}

Let's get the Functions Framework started:

functions-framework --port 8080 --target pubsub_event --signature-type event --source main.py --debug
definition of pubsub_event function
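A minimal sketch of what the pubsub_event function could look like follows (Gen 1 event signature; the payload decoding shown is an assumption for the demo message).

```python
import base64
import json

def pubsub_event(event, context):
    """Background function: decodes and prints the Pub/Sub payload.

    event is a dict whose "data" field holds the base64-encoded message;
    context carries the event metadata (may be None in local tests).
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print("Received repo_name:", payload.get("repo_name"))
    return payload  # returned for testability; Cloud Functions ignores it
```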

Now, let's try posting a message to the emulator.
The message data needs to be base64-encoded, as below:

>>> import base64
>>> base64.b64encode('{"repo_name":"repository-1"}'.encode("utf-8"))
b'eyJyZXBvX25hbWUiOiJyZXBvc2l0b3J5LTEifQ=='
curl -X POST \
-H "Content-type: application/json" \
-d @pubsub_message.json \
-w '\n' \
http://localhost:8081/v1/projects/<redacted>/topics/test-topic:publish
@pubsub_message.json
{
"messages":[
{
"data": "eyJyZXBvX25hbWUiOiJyZXBvc2l0b3J5LTEifQ=="
}
]
}
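Instead of encoding the data by hand, the publish body can be built programmatically. A stdlib-only helper sketch:

```python
import base64
import json

def build_publish_payload(*records):
    """Build the JSON body for the Pub/Sub publish REST call,
    base64-encoding each record as the message data."""
    messages = [
        {
            "data": base64.b64encode(
                json.dumps(record, separators=(",", ":")).encode("utf-8")
            ).decode("ascii")
        }
        for record in records
    ]
    return json.dumps({"messages": messages})

print(build_publish_payload({"repo_name": "repository-1"}))
```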

You will see the output at the server, as below.

The event data and context printed from the function.

Now you can write your function logic to process the messages and, once testing is done, deploy the Cloud Function.

Event Driven: Gen 1 Cloud Storage Event

Let's take an example of setting up a Cloud Function that fires on a Cloud Storage event.

Cloud Functions Generation 1 event triggers require 2 arguments:
1. event: a dictionary representation of the event data
2. context: a google.cloud.functions.Context object holding the event metadata (event_id, event_type, timestamp, and resource)

The contents of the event dictionary depend on the type of event for which the Cloud Function is registered.

For the Cloud Storage event trigger, refer to https://cloud.google.com/functions/docs/calling/storage
For the Cloud Storage event data schema, see https://github.com/googleapis/google-cloudevents/blob/main/proto/google/events/cloud/storage/v1/data.proto

All such supported events have their proto structures on GitHub, which helps when constructing test payloads.

The event and context data for a Cloud Storage event can be generated by deploying a dummy Cloud Function that simply prints the event and context.
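Such a dummy function could look like the sketch below; the name storage_event matches the curl request that follows, and the printed fields are those listed for the Gen 1 Context object.

```python
def storage_event(event, context):
    """Gen 1 background function: prints the raw event and context metadata
    so a real Cloud Storage payload can be captured for local testing."""
    print("Event:", event)
    if context is not None:
        print("Event ID:", context.event_id)
        print("Event type:", context.event_type)
        print("Timestamp:", context.timestamp)
        print("Resource:", context.resource)
    return event  # returned for testability; Cloud Functions ignores it
```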

Once the event data is ready, the Functions Framework can be used in the same way, by sending a curl request or using a test program.

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-type:application/json" \
-d @request.json \
-w '\n' \
https://redacted.cloudfunctions.net/storage_event

Hope this content was helpful.
Please connect with me on LinkedIn for any queries: https://www.linkedin.com/in/murli-krishnan-a1319842/
