Cloud Functions Best Practices (1/4) : Get the environment ready

Beranger Natanelic
Google Cloud - Community
8 min read · Nov 15, 2022

Structure, test, retest and deploy Google Cloud Functions efficiently

This article is part of a four-article series in which I give various pieces of advice about Google Cloud Functions development. This work is the result of two years of daily practice, deployment and monitoring. Some of these best practices come directly from the official documentation, others come from my experience, from what has proven to be the most effective. For any different point of view, feel free to comment on this (free) article. Thanks!

Cloud Functions Best Practices (2/4) : Optimize the Cloud Functions >>>

Cloud Functions Best Practices (3/4) : Secure the Cloud Functions >>>

Cloud Functions Best Practices (4/4): Monitor and log the executions >>>

Get the Google Cloud Functions environment ready

Online tutorials are cool, mine are even super cool! ;) With a bunch of articles, anyone can learn how to use Secret Manager in Google Cloud Functions, how to protect a Google Cloud Function, how to rate limit them, how to use Cloud Storage, Cloud Pub/Sub and every possible tool.

But,

There is a major drawback ⇒ Tutorials are made for a single, short, specific topic.

And business projects are everything but short and specific.

Business projects combine multiple Cloud Functions, using various tools that need to be tried, tested, and debugged, hopefully efficiently.

How to do that is not explained in single-feature tutorials.

That’s what I cover in this first article ⇒ Getting the Google Cloud Functions environment ready for multi-function management and efficient testing & debugging.

1 function <> 1 folder & 1 function <> 1 file

After months of using Google Cloud Functions, it’s easy to fall in love and deploy a dozen.

These GCFs will need to be deployed, pushed to Git, documented and used by other users.

Having a clear and efficient structure is essential.

Bad structure

A beginner mistake I have seen a few times is to put all the Google Cloud Functions in the same folder, create a big README and push all that to Git.

The structure would look something like this (Python; a reconstruction, file names are illustrative):
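.
├── README.md          ← one documentation for everything
├── requirements.txt   ← one package list shared by everything
└── main.py            ← function_1, function_2 and function_3 all in here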

This works! Nothing bad here; it’s fine for 1, 2 or 3 Cloud Functions coded in the same language.

It stays acceptable for 3 functions, but it is problematic for three reasons:

  • requirements.txt includes the packages used by all the functions. If function_2 needs a specific package, it gets installed for function_1 and function_3 too.
  • README.md is a single documentation file for all 3 functions
  • it’s not possible to have a Node.js, a Python and a Go function in the same folder

Good structure

Instead, I am proposing this structure, having none of the previous drawbacks:

  • Use 1 folder per Google Cloud Function
  • Add 1 README.md to every folder
  • Add 1 requirements.txt (or package.json, etc.) per Google Cloud Function

Don’t forget the function itself ;)

Code splitting

I have seen a few times a function divided into several files: a utils file, a config file… apparently for clarity. Why not. But Google Cloud Functions were designed for specific tasks and short functions. ⇒ If the code is so long that it needs to be divided into multiple files, maybe Google Cloud Functions isn’t the right product.

gcloud ignore

If we adopt the previous structure (and we should!) and deploy a function using the command line, we will see a README.md file in Google Cloud Functions. We don’t want that. It’s a useless file that doesn’t need to be deployed.

We can simply add a .gcloudignore file in every folder; this file lists itself and the README:

README.md
.gcloudignore

The final structure looks like this (function names are illustrative):
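.
├── function_1
│   ├── .gcloudignore
│   ├── README.md
│   ├── main.py
│   └── requirements.txt
└── function_2
    ├── .gcloudignore
    ├── README.md
    ├── index.js
    └── package.json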

With this simple structure change I could:

  • Create a JS GCF and a Python GCF in the same GitHub repository
  • Have distinct packages for specific functions
  • Have a different README for every function

This structure is proposed by Google Cloud in their Google Cloud Functions samples GitHub repository. They even create a subfolder to hold the function file. Honestly, I don’t know why they went that far 😀

Last but not least → GitHub

Once this structure is set up, it’s easy to be tempted to simply push updates to GitHub with messages like “update image link”, “update readme”…

That’s what they do in the GCF samples GitHub repo:

No

The structure previously proposed mixes many functions and many projects; committing this way doesn’t help the team see what the current projects are.

I suggest including the function name in every GitHub commit, like: “function_1: add attribute name”, “function_2: code refactoring”…

This way, it’s easier to see what has been developed recently, what the current projects are…

Hope to see a change in your commits, Google developers ;)

At first, don’t use Google Cloud Functions

After following tutorials about GCF, the natural reflex is to keep developing in the Cloud Functions environment itself, deploying after each update.

That’s not a good idea.

Tutorials share clean code, tested many times. But when normal developers are coding, can they expect their code to run properly at the first iteration? Let’s be honest: it’s never the case.

It’s not a good practice to deploy a GCF for every code change, because deploying a GCF takes time, up to 2 minutes. Add the testing time, plus the time to wait for the logs to appear, and it easily takes 5 minutes from an update to a result. It’s enormous!
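To give an idea of that loop: deploy, trigger the function, then pull the logs from the terminal (function name and flags are illustrative):

gcloud functions logs read your-function --region=us-central1 --limit=20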

Even a good developer will always forget a [ or a “ or a check. There is nothing more frustrating than getting a “KeyError” after a few minutes of waiting.

Deployment must come only at the very end, when the GCF is ready, when everything has been tested locally.

What I always do when starting a new Cloud Function project is open a Jupyter Notebook, code everything, structure my code and do all the necessary tests and checks.
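For an HTTP function, even the request object can be faked inside a notebook cell. A minimal sketch, assuming werkzeug (it ships with Flask) and a main(request) entry point like the one we write below:

from werkzeug.test import EnvironBuilder
from werkzeug.wrappers import Request

# Build a fake HTTP request, like the one Cloud Functions will pass in
builder = EnvironBuilder(method="POST", json={"name": "test"})
fake_request = Request(builder.get_environ())

# Call the entry point directly: no server, no deployment
main(fake_request)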

Once I can execute my whole code in a single notebook cell…

It’s not time to deploy…

It’s time to…

At second, use functions-framework

The code is clean, it’s time to run the function. But no need to deploy it yet: Google provides an emulator to run the function locally.

Functions-framework is a framework that spins up a local development server for quick testing.

The difference with plain local code is that it simulates the Cloud Functions server and responds to events.

When the functions-framework command is run, it simulates the server and gives a local address to call the function.

Let me explain the process for a python GCF:

  • First, in the GCF repository, install functions-framework:
pip install functions-framework
  • Secondly, create a folder test_functions_framework and include all the files we previously discussed; the folder now has this structure:
.
├── .gcloudignore
├── README.md
├── main.py
└── requirements.txt
  • In requirements.txt, add this line:
flask
  • In main.py add this simple code:
from flask import Response
import uuid

def main(request):
    try:
        print("I was called !", uuid.uuid4())
        return Response(response='ok', status=200)
    except Exception as e:
        print("ERROR ", e)
        return Response(response='AN ERROR OCCURRED', status=400)
  • In the test_functions_framework folder, we can run functions-framework with this command:
functions-framework --target=main --debug
  • It gives us a localhost address we can simply call; we directly get all the logs and the result, as if the function were deployed online.
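For example, with the server running (it listens on port 8080 by default), from a second terminal:

curl localhost:8080
# prints: ok
# meanwhile the functions-framework terminal logs: I was called ! <a uuid>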

Everything is set for deployment!

A last reason to crash

The function is clean and tested, the work is almost done, but there is still a chance that it crashes once deployed.

Because of authentication.

Chances are your function interacts with other Google Cloud services (Secret Manager, Pub/Sub, BigQuery…).

During development, you probably used a service account or your own credentials to access these services.

But once deployed, the Cloud Function doesn’t have the same access.

There are two possibilities:

During deployment, you can specify a service account to be used by the function (an example follows below).

If not, Cloud Functions uses a default service account as its identity for function execution:

  • Cloud Functions (1st gen) uses the App Engine default service account, PROJECT_ID@appspot.gserviceaccount.com.
  • Cloud Functions (2nd gen) uses the default compute service account, PROJECT_NUMBER-compute@developer.gserviceaccount.com.

Be sure that these service accounts have access to the resources needed by the function.
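For the first option, gcloud functions deploy accepts a --service-account flag, and a missing permission can be granted with gcloud as well (account name and role are illustrative):

gcloud functions deploy your-function --region=us-central1 --trigger-http --service-account=my-sa@your-project.iam.gserviceaccount.com

gcloud projects add-iam-policy-binding your-project --member="serviceAccount:my-sa@your-project.iam.gserviceaccount.com" --role="roles/secretmanager.secretAccessor"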

Now, there is no reason for the function to crash. If it crashes anyway, leave a comment and give me the reason.

You are ready for deployment, padawan!

At last, deploy! Using command lines!

I would say that over 80% of the tutorials about Google Cloud Functions explain how to deploy an instance using the UI.

It works.

But if you made it to this point, you are now mastering Google Cloud Functions and you don’t have time to click 3-4 buttons, wait for the UI to load and for the function to deploy.

Moreover, if you have carefully followed the first part, your code is well structured and you are ready to deploy using the command line.

“Life is too short to learn German and to deploy a function manually,” a great man once said.

Take 5 minutes to install the gcloud CLI and run a deploy command from a terminal.

It will take 2 minutes to deploy; meanwhile you can clap 50 times for this article ;)

Don’t forget to save your code before running the deploy command! ;)

Deploy for HTTP-triggered functions (Python):

gcloud functions deploy your-function --region=us-central1 --entry-point=main --runtime=python310 --trigger-http --no-allow-unauthenticated

Deploy for Pub/Sub-triggered functions (Node.js, with a longer timeout):

gcloud functions deploy your-function --runtime=nodejs14 --memory=128MB --timeout=200s --region=europe-west2 --trigger-topic=your-function

To see more, check the deploy documentation.

Another good tip is to automate deployment from Cloud Source Repositories: every time you push to GitHub, the function is deployed to Cloud Functions.
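A minimal sketch of such a pipeline with Cloud Build, assuming a cloudbuild.yaml next to the function (names and flags are illustrative):

steps:
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    args:
      - gcloud
      - functions
      - deploy
      - your-function
      - --region=us-central1
      - --runtime=python310
      - --entry-point=main
      - --trigger-http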

Well… I don’t like it much, because it means pushing code to GitHub that hasn’t been tested live.
