Cloud Functions Best Practices (3/4) : Secure the Cloud Functions

Beranger Natanelic
Google Cloud - Community
10 min read · Feb 2, 2023


Protect your Function perfectly

This article is part of a four-article series in which I give various advice about Google Cloud Functions development. This work is the result of two years of daily practice, deployment and monitoring. Some of these best practices come directly from the official documentation; others come from my experience, from what has proven to be the most effective. For any different point of view, feel free to comment on this (free) article.


<<< Cloud Functions Best Practices (1/4) : Get the environment ready

<<< Cloud Functions Best Practices (2/4) : Optimize the Cloud Functions

Cloud Functions Best Practices (4/4): Monitor and log the executions >>>

Secure the Cloud Functions

Super Secured Google Cloud Functions

Online tutorials are cool, mine are even super cool! ;) With a bunch of articles, anyone can learn how to use Secret Manager in Google Cloud Functions, how to protect a Cloud Function, how to rate limit them, how to use Cloud Storage, Cloud Pub/Sub and every other possible tool.


It has a major drawback ⇒ Tutorials are made for a single, short, specific topic.

And business projects are everything but short and specific.

Business projects combine different access levels, contain sensitive data, and act on real-world data with major impact if breached.

In this article, I explain how to protect the different sensitive layers of a Cloud Function.

Specific functions = specific permissions

In many tutorials (all of them, actually) explaining how to create a Cloud Function, the authors keep the default “Runtime Service Account”.

The Runtime Service Account is the identity used by the Function to call, trigger and share data with other GCP services.

That’s bad. No good. Niet.

Default runtime service account

A Function has its own identity (🤖), its own behaviour and, therefore, needs its own access.

See this structure:

function_1 accesses services that function_2 doesn’t, and vice versa. Using the default Runtime Service Account, both would have more permissions than needed, whereas function_2 only needs access to BigQuery and Cloud Vision.

Just above, we saw that the default Runtime Service Account is the “App Engine default service account”. This default service account has the Editor role, which gives functions broad access to many Google Cloud services.

Maybe, you don’t even have editor role yourself!

But if you have the Cloud Functions Admin role, you can do anything you want on the GCP project via a Cloud Function deployed with the default Runtime Service Account (I guess/hope this is only true if the Function was already deployed by someone with the Editor role).

⚠️ And if you can do anything, so can anyone with access to Cloud Functions.

To avoid this terrible situation, give an identity to every Cloud Function.

  • Create a service account
  • Grant it the appropriate roles, based on what resources the Function needs to access
  • Connect the service account to your function. If the function is not deployed yet, go to the “Runtime, build, connections and security settings” tab; if it was already deployed, edit it and go to the same tab. Update the Runtime service account with your new SA
  • Ah, after the update, your function may not work if you’ve forgotten some permissions… it happens. Just test it ;)

To live happily, live hidden

The immense majority of tutorials show how to send emails, send Slack notifications, prepare a coffee or post on Twitter via Cloud Functions using an HTTP trigger!

The HTTP trigger is convenient for tutorials, input, output, URL, click, click, check, zip, pouf, it’s live! No need to talk about Pub/Sub, create access, blablabla.

A beginner would naturally keep going with this setup.

He/she would keep going, leaving a Cloud Function accessible by everyone, and use this convenient HTTPS address in Postman to do various sensitive tasks, sometimes accessing costly services (some will even write a blog post and share this exact URL, yeah, it happens).

Dear reader, you are not a beginner, and if you were, this time is over:

Cloud Functions should never (never!) be public!


In this big subpart, I am talking about the different ways to hide a Cloud Function.

Go background

In many, many cases, Cloud Functions can work in the background, responding to events asynchronously.

Example cases:

  • React after a new file lands in a Cloud Storage bucket
  • Periodically check for new emails/tweets/messages…
  • Every morning, create a list of something from BigQuery data
  • Send a message every X minutes if a condition is verified
  • Periodically send notifications

These are simple examples, but if you check online tutorials, most of them suggest creating an HTTP cloud function.

They don’t even mention deleting this function at the end.

The truth is, if the trigger doesn’t need HTTP, the Function doesn’t have to be HTTP.

It could be a background Function, responding to an event:

  • New object in bucket
  • Cloud Scheduler trigger
  • New file in Google Drive
  • And more than 100 events using Eventarc (GCF gen 2)
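As an illustration, here is a minimal sketch of a gen 1 background Function reacting to a new object in a bucket (the function name and the work it does are hypothetical); the event payload arrives as a plain dict describing the file:

```python
def on_new_object(event, context):
    """Background Cloud Function triggered by a new file in a bucket.

    event: dict describing the Cloud Storage object (name, bucket, size...).
    context: metadata about the event (event_id, timestamp...).
    """
    file_name = event["name"]
    bucket = event["bucket"]
    # Do the actual work here: parse the file, load it into BigQuery...
    return f"Processing {file_name} from bucket {bucket}"
```

Deployed with --trigger-bucket, this Function has no public URL at all: nothing to expose, nothing to attack.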

If you check my previous best practices article, you will see the following suggested structure:

On the right side, we are using background functions to access Gmail, Slack…

Because an email is by nature asynchronous, Google Sheets updates and Slack notifications don’t need to be live. And when I say live, I mean really live, like a chat conversation ⇒ that needs to be live. Using background functions, there is about a 5s latency, which is acceptable for many, many cases.

The Functions on the right side are internal; only people and services with the correct access can invoke them, not a random bot or attacker.

I can assure you that this already makes a huge security difference (it also makes them reusable, but that’s another topic).

These background Functions can be triggered periodically via Cloud Scheduler, after an event via Eventarc, or by another Cloud Function.

I know what you think…

In the suggested structure, background Functions are invoked by HTTPS Functions, it doesn’t change anything…


True… First, I simply wanted to clarify that a Cloud Function doesn’t have to be HTTP: there are plenty of cases where a background function is enough and is never accessed by an HTTP function, therefore being 100% hidden.

Now, what if we still need to have an http Function?

Protect HTTP triggered Functions — OAuth

If an HTTP Function is required (i.e. a request is sent and waits for an answer), then the Cloud Function must have an authentication process.

Google Cloud proposes securing Cloud Functions with IAM roles and permissions using service accounts.

This solution requires a bearer authorization token for authentication; it is simple and secure, as the token changes periodically.

Let’s see how it works:

First, create a Cloud Function, select trigger HTTP and check “Require authentication”.

If you are using gcloud (and you should, because you read Cloud Functions Best Practices (1/4) : Get the environment ready), add --no-allow-unauthenticated at the end of the deploy command.

Once the Function is deployed, paste the given URL in a browser. You should see: “Error: Forbidden. Your client does not have permission to get URL /your_function from this server.”

Great! How can we call it then?

Using gcloud

gcloud functions call your_function --region europe-west2

The following result will be printed:

gcloud functions call your_function --region europe-west2
> executionId: 4fd95u1gge60
> result: HELLO BRO

I agree, this way is not so useful, except for testing.

Using Postman

Get your Google Identity token using gcloud:

gcloud auth print-identity-token

In Postman, paste the function URL and add an Authorization header with the identity token:

Authorization: Bearer eyJhbGciOiJ…

Also useful for testing, now let’s call a protected Function from another Function.

Call a Function from another Function?

To call a protected Function from another Function, we need to authorise and identify the caller.

The main idea is:

  • Allow function_1 to be called by function_2 using function_2’s service account (remember part 1?)
  • In function_2, generate a Google-signed ID token using the Google authentication libraries
  • Include the ID token in an Authorization: Bearer ID_TOKEN header in the request to the function

The full code for function_2 to be able to call function_1 (protected):
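A minimal sketch in Python, following the official authentication pattern (the function URL is a placeholder to replace with your own; the google libraries are imported lazily, as recommended in part 2 of this series):

```python
import urllib.request

# Placeholder: replace with the URL of your protected function_1
FUNCTION_URL = "https://europe-west2-my-project.cloudfunctions.net/function_1"

def bearer_header(token: str) -> dict:
    """Build the Authorization header expected by the protected Function."""
    return {"Authorization": f"Bearer {token}"}

def function_2(request):
    # google-auth is pre-installed in the Cloud Functions Python runtime;
    # imported lazily to keep cold starts fast when this path isn't used
    import google.auth.transport.requests
    import google.oauth2.id_token

    # Generate a Google-signed ID token; the audience MUST be function_1's URL
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, FUNCTION_URL)

    # Call function_1 with the ID token in the Authorization header
    req = urllib.request.Request(FUNCTION_URL, headers=bearer_header(token))
    with urllib.request.urlopen(req) as response:
        return response.read().decode("utf-8")
```

This works because function_2 runs as its own service account (remember the first part of this article), which you granted the Cloud Functions Invoker role on function_1.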

To learn more, the official documentation is very clear.

Protect HTTP triggered functions — VPC

This article wouldn’t be complete if I didn’t mention VPC.

A Function can be placed inside a VPC. VPC Service Controls is a set of tools you can use to restrict access to Google’s services within your GCP projects.

It’s like a big box where we can find some GCP services able to communicate together with no requirements for authentication.

This box is not accessible from the outside world, ensuring strong security.

The process is a bit complex and requires specific networking skills. I won’t go into details because this article is already way too long, but you can find all the steps in the official documentation.

Protect HTTP triggered functions — API Key

For some use cases, OAuth 2 is not an ideal solution, but we still want to filter requests. For example, if a Function has to be accessed by an external app, we need an API key.

Well… Google Cloud Functions doesn’t natively support this type of authentication.

If API keys aren’t proposed for securing API endpoints, it’s because of their higher security risk.

We need to add an additional layer : API Gateway!

API Gateway is, before everything, a gateway. Meaning it can manage our APIs, route requests to the appropriate function(s), log and monitor requests AND it has built-in security mechanisms, including authentication and key validation.

Along with these pros, API Gateway lets us put an API key in front of all the functions specified in the config.

Cool, huh?

To create a Gateway, we need:

  • A Cloud Function with --no-allow-unauthenticated ⇒ this will protect our Function
  • API Gateway enabled for the GC project
  • A YAML Config file

The YAML file would look like that:
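For example, a minimal sketch of such a config (title, project, region and path are placeholders), requiring an API key and routing requests to the private Function:

```yaml
# api-config.yaml — OpenAPI 2.0 spec consumed by API Gateway
swagger: "2.0"
info:
  title: my-api
  version: "1.0.0"
schemes:
  - https
produces:
  - application/json
# Require an API key, passed as ?key=... in the query string
securityDefinitions:
  api_key:
    type: apiKey
    name: key
    in: query
paths:
  /your_function:
    get:
      summary: Call the protected function
      operationId: yourFunction
      security:
        - api_key: []
      # Route the request to the (private) Cloud Function
      x-google-backend:
        address: https://europe-west2-my-project.cloudfunctions.net/your_function
      responses:
        "200":
          description: OK
```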

And that’s all!

This solution isn’t perfect and should be used only if an external service is using the API!

To see the full process step by step, I recommend this widely appreciated article.

Rate limit

Once you are used to API Gateway, you might want to protect your routes against DDoS attacks and floods of any kind. You can do that using rate limiting.

With Cloud Functions, this is only possible using API Gateway (as of January 2023).

Rate limiting allows you to protect your route with an API key that you send to your API users. You can give different API keys to different users and limit their number of calls per minute.
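As a sketch, quotas are declared in the same gateway config file (the metric name and limit below are illustrative) and each path declares what a call costs:

```yaml
# Extract of the API Gateway config: declare a metric and a per-minute quota
x-google-management:
  metrics:
    - name: function-calls
      displayName: Function calls
      valueType: INT64
      metricKind: DELTA
  quota:
    limits:
      - name: function-calls-limit
        metric: function-calls
        unit: "1/min/{project}"
        values:
          STANDARD: 60   # 60 calls per minute per consumer project
paths:
  /your_function:
    get:
      operationId: yourFunction
      # Each call to this path consumes 1 unit of the metric
      x-google-quota:
        metricCosts:
          function-calls: 1
      responses:
        "200":
          description: OK
```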

The process is quite easy and is explained in this brilliant article.

Update periodically

When you deploy your Function, Cloud Functions downloads and installs the dependencies declared in requirements.txt, package.json or any other file depending on your language.

Yup! Google Cloud Functions downloads and installs dependencies during deployment!

It means two things:

  • Once your Function is deployed, it doesn’t update any package by itself
  • Putting a “^” or “>” to say “hey, take the latest available version” when declaring the required packages is risky and can lead to failures
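For example, in a Python Function, pin exact versions in requirements.txt instead of floating ranges (the versions below are purely illustrative):

```
# requirements.txt — pinned versions make deployments reproducible
Flask==2.2.2          # instead of Flask>=2.0, which may pull a breaking release
itsdangerous==2.1.2   # pin fragile transitive dependencies too
```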

Packages used in Functions will be updated when, and only when, Functions are deployed

So, if there is a security leak in a package, critical bug fixes in a new version, if a faster version is published, if a failure case is managed by a newer version… All Functions using these packages will stand still!

Above, I said that putting a “^” or “>” to say “hey, take the latest available version” is risky: imagine you created a Function last year, didn’t update it for a year (bad), and you just want to update a string in the function. No code change, only a string.

You tell your boss “oh, it takes 10 seconds to update that!” and you deploy… BAM, CRASH.

Crash, because one of the packages used last year removed a method or updated its dependencies. And because you said during deployment “hey, take the latest available version”, you now have a new version that is not compatible with the other packages. You now have to dig deeper to find the issue, fix it and update your function.

It happened to me this year with an update from Flask and itsdangerous.

The same situation is true with runtime languages. Google does not automatically update the base image in use for already-deployed functions.

On November 14, 2022, the GCF team fixed a security leak. The only way to be up to date was to redeploy all GCFs.

⇒ Redeploy your Functions periodically to always be up to date. And update the runtime languages if needed (don’t skip the testing part even if the code wasn’t edited).

Secret manager

Not so related to Cloud Functions itself but still an important security concept:

  • Setting passwords and API keys directly in the Cloud Functions code is terrible
  • Setting passwords and API keys in environment variables is terrible
  • Setting passwords and API keys in environment variables using the Secret Manager native implementation is better, but not ideal

If you are using passwords, API keys or any kind of credentials in your Cloud Functions, use Secret Manager to store them. That way, you control access and update them in one single place.

To use Secret Manager in a Google Cloud Function, I highly recommend the Secret Manager client library or the native implementation with the secret mounted as a volume.
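A minimal sketch with the client library (project and secret names are placeholders; the package is declared in requirements.txt as google-cloud-secret-manager and imported lazily):

```python
def secret_path(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Build the full resource name of a secret version."""
    return f"projects/{project_id}/secrets/{secret_id}/versions/{version}"

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fetch a secret value from Secret Manager."""
    # Imported lazily; requires google-cloud-secret-manager in requirements.txt
    from google.cloud import secretmanager

    client = secretmanager.SecretManagerServiceClient()
    name = secret_path(project_id, secret_id, version)
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")
```

Call get_secret() once at cold start and cache the value, rather than fetching it on every request.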

You can check this widely read article for a full comparison and explanation of the methods.


I hope this article brings your Cloud Functions development to a more secure level.

Previous parts:

<<< Cloud Functions Best Practices (1/4) : Get the environment ready

<<< Cloud Functions Best Practices (2/4) : Optimize the Cloud Functions

Next part:

Cloud Functions Best Practices (4/4): Monitor and log the executions >>>

Thank you for reading, thank you for supporting!

Don’t forget to change the world!


