Cloud Functions to Cloud Run

Neil Kolban
Google Cloud - Community
4 min read · Dec 18, 2020

When we use the GCP product called Cloud Functions, we supply the body of a function containing the code logic we wish to have executed. By doing this, we separate ourselves from any concern about how that function is invoked. It is Cloud Functions that causes our code to execute when incoming requests arrive. We do not have to develop any form of serving scaffolding. Cloud Functions also takes care of starting up as many instances as we need based on load and of scaling down to zero when no calls are in flight.

An alternative to Cloud Functions is the service known as Cloud Run. Cloud Run has similarities to Cloud Functions in that it scales to zero and takes care of the startup/shutdown of instances. However, the development model changes considerably. With Cloud Run, it becomes the developer's responsibility to construct a Docker image that Cloud Run will start as a container. The container is responsible for acting as a full REST server: receiving the incoming request and then passing control to the business logic code.

If we were to contrast these stories in diagrams, we would have the following:

This first diagram shows the function body (the code we want executed) being supplied to Cloud Functions. We focus exclusively on the function body and hand it off to Cloud Functions for execution.

For Cloud Run, the function body that we want executed is part of the REST Server Framework that we are also responsible for building. We then have to package all this in a Docker image and finally we hand the Docker image off to Cloud Run for execution. As we can see, there are additional steps.

What if we have previously built a Cloud Functions based solution and now wish to host it in Cloud Run? What stories do we have in that regard?

Here we will look at a feature of GCP called Build Packs. At the highest level, Build Packs can be considered a technology whose purpose is to take source code as input and generate Docker images as output. Classically, if one wants to create a Docker container, one has to manually write a Dockerfile and then pass it, along with the source, to Docker, with the result being a container image. While not onerous, this adds extra steps and hence introduces toil and opportunities for error. The Build Packs story automates detection of the programming language of the application to be hosted in a container, removing the burden of writing Dockerfiles and of selecting recipes to build the code within the container.
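For comparison, the hand-written Dockerfile that Build Packs make unnecessary might, for a simple Node application, look something like this (the base image and file names here are illustrative assumptions, not a recipe from the article):

```dockerfile
# Illustrative Dockerfile a developer would otherwise maintain by hand.
FROM node:14-slim
WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm install --production

# Copy the application source and declare how to start it.
COPY . .
CMD ["node", "index.js"]
```

With Build Packs, none of this file needs to be written: the language detection and build recipe are supplied for us.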

Let us look at an example of using a Build Pack. If we have an application that was written in Node, we may have a source file (index.js) and a package file (package.json). If we wish to build a container that contains our application, we could run:

gcloud builds submit --pack image=us-central1-docker.pkg.dev/[PROJECT]/[REPO]/[IMAGE],env=GOOGLE_FUNCTION_TARGET=functionName

There is a lot in that command, so let us take it apart. The command is a request to leverage the GCP Cloud Build service. Cloud Build is able to perform Docker container construction within GCP rather than in your local environment. Running gcloud builds submit takes the source files in the current directory, uploads them to GCP, and then performs container construction. Normally, Cloud Build expects to also be supplied a Dockerfile containing the recipe to build the container. By adding the --pack flag, we are leveraging Cloud Build's integration with the Build Pack story. By providing this flag, we are telling Cloud Build not to expect a Dockerfile as input but instead to use Build Pack technology to determine how to transform the source into a container. We also specify, via the image parameter, the repository into which the resulting image will be placed; in this example we use GCP's Artifact Registry. The final setting, the GOOGLE_FUNCTION_TARGET environment variable, is some specialized Google magic. If it is supplied, it informs Cloud Build that the source is only a function body and that the REST framework that will invoke the function has not been supplied. Cloud Build will then inject the Cloud Functions framework into the resulting container so that it will call the supplied function when a request arrives.

This is a wordy description. Making it simpler, if we give this command an input of two files:

  • index.js
  • package.json

where index.js contains only the function body as we might supply to a Cloud Function environment, then the result will be a Docker image stored in the GCP Artifact Registry. From that image, we can then create a Cloud Run instance which, when it is invoked, will call the function code found in index.js in exactly the same fashion as that exposed by a Cloud Functions deployment.

To illustrate the story further, here is a video showing a full end-to-end migration of a Cloud Function to Cloud Run.
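The end-to-end flow can be sketched as a command sequence like the following (the project, repository, image, service, and region names are all placeholders, and the function name assumes an export called helloWorld):

```shell
# Build a container from the bare source using a Build Pack; the
# GOOGLE_FUNCTION_TARGET variable names the exported function that the
# injected Functions Framework should call.
gcloud builds submit \
  --pack image=us-central1-docker.pkg.dev/my-project/my-repo/my-image,env=GOOGLE_FUNCTION_TARGET=helloWorld

# Deploy the resulting image as a Cloud Run service.
gcloud run deploy my-service \
  --image us-central1-docker.pkg.dev/my-project/my-repo/my-image \
  --region us-central1 \
  --allow-unauthenticated
```

After the deploy completes, invoking the service URL calls the function body just as a Cloud Functions deployment would.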

