How to schedule a Cloud Function with parameters

Use Cloud Scheduler to schedule a Cloud Function in Google Cloud with varying parameters

Lee Doolan
Qodea Google Cloud Tech Blog
6 min read · Jan 18, 2023


Introduction

Google Cloud Functions are a great serverless way to deploy your code and have it run when triggered by external events, such as a file landing in a Cloud Storage bucket, a message being published to a Pub/Sub topic, or even a simple HTTP call.

But what if we simply want a cloud function to run at a specific time each day, or on some similar schedule?

And what if we want to schedule the same function but with different parameters?

Luckily, we can use Google Cloud Scheduler for this: a serverless, fully managed, enterprise-grade cron job scheduler.

You can read the full documentation for Cloud Scheduler here.

Practical usage

To be honest, the magic is whatever you develop in your Cloud Function, but easier scheduling opens up many practical uses, such as:

  • Running daily big data / data science pipelines
  • Nightly jobs to shut down expensive cloud resources not required out of core hours
  • Starting backups for databases or other maintenance tasks
  • Batch emailing customers subscribed to a contact list
  • Copying BigQuery datasets between regions

Overview of the Method

In this short blog we will cover deploying a simple function and then scheduling it twice, with different parameters, using Cloud Scheduler.

All the commands used here are grouped into scripts, with an optional Makefile, in the shared git repo here.

Just pull out, modify and use what you need!

Environment Config
All config items are set per environment in the git repository folder config.

Environment config files are named ${ENV}_env, where ${ENV} is the name of the required environment (e.g. dev, test or prod) and is a system variable.

Config variables in the config file are described as below:

https://github.com/leedoolan77/cloud-function-schedule#environment-config
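For illustration, a dev config file might look like the following. The variable names match those used by the gcloud commands later in this post, but the values are placeholders; see the repo link above for the authoritative list.

```shell
# config/dev_env -- illustrative values only; check the repo for the real
# variable names and your own project details.
export PROJECT_ID="my-scheduler-demo"      # placeholder project id
export REGION="europe-west2"               # region for the function and scheduler
export FUNCTION="hello_scheduler"          # placeholder Cloud Function name
export IAM_ACCOUNT="scheduler-invoker"     # service account short name (no domain)
```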

Note: All build scripts run the `scripts/env.sh` script first to gather variables.

This is currently hard-coded to set the `ENV` system variable to the `dev` environment; change or remove this if you set the variable elsewhere.
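A minimal sketch of what such a script might do (the real `scripts/env.sh` in the repo may differ):

```shell
# Sketch of scripts/env.sh: default ENV to dev, then source the matching
# per-environment config file if it exists.
export ENV="${ENV:-dev}"                 # hard-coded default environment
CONFIG_FILE="config/${ENV}_env"
if [ -f "$CONFIG_FILE" ]; then
  . "$CONFIG_FILE"                       # pulls in PROJECT_ID, REGION, etc.
fi
echo "Environment set to: ${ENV}"
```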

Develop Cloud Function
My example deploys a simple Python Cloud Function, stored in the git repository folder src/Cloud Function, which requires two files:

  • main.py: the Python code that runs your Cloud Function. Change this to do whatever you require.
  • requirements.txt: any additional modules your Python code needs to run, e.g. google-cloud-pubsub==2.13.10

You can read more about developing Cloud Functions here.

Deploy Cloud Function
Script files have been created to help with the deployment. Before deploying, ensure you have added the required config items for your environment as detailed above.

You will need to ensure you have rights to create cloud functions, cloud scheduler jobs and set IAM policies depending on the tasks you are carrying out.

Scripts should be run in the order below (if required).

Note: they can also be called via the Makefile. Simply run commands from the root of the repository, like below:

make auth_gcp
make project_create
etc

  1. scripts/auth_gcp.sh — To authenticate with Google Cloud, and set the current project to the one defined in the config.
  2. scripts/project_create.sh — OPTIONAL to create the Cloud Function Project in required location and attach the billing account.
  3. scripts/iam_create.sh — OPTIONAL to create your Cloud Function IAM invoke user.
  4. scripts/function_create.sh — To create your Cloud Function and give your IAM user invoke rights.

IMPORTANT NOTE 1: The Cloud Function will be created using default settings. Change as required.

IMPORTANT NOTE 2: The IAM user may need additional permissions to use other resources. Add as required.

These scripts run simple gcloud commands — if you’re not familiar with them, it may be useful to view them in the git repository here.
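For reference, the core of `scripts/function_create.sh` is most likely a deploy command along these lines. The runtime, source path and entry point here are assumptions; check the repo script for the exact flags.

```shell
# Sketch of a Cloud Function deploy with default-ish settings. Requires an
# authenticated gcloud CLI and the config variables sourced beforehand.
gcloud functions deploy "${FUNCTION}" \
  --project="${PROJECT_ID}" \
  --region="${REGION}" \
  --runtime=python310 \
  --trigger-http \
  --source="src/Cloud Function" \
  --entry-point=main \
  --no-allow-unauthenticated    # only the IAM invoker account may call it
```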

Schedule Cloud Function

As this blog is predominantly about scheduling a Cloud Function, I will show here the required gcloud commands that create the schedules.

A reminder: we will be creating two schedules that call the same Cloud Function with different parameters.

The official Google documentation can be found here; we will be using the OIDC token type to authenticate.

Below are the commands we will run:

echo "Set variables"
SCHEDULE_NAME="Dummy_Schedule"
SCHEDULE_DESCR="Execute a cloud function"
SCHEDULE_CRON="0 10 * * *"

echo "Enable required services"
gcloud --project="${PROJECT_ID}" services enable cloudscheduler.googleapis.com
sleep 5

echo "Create the schedules"
gcloud --project="${PROJECT_ID}" scheduler jobs create http "${SCHEDULE_NAME}_1" --location="${REGION}" --description="${SCHEDULE_DESCR}" \
--schedule="${SCHEDULE_CRON}" --time-zone="Europe/London" \
--http-method="post" \
--uri="https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION}?message=Dummy_Schedule_1" \
--oidc-service-account-email="${IAM_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--oidc-token-audience="https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION}"

gcloud --project="${PROJECT_ID}" scheduler jobs create http "${SCHEDULE_NAME}_2" --location="${REGION}" --description="${SCHEDULE_DESCR}" \
--schedule="${SCHEDULE_CRON}" --time-zone="Europe/London" \
--http-method="post" \
--uri="https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION}?message=Dummy_Schedule_2" \
--oidc-service-account-email="${IAM_ACCOUNT}@${PROJECT_ID}.iam.gserviceaccount.com" \
--oidc-token-audience="https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION}"

Key things to note with this command are:

  • The schedule and time-zone parameters together determine the run times and frequency. The schedule uses standard unix-cron notation; in this example both schedules run daily at 10am.
  • The http-method controls the request type sent to the cloud function.
  • The uri includes the HTTP trigger address plus the query-string parameters we want to send, which is what varies the function's behaviour.
  • The OIDC parameters control authentication between the scheduler and the function, i.e. which service account is allowed to invoke it. Note: the HTTP trigger URL is repeated here as the token audience, and is required.
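To sanity-check the authentication and parameter handling outside the scheduler, you can call the function yourself with an OIDC identity token. This mimics what the scheduler job sends; the URL pieces come from your config variables.

```shell
# Manually POST to the function the same way the scheduler does, using your
# own identity token. Substitute your real project, region and function name.
URL="https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION}?message=Dummy_Schedule_1"
curl -s -X POST \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  "$URL"
```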

These commands also exist in the git repo as a script:

scripts/schedule_create.sh — To create the schedule for your Cloud Function.

IMPORTANT NOTE: The Cloud Scheduler job will be created using the given settings. Change as required.

Testing
You can use the console here to check the schedule has been created successfully, and also force-run it for testing purposes.

All being well, the schedule will send the message to the cloud function, which will run and return a green success status as shown below:

Cloud Scheduler Jobs in Google Cloud Console

Both Cloud Scheduler and Cloud Function have logs where success and any failures can be monitored.

Note: Cloud Scheduler is designed to report the success or failure of the Cloud Function, but occasionally this link can ‘break’.

If a schedule is continually reporting failure, it is worth checking the logs and status of the underlying Cloud Function directly.
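The force-run and log checks can also be done from the command line, e.g.:

```shell
# Trigger a scheduler job immediately rather than waiting for its cron time,
# then read the function's recent logs to confirm it ran.
gcloud scheduler jobs run "Dummy_Schedule_1" \
  --location="${REGION}" --project="${PROJECT_ID}"

gcloud functions logs read "${FUNCTION}" \
  --region="${REGION}" --project="${PROJECT_ID}" --limit=20
```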

Conclusion

This is a fairly short and straightforward blog; in all honesty I could have just highlighted the simple gcloud command that creates the Cloud Scheduler jobs, but I wanted to give the process more context, in particular how to send parameters to your Cloud Function.

This method doesn’t require an intermediate Pub/Sub topic or any other resources, which many solutions I’ve seen do. You just need the Cloud Function and the Cloud Scheduler job.

I hope this demonstrates how easy it is to schedule and automate regular tasks, making use of single cloud functions with varying parameters. A truly pay-as-you-go serverless solution!

P.S. Big thanks to my CTS colleague Max Buckmire-Monro for their help with this publication.

About CTS

CTS is the largest dedicated Google Cloud practice in Europe and one of the world’s leading Google Cloud experts, winning 2020 Google Partner of the Year Awards for both Workspace and GCP.

We offer a unique full stack Google Cloud solution for businesses, encompassing cloud migration and infrastructure modernisation. Our data practice focuses on analysis and visualisation, providing industry-specific solutions for Retail, Financial Services, and Media and Entertainment.

We’re building talented teams ready to change the world using Google technologies. So if you’re passionate, curious and keen to get stuck in — take a look at our Careers Page and join us for the ride!

Lee Doolan

Cloud Data Warehouse Architect & Data Engineer | UK Based | https://www.linkedin.com/in/leedoolan77 | Thoughts are my own and not of my employer