Automate Cloud SQL backups hourly using Cloud Scheduler

Devashish Patil
Google Cloud - Community
3 min read · Jan 9, 2023

While Google Cloud SQL offers automated backups and point-in-time recovery, you might still need to take backups at a higher frequency. One common reason for this is compliance.

On-demand backups can be taken via the console, gcloud, or the REST API. Automation is possible with gcloud or the REST API, but with gcloud you need something to run the command periodically. The most common approach is to spin up a VM and schedule the command with cron.
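For reference, a one-off on-demand backup with gcloud looks like the sketch below. The instance name and description are placeholders; replace them with your own values.

```shell
# Take a one-off on-demand backup of a Cloud SQL instance.
# INSTANCE_ID and PROJECT_ID are placeholders for your own values.
gcloud sql backups create \
  --instance=INSTANCE_ID \
  --project=PROJECT_ID \
  --description="manual hourly backup"
```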

The downside of this approach is that it wastes computing resources. And if the VM goes down, you'll need to account for that with some kind of failover.

This is where Cloud Scheduler comes into the picture. Cloud Scheduler is essentially a serverless cron service in the cloud: there is nothing to manage, and you pay only for what you use.

Creating a Cloud Scheduler job involves the following steps:

1. Define the Schedule

Here you define the cron expression that determines when the backup runs. For hourly execution, the expression is `0 * * * *` (at minute 0 of every hour).

2. Configure the Execution

In this section, you define the action you want to perform. At the time of writing, there are four types of targets: 1) hit an HTTP endpoint, 2) publish to a Pub/Sub topic, which in turn can trigger another service such as Cloud Functions, 3) call an App Engine HTTP endpoint, and 4) trigger a Google Cloud Workflow.

We will use the HTTP target here and send a POST request to the Cloud SQL Admin REST API. The endpoint has the following format:

https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/backupRuns

Remember to replace project-id and instance-id with your own values.

In the Auth header section, select 'Add OAuth token' and select or create a service account that has permission to create backups for Cloud SQL.

The last section is optional; use it if you want to add retry logic in case a run fails.

Once everything is configured, the job can be created, and hourly backups of the Cloud SQL instance will begin. You can also force a job run for testing purposes.
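The same job can also be created from the command line instead of the console. The sketch below uses hypothetical names (the job name `sql-hourly-backup`, the region `us-central1`, and the service account email are all placeholders):

```shell
# Create a Cloud Scheduler job that POSTs to the Cloud SQL Admin API every hour.
# PROJECT_ID, INSTANCE_ID, region, job name, and service account are placeholders.
gcloud scheduler jobs create http sql-hourly-backup \
  --location=us-central1 \
  --schedule="0 * * * *" \
  --uri="https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/backupRuns" \
  --http-method=POST \
  --oauth-service-account-email="scheduler-sa@PROJECT_ID.iam.gserviceaccount.com"

# Force a run immediately to test the job.
gcloud scheduler jobs run sql-hourly-backup --location=us-central1
```

The `--oauth-service-account-email` flag corresponds to the 'Add OAuth token' option in the console.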

A few points to note:

  1. Hourly backups are very frequent for a database and can have a negative impact on performance.
  2. If the database is large, a backup may take more than an hour to complete. You might see unexpected behavior in this case.
  3. [Important] These are created as on-demand backups and are not deleted automatically.
    You'll need to build logic to delete older backups; otherwise, costs will keep going up. One way to do this is to create another Cloud Scheduler job that publishes to a Pub/Sub topic, which in turn triggers a Cloud Function. That Cloud Function deletes older backups based on your retention policy.
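The core of that cleanup function is a retention check. The sketch below shows one way it might look, assuming backup entries shaped like the `backupRuns.list` API response (with `id`, `type`, and an RFC 3339 `windowStartTime`); the function name and the 24-hour retention default are illustrative choices, not part of any API.

```python
from datetime import datetime, timedelta, timezone

def backups_to_delete(backups, retention_hours=24, now=None):
    """Return the ids of on-demand backups older than the retention window.

    `backups` is a list of dicts shaped like entries from the Cloud SQL Admin
    backupRuns.list response: each has an 'id', a 'type' (e.g. 'ON_DEMAND' or
    'AUTOMATED'), and an RFC 3339 'windowStartTime' timestamp.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=retention_hours)
    stale = []
    for backup in backups:
        started = datetime.fromisoformat(
            backup["windowStartTime"].replace("Z", "+00:00")
        )
        # Only touch on-demand backups; leave automated backups to Cloud SQL.
        if backup.get("type") == "ON_DEMAND" and started < cutoff:
            stale.append(backup["id"])
    return stale
```

The Cloud Function would call `backupRuns.list` for the instance, pass the results through a check like this, and issue a `backupRuns.delete` for each stale id.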
