Google Cloud SQL database exports using Cloud Scheduler


This is going to be a quick and to-the-point blog. As many of you know (if you’re here at this point, I know you know), Google Cloud SQL backups are directly tied to the instance itself. WHAT!? That means if the instance is deleted, so are your backups. As a DevOps engineer, I find this a very frustrating issue that should be addressed ASAP. Until Google fixes it, here we go.

  • Cloud SQL Instance name: itsmetommy
  • Database name: itsmetommy-db

https://cloud.google.com/solutions/scheduling-cloud-sql-database-exports-using-cloud-scheduler

Create Cloud Storage bucket

export BUCKET_NAME="itsmetommy-cloud-sql-backup"

gsutil mb -c archive gs://$BUCKET_NAME
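
If you want to confirm the bucket landed in the Archive storage class, you can inspect its metadata and look for the Storage class field:

gsutil ls -L -b gs://${BUCKET_NAME}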

Add a lifecycle rule (optional).

https://cloud.google.com/storage/docs/gsutil/commands/lifecycle

Create lifecycle.json.

cat <<EOF > lifecycle.json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

Add lifecycle.

gsutil lifecycle set lifecycle.json gs://$BUCKET_NAME

Verify.

gsutil lifecycle get gs://$BUCKET_NAME

Grant permissions to your existing Cloud SQL instance service account

Because my Cloud SQL instance already exists, I need to give its service account access to my bucket.

export PROJECT_ID=$(gcloud config get-value project)
export CLOUD_SQL_INSTANCE="itsmetommy"

CLOUD_SQL_SA=$(gcloud sql instances describe ${CLOUD_SQL_INSTANCE} \
  --project=${PROJECT_ID} \
  --format="value(serviceAccountEmailAddress)")

Grant your Cloud SQL instance service account write permissions to export data to Cloud Storage:

Create a storage role.

export STORAGE_ROLE="cloud.storage.sql.backup"

gcloud iam roles create ${STORAGE_ROLE} \
  --project ${PROJECT_ID} \
  --title "Cloud SQL Storage Role" \
  --description "Grant permissions to view and create objects in Cloud Storage" \
  --permissions "storage.objects.create,storage.objects.get"

Assign the Storage Role to your bucket.

gsutil iam ch serviceAccount:${CLOUD_SQL_SA}:projects/${PROJECT_ID}/roles/${STORAGE_ROLE} gs://${BUCKET_NAME}
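
To confirm the binding took effect, you can dump the bucket’s IAM policy and look for the custom role:

gsutil iam get gs://${BUCKET_NAME}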

Create Cloud Function service account

Create a Cloud Function service account.

export CLOUD_FUNCTION_SA="cloud-function-sql-backup"

gcloud iam service-accounts create ${CLOUD_FUNCTION_SA} \
  --display-name "Service Account for Google Cloud Function and SQL Admin API"

Create Cloud SQL export role.

export SQL_ROLE="cloud.sql.exporter"

gcloud iam roles create ${SQL_ROLE} \
  --project ${PROJECT_ID} \
  --title "SQL Exporter Role" \
  --description "Grant permissions to export data from a Cloud SQL instance to a Cloud Storage bucket as a SQL dump or CSV file" \
  --permissions "cloudsql.instances.export"

Attach the Cloud SQL export role to the Cloud Function service account.

gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member="serviceAccount:${CLOUD_FUNCTION_SA}@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="projects/${PROJECT_ID}/roles/${SQL_ROLE}"

Create Pub/Sub topic

Create a Pub/Sub topic, which is used to trigger the Cloud Function that interacts with the Cloud SQL database.

Create the Pub/Sub topic.

export PUBSUB_TOPIC="cloud-sql-backup"

gcloud pubsub topics create ${PUBSUB_TOPIC}
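
You can verify the topic was created:

gcloud pubsub topics describe ${PUBSUB_TOPIC}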

Create Cloud Function

Create a directory and cd into it.

mkdir cloud-sql-backup && cd cloud-sql-backup

Create main.py.

cat <<'EOF' > main.py
import base64
import logging
import json

from datetime import datetime
from httplib2 import Http

from googleapiclient import discovery
from googleapiclient.errors import HttpError
from oauth2client.client import GoogleCredentials


def main(event, context):
    # Decode the Pub/Sub message published by Cloud Scheduler.
    pubsub_message = json.loads(base64.b64decode(event['data']).decode('utf-8'))
    credentials = GoogleCredentials.get_application_default()

    # Build a Cloud SQL Admin API client.
    service = discovery.build('sqladmin', 'v1beta4', http=credentials.authorize(Http()), cache_discovery=False)

    datestamp = datetime.now().strftime("%Y%m%d%H%M")  # format timestamp: YearMonthDayHourMinute
    uri = "{0}/backup-{1}-{2}.gz".format(pubsub_message['gs'], pubsub_message['db'], datestamp)

    instances_export_request_body = {
        "exportContext": {
            "kind": "sql#exportContext",
            "fileType": "SQL",
            "uri": uri,
            "databases": [
                pubsub_message['db']
            ]
        }
    }

    try:
        request = service.instances().export(
            project=pubsub_message['project'],
            instance=pubsub_message['instance'],
            body=instances_export_request_body
        )
        response = request.execute()
    except HttpError as err:
        logging.error("Could NOT run backup. Reason: {}".format(err))
    else:
        logging.info("Backup task status: {}".format(response))
EOF
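
For reference, the function expects a Pub/Sub message body carrying the four keys it reads (db, instance, project and gs). With my names it looks something like this (the project ID below is a placeholder):

{"db": "itsmetommy-db", "instance": "itsmetommy", "project": "my-project-id", "gs": "gs://itsmetommy-cloud-sql-backup"}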

Create a requirements.txt.

cat <<EOF > requirements.txt
google-api-python-client
oauth2client
EOF

Deploy the code.

When you are asked if you want to allow unauthenticated invocations of the new function, answer no.

export CLOUD_FUNCTION="cloud-sql-backup"

gcloud functions deploy ${CLOUD_FUNCTION} \
  --trigger-topic ${PUBSUB_TOPIC} \
  --runtime python37 \
  --entry-point main \
  --service-account ${CLOUD_FUNCTION_SA}@${PROJECT_ID}.iam.gserviceaccount.com

Example

gcloud functions deploy ${CLOUD_FUNCTION} \
  --trigger-topic ${PUBSUB_TOPIC} \
  --runtime python37 \
  --entry-point main \
  --service-account ${CLOUD_FUNCTION_SA}@${PROJECT_ID}.iam.gserviceaccount.com
Allow unauthenticated invocations of new function [cloud-sql-backup]?
(y/N)?  N

Describe Cloud Function

gcloud functions describe ${CLOUD_FUNCTION}
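
If you only care whether the deploy finished, you can pull just the status field; it should read ACTIVE once the function is live:

gcloud functions describe ${CLOUD_FUNCTION} --format="value(status)"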

Create Cloud Scheduler job

Create a Cloud Scheduler job to periodically execute the data export function; the schedule below runs daily at 23:00.

https://crontab.guru/

export CLOUD_SCHEDULER_JOB="itsmetommy-sql-backup"
export DATABASE="itsmetommy-db"

gcloud scheduler jobs create pubsub ${CLOUD_SCHEDULER_JOB} \
  --schedule '0 23 * * *' \
  --topic ${PUBSUB_TOPIC} \
  --message-body '{"db":'\"${DATABASE}\"',"instance":'\"${CLOUD_SQL_INSTANCE}\"',"project":'\"${PROJECT_ID}\"',"gs":'\"gs://${BUCKET_NAME}\"'}' \
  --time-zone 'America/Los_Angeles'
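
You can confirm the job and its schedule with:

gcloud scheduler jobs describe ${CLOUD_SCHEDULER_JOB}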

Test

Run the Cloud Scheduler job manually to trigger a MySQL dump of your database.

gcloud scheduler jobs run ${CLOUD_SCHEDULER_JOB}

List the operations performed on your MySQL instance.

gcloud sql operations list --instance ${CLOUD_SQL_INSTANCE} --limit 1

Example

gcloud sql operations list --instance ${CLOUD_SQL_INSTANCE} --limit 1
NAME                                  TYPE    START                          END                            ERROR  STATUS
d5422849-e009-4b39-bc3c-571ccf621c06  EXPORT  2020-07-07T01:23:23.214+00:00  2020-07-07T01:23:33.713+00:00  -      DONE

Check your Cloud Storage bucket to see if the database dump file was created.

gsutil ls gs://${BUCKET_NAME} | grep backup-${DATABASE}

Example

gsutil ls gs://${BUCKET_NAME} | grep backup-${DATABASE}
gs://itsmetommy-cloud-sql-backup/backup-itsmetommy-db-202007070123.gz

Logs

gcloud functions logs read ${CLOUD_FUNCTION}
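
You can also limit how much history comes back, for example the last ten entries:

gcloud functions logs read ${CLOUD_FUNCTION} --limit 10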