How to send database backups from platform.sh to Google Cloud Storage?
Hi everyone,
Today’s story is about backing up your database from platform.sh to Google Cloud Storage.
This is a 4-step tutorial.
Requirements
- A running platform.sh project with a database set up (PostgreSQL in this example).
- A Google Cloud account with billing set up.
- SSH access to platform.sh.
Step 1: Google Cloud Platform
First, we are going to create a bucket in Google Cloud Storage. The database backups from platform.sh are going to be saved in this bucket.
Log in to the Google Cloud console and navigate to Cloud Storage. You should see your existing buckets there (none if you have never created any).
Create a new bucket:
- Name your bucket (“bucket_test_me” in this example).
- Choose the storage location of your data (“Region” — “europe-west9” in this example).
- Pick a storage class (“Archive” in this example).
- Choose your access control (“Uniform” in this example).
- Choose how to protect your data (“None” in this example).
- Finally, click on “Create”.
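If you prefer the command line, the same bucket can be created with gsutil. This is a sketch assuming you already ran gcloud auth login locally; the bucket name and region simply mirror the console example above, and the command is guarded so it is a no-op on a machine without gsutil:

```shell
# Same hypothetical names as in the console walkthrough above.
BUCKET=bucket_test_me
REGION=europe-west9

# Create the bucket: -l location, -c storage class, -b on = uniform
# bucket-level access. Guarded so this sketch is harmless without gsutil.
if command -v gsutil >/dev/null 2>&1; then
  gsutil mb -l "$REGION" -c archive -b on "gs://$BUCKET"
fi
```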
Optional (but recommended): you can create a service account to use with the gcloud auth login command instead of your user account. This gives you finer control over permissions.
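A minimal sketch of that service-account setup, assuming hypothetical names for the project and the account (adjust them to yours); the cloud calls are guarded so the snippet is harmless on a machine without gcloud:

```shell
# Hypothetical placeholders -- replace with your own values.
PROJECT_ID=your-project-id
SA_NAME=platformsh-backups
SA_EMAIL="$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com"

if command -v gcloud >/dev/null 2>&1; then
  # Create a service account dedicated to backups
  gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT_ID"
  # Grant it write access on the backup bucket only
  gsutil iam ch "serviceAccount:$SA_EMAIL:roles/storage.objectAdmin" gs://bucket_test_me
  # Later, on the platform.sh server, you would authenticate with its key
  # file instead of your user account:
  # gcloud auth activate-service-account "$SA_EMAIL" --key-file=key.json
fi
```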
Step 2: update your .platform.app.yaml file
First, let’s install the platform.sh CLI by adding the following to the build hook in the .platform.app.yaml file:
hooks:
  build: |
    ...
    curl -sS https://platform.sh/cli/installer | php
    ...
We’ll need the platform.sh CLI later when dumping the database on the platform.sh server.
Next, let’s add two mounts (backups and .gsutil):
mounts:
  ...
  'backups':
    source: local
    source_path: backups
  '/.gsutil':
    source: local
    source_path: .gsutil
  ...
“backups” is the mount where we will install the Google Cloud CLI and store the database dumps. “.gsutil” is where the Google Cloud CLI stores your credentials.
You can now deploy your code to production on platform.sh.
Step 3: install the Google Cloud CLI on platform.sh
Once your .platform.app.yaml file is deployed, connect to your platform.sh server via SSH to install and initialize the Google Cloud CLI.
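One way to connect is through the platform.sh CLI from your local machine. A sketch, where PROJECT_ID is a hypothetical placeholder for your project’s ID and main is the environment name; guarded so it does nothing where the CLI is absent:

```shell
# Hypothetical project ID -- replace with yours (see `platform projects`).
PROJECT_ID=abcdefg123456

# Open an SSH session on the main environment of the project.
if command -v platform >/dev/null 2>&1; then
  platform ssh -p "$PROJECT_ID" -e main
fi
```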
~$ cd backups
# create the dumps folder
~/backups$ mkdir dumps
# download and install google cloud CLI
~/backups$ curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-422.0.0-linux-x86_64.tar.gz
~/backups$ tar -xf ./google-cloud-cli-422.0.0-linux-x86_64.tar.gz
~/backups$ ./google-cloud-sdk/install.sh
# log in to Google Cloud. The next command prints a link; open it in your browser and follow the instructions to authenticate
~/backups$ ./google-cloud-sdk/bin/gcloud auth login
# set the project that is linked to the Google Cloud CLI
~/backups$ ./google-cloud-sdk/bin/gcloud config set project YOUR_GOOGLE_PROJECT_ID
Step 4: automate your backups
At this point, the gcloud CLI should be properly installed and initialized. We’ll now create the backup on platform.sh and send it to Google Cloud Storage.
Let’s create a bash script named database_backup.sh, stored in a cron folder in this example:
#!/usr/bin/env bash
# Name the dump after the current timestamp
TODAY_DATE=$(date +%Y-%m-%d-%H:%M:%S)
DUMP_FILE=${TODAY_DATE}-dump.sql
# Dump the database with the platform.sh CLI (gzip-compressed)
platform db:dump --gzip --file=backups/dumps/$DUMP_FILE
# Copy the dump(s) to the Google Cloud Storage bucket created in step 1
./backups/google-cloud-sdk/bin/gsutil cp -r ~/backups/dumps/* gs://bucket_test_me
# Remove the local dumps once uploaded
rm -rf ./backups/dumps/*
You can deploy this script to your production server. To test it, run:
$ bash ./cron/database_backup.sh
You should now see the backup in Google Cloud Storage.
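To check from the command line rather than the console, you can list the bucket’s contents; guarded so the sketch is harmless without gsutil or credentials:

```shell
# Same bucket as in step 1.
BUCKET=bucket_test_me

# List the uploaded dumps.
if command -v gsutil >/dev/null 2>&1; then
  gsutil ls "gs://$BUCKET"
fi
```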
Last but not least, let’s set up a cron job to fully automate the backups. In the .platform.app.yaml file, add:
crons:
  ...
  database_backup:
    spec: '0 3 * * *'
    cmd: |
      if [ "$PLATFORM_BRANCH" = main ]; then
        bash ./cron/database_backup.sh
      fi
In this example, the cron job runs every day at 3:00 am, and only on the main branch.
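The branch guard can be sanity-checked locally by simulating the $PLATFORM_BRANCH variable that platform.sh sets on each environment:

```shell
# Simulate the environment variable platform.sh provides.
PLATFORM_BRANCH=main

# Same guard as in the cron command above.
if [ "$PLATFORM_BRANCH" = main ]; then
  echo "backup would run"
else
  echo "backup skipped"
fi
# → backup would run
```

Setting PLATFORM_BRANCH to any other value prints "backup skipped", which is what happens on non-production environments.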
👾 Thank you for reading so far!