
Use AWS S3 and Ruby to back up your MySQL databases

Posted on Dec 7, 2016 in Code

Many web applications store their data in MySQL databases. Unfortunately, ordinary backups of the file system will not back up the data generated by the web application. This script dumps the MySQL data into a file and uploads it, with the day’s date attached, to an AWS S3 bucket (a simple and cheap place to store data). Make sure the bucket is private!

You will need your AWS Access Key ID and Secret Access Key active in the environment.
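As a quick sanity check before the backup runs, you can verify that both variables are present. This small helper is not part of the original script, just a sketch; the variable names are the ones the AWS SDK reads from the environment:

```ruby
# Sketch of a pre-flight check: the AWS SDK reads these two variables
# from the environment, so report any that are missing before we start.
def missing_aws_env
  %w[AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY].reject { |var| ENV.key?(var) }
end

warn "Missing AWS credentials: #{missing_aws_env.join(', ')}" unless missing_aws_env.empty?
```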

require 'aws-sdk'
require 'date'

# Upload the dump to S3 with today's date attached.
def upload_dump(mysql_u, mysql_p, bucket_name, region)
  # This executes the MySQL bundled command "mysqldump".
  # Change the folders if necessary on your system.

  # Build the string to execute. This is mainly done so it is easier to present.
  exec_string = "/usr/bin/mysqldump -u #{mysql_u} -p#{mysql_p} --all-databases |" \
                "gzip > /scripts/mysqldumps/backup_all_db 2> /scripts/mysqldumps/dump.log"
  # This runs the command
  system(exec_string)

  # Upload the dump to the bucket, with today's date appended to the key.
  s3 = Aws::S3::Resource.new(region: region)
  bucket = s3.bucket(bucket_name)
  fullkey = "/p/mysqldumps/backup_all_db#{Date.today}"
  object = bucket.object(fullkey)
  object.upload_file('/scripts/mysqldumps/backup_all_db')
rescue StandardError => e
  puts e
end


The function upload_dump takes these parameters:

  • mysql_u – a MySQL username with appropriate permissions to access the databases
  • mysql_p – the MySQL password for that user
  • bucket_name – the name of the AWS S3 bucket you want to use. Bucket names must be globally unique; make sure it is correct and corresponds to a private bucket you have set up!
  • region – the AWS region your bucket is in

The function then:

  1. Dumps the MySQL databases to a file. Note that you may need to adjust the paths: /usr/bin may not be correct on your system, and you may decide to store the local dumps and logs elsewhere.
  2. Creates a filename for the upload that has today’s date appended.
  3. Uploads the MySQL dump file generated in (1) with the filename in (2) to the bucket and region specified in the parameters. Note that if you change the folder names in the command in (1) you will need to specify the correct upload file location.
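Step (2) in miniature: interpolating Date.today appends an ISO-8601 date, so each day's upload gets a distinct object key in the bucket (the /p/mysqldumps prefix follows the script above; adjust to taste):

```ruby
require 'date'

# Date.today interpolates as YYYY-MM-DD, giving each day's
# backup a unique object key in the bucket.
fullkey = "/p/mysqldumps/backup_all_db#{Date.today}"
```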

Add a call to the function at the bottom of the script with the right parameters, run the script daily with a cron job, and just like that you have a daily off-site backup of your database data.
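For scheduling, a crontab entry along these lines would do it. The script path, log location, and time here are placeholders, not from the original post:

```cron
# crontab -e : run the backup script at 03:30 every day.
# /scripts/mysql_backup.rb is a hypothetical path to the script above.
30 3 * * * /usr/bin/ruby /scripts/mysql_backup.rb >> /scripts/mysqldumps/cron.log 2>&1
```

The bottom-of-script call itself looks like `upload_dump('backup_user', 'secret_pw', 'my-private-db-backups', 'us-east-1')`, with your own values substituted.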