
Backup MySQL to Amazon S3

I've recently launched a site using MySQL as a database and wanted to be able to schedule database backups to Amazon S3.

I did a bit of searching and found Ray Camden's blog post from way back in 2006 about using cfexecute to create MySQL backups. I took what Ray had done and added some new functionality so that each database is backed up into a separate file, zipped and then uploaded to Amazon S3.

Here is my final script, which is called via a scheduled task.


<!--- MySQL backup details --->
<cfset path = "C:\Program Files\MySQL\MySQL Server 5.1\bin\mysqldump">
<cfset zipdir = "C:\backup\mysql\">
<cfset backupdir = zipdir & "databases\">
<cfset username = "root">
<cfset password = "letmein">

<cfset datetime = DateFormat(Now(), "YYYYMMDD" ) & TimeFormat(Now(), "HHMMSS" )>
<cfset zipfilename = datetime & "_dbbackup.zip">

<!--- get MySQL databases --->
<cfquery name="qDatabases" username="#username#" password="#password#">
SHOW DATABASES;
</cfquery>

<!--- loop and backup each database --->
<cfloop query="qDatabases">
    <cfset filename = "#datetime#_#qDatabases.Database#.sql">
    <cfexecute name='"#path#"'
        arguments=' --user=#username# --password=#password# --databases --log-error="#zipdir#" #qDatabases.Database#'
        outputfile='#backupdir##filename#'>
    </cfexecute>
    <cfoutput>backup: "#backupdir##filename#" successful<br></cfoutput>
</cfloop>

<!--- zip backups --->
<cfzip action="zip"
    source="#backupdir#"
    file="#zipdir##zipfilename#"
    overwrite="true">


<!--- upload to Amazon S3 --->
<cffile action="copy"
    source="#zipdir##zipfilename#"
    destination="s3://MyBucket/backup/">
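
For reference, the scheduled task itself can be registered with cfschedule rather than through the CF Administrator. This is only a sketch - the task name, URL and times below are placeholders, so swap in your own:

<cfschedule action="update"
    task="MySQLBackupToS3"
    operation="HTTPRequest"
    url="http://www.example.com/scheduled/dbbackup.cfm"
    startDate="01/01/2012"
    startTime="2:00 AM"
    interval="daily">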

My Amazon S3 credentials are defined in Application.cfc like so:


component
{
    this.name = Hash( getDirectoryFromPath( getCurrentTemplatePath() ) );
    // S3 details
    this.s3.accessKeyId = "A1B1C1D1E1F1G1H1";
    this.s3.awsSecretKey = "a1b2c3d4e5/A1B1C1D1E1F1G1H/1A2B";
}
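
As an aside, ColdFusion also lets you pass the credentials inline in the s3:// URL itself (in the form accessKeyId:awsSecretKey@bucket), which is an alternative to setting them in Application.cfc if different requests need different accounts. Something like this, where both keys are placeholders:

<!--- Alternative: credentials inline in the URL (accessKeyId and awsSecretKey are placeholders) --->
<cffile action="copy"
    source="#zipdir##zipfilename#"
    destination="s3://accessKeyId:awsSecretKey@MyBucket/backup/">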

 


3 comments

  1. Hi John,

    Good to see you taking steps into the AWS cloud offerings.

    We back up a stack of databases to both Amazon S3 and Rackspace Cloud Files using a piece of software called Jungle Disk, which creates a mapped drive on the system; anything put in there gets synced up to the online store - very effective.

    Another great tool from AWS is the RDS Relational Database Service, which allows you to run Oracle, MySQL or SQL Server databases right in the cloud, all automatically backed up to S3 with point-in-time backups.

    Both are alternatives to rolling your own backup/push script.

    Rob

    Comment by Robert Rawlins – May 10, 2012
  2. Hi Rob,

    Yeah - liking AWS so far! I have used JungleDisk for backing up local assets. Not used it on a web server though.

    The Amazon Relational Database Service sounds really interesting, as I know cloud hosting is not normally best suited to databases, so it's good that Amazon have tackled the issue!

    Comment by John Whish – May 11, 2012
  3. Personally I have a cron job that does a mysqldump to a Dropbox folder so the data is mirrored/available to my desktop and also covered by the CrashPlan service that runs on the box.

    Comment by Richard Herbert – December 19, 2012

