├── .gitignore
├── LICENSE
├── README.md
└── mysqltos3.sh
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
*.pyc
.svn
nbproject
Thumbs.db
Desktop.ini
.DS_Store
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
Copyright (c) 2011 FoOlRulez

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
woxxy / MySQL-backup-to-Amazon-S3
=================================

(This is not really an application, just a manual and some lines of code)

Amazon S3 can be a surprisingly safe and cheap way to store your important data. Some of the most important data in the world is saved in...
MySQL, and mine is certainly quite important, so I needed a script like this.

If you have a 500MB database (ten times larger than most small sites'), keeping six backups (two months, two weeks, two days) on the priciest plan costs $0.42 a month ($0.14/GB per month), with 99.999999999% durability and 99.99% availability. Uploads are free, and downloads only happen if you actually need to retrieve a backup (which hopefully you won't; the first GB is free, and it's $0.12/GB after that).

Even better: you get one year free, up to 5GB of storage and 15GB of download. And if you don't need all that durability, you can later switch to the cheaper plan and spend $0.093/GB per month.

The cons: you need to give them your credit card number. If you're like me, Amazon already has it anyway.

Another really nice thing: HTTPS connections and GPG encryption through s3cmd. In theory, it's safe enough.

Setup
-----
1. Register for Amazon AWS (yes, it asks for a credit card)
2. Install s3cmd (the following commands are for Debian/Ubuntu; how-tos for other Linux distributions are at [s3tools.org/repositories](http://s3tools.org/repositories))

        wget -O- -q http://s3tools.org/repo/deb-all/stable/s3tools.key | sudo apt-key add -
        sudo wget -O/etc/apt/sources.list.d/s3tools.list http://s3tools.org/repo/deb-all/stable/s3tools.list
        sudo apt-get update && sudo apt-get install s3cmd

3. Get your access key and secret key at this [link](https://aws-portal.amazon.com/gp/aws/developer/account/index.html?ie=UTF8&action=access-key)
4. Configure s3cmd to work with your account

        s3cmd --configure

5. Make a bucket (the name must be globally unique; s3cmd will tell you if it's already taken)

        s3cmd mb s3://my-database-backups

6. Put the mysqltos3.sh file somewhere on your server, like `/home/youruser`
7. 
Give the file 755 permissions: `chmod 755 /home/youruser/mysqltos3.sh` (or set it via FTP)
8. Edit the variables near the top of mysqltos3.sh to match your bucket name and MySQL credentials

Now we're set. You can use it manually:

    # make a new daily backup, and keep the previous day's as "previous_day"
    sh /home/youruser/mysqltos3.sh day

    # make a new weekly backup, and keep the previous week's as "previous_week"
    sh /home/youruser/mysqltos3.sh week

    # make a new monthly backup, and keep the previous month's as "previous_month"
    sh /home/youruser/mysqltos3.sh month

But we don't want to think about it until something breaks! So run `crontab -e` and insert the following after editing the paths. (Careful: when both the day-of-month and day-of-week fields are restricted, cron runs the job when *either* of them matches, so a date test is needed to express "every day except Sundays and the 1st".)

    # daily MySQL backup to S3 (not on the first day of the month or on Sundays)
    0 3 * * 1-6 [ "$(date +\%d)" -ne 1 ] && sh /home/youruser/mysqltos3.sh day
    # weekly MySQL backup to S3 (on Sundays, but not on the first day of the month)
    0 3 * * 0 [ "$(date +\%d)" -ne 1 ] && sh /home/youruser/mysqltos3.sh week
    # monthly MySQL backup to S3
    0 3 1 * * sh /home/youruser/mysqltos3.sh month

Or, if you'd prefer to let the script determine the current date and day of the week itself, insert the following instead:

    # automatic daily / weekly / monthly backup to S3
    0 3 * * * sh /home/youruser/mysqltos3.sh auto

And you're set.


Troubleshooting
---------------

None yet.
--------------------------------------------------------------------------------
/mysqltos3.sh:
--------------------------------------------------------------------------------
#!/bin/sh

# Updates etc. at: https://github.com/woxxy/MySQL-backup-to-Amazon-S3
# Under an MIT license

# change these variables to what you need
MYSQLROOT=root
MYSQLPASS=password
S3BUCKET=bucketname
FILENAME=filename
DATABASE='--all-databases'
# the following line prefixes the backups with the defined directory;
# it must be blank or end with a /
S3PATH=mysql_backup/
# when running via cron, the PATHs MIGHT be different. If you have a custom/manual
# MySQL install, set this manually, e.g. MYSQLDUMPPATH=/usr/local/mysql/bin/
MYSQLDUMPPATH=
# tmp path
TMP_PATH=~/

DATESTAMP=$(date +".%m.%d.%Y")
DAY=$(date +"%d")
DAYOFWEEK=$(date +"%A")

PERIOD=${1-day}
if [ ${PERIOD} = "auto" ]; then
	if [ ${DAY} = "01" ]; then
		PERIOD=month
	elif [ ${DAYOFWEEK} = "Sunday" ]; then
		PERIOD=week
	else
		PERIOD=day
	fi
fi

echo "Selected period: $PERIOD."

echo "Starting to back up the database to a file..."

# dump all databases; abort before touching S3 if the dump fails
${MYSQLDUMPPATH}mysqldump --quick --user=${MYSQLROOT} --password=${MYSQLPASS} ${DATABASE} > ${TMP_PATH}${FILENAME}.sql || { echo "mysqldump failed, aborting."; exit 1; }

echo "Done backing up the database to a file."
echo "Starting compression..."

# -C keeps the archive path relative instead of embedding ${TMP_PATH}
tar czf ${TMP_PATH}${FILENAME}${DATESTAMP}.tar.gz -C ${TMP_PATH} ${FILENAME}.sql

echo "Done compressing the backup file."

# we want at least two backups: two months, two weeks, and two days
echo "Removing old backup (2 ${PERIOD}s ago)..."
s3cmd del --recursive s3://${S3BUCKET}/${S3PATH}previous_${PERIOD}/
echo "Old backup removed."

echo "Moving the backup from past $PERIOD to another folder..."
s3cmd mv --recursive s3://${S3BUCKET}/${S3PATH}${PERIOD}/ s3://${S3BUCKET}/${S3PATH}previous_${PERIOD}/
echo "Past backup moved."

# upload the new backup
echo "Uploading the new backup..."
s3cmd put -f ${TMP_PATH}${FILENAME}${DATESTAMP}.tar.gz s3://${S3BUCKET}/${S3PATH}${PERIOD}/
echo "New backup uploaded."

echo "Removing the cache files..."
# remove the database dump and the local tarball
rm ${TMP_PATH}${FILENAME}.sql
rm ${TMP_PATH}${FILENAME}${DATESTAMP}.tar.gz
echo "Files removed."
echo "All done."
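
For reference, the object key each run uploads to is assembled from the variables at the top of the script. A minimal sketch of that composition, using the script's placeholder defaults (the datestamp depends on the day you run it):

```shell
#!/bin/sh
# Sketch: how mysqltos3.sh composes the S3 destination key.
# Values below are the script's placeholder defaults, not real settings.
S3BUCKET=bucketname
S3PATH=mysql_backup/        # must be blank or end with a /
FILENAME=filename
PERIOD=day
DATESTAMP=$(date +".%m.%d.%Y")

# destination folder, then the tarball name inside it
DEST="s3://${S3BUCKET}/${S3PATH}${PERIOD}/"
echo "${DEST}${FILENAME}${DATESTAMP}.tar.gz"
```

So a daily run ends up under `.../mysql_backup/day/`, a weekly run under `.../mysql_backup/week/`, and so on, which is what makes the per-period rotation above work.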
--------------------------------------------------------------------------------
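
Nothing above covers restoring a backup. Here is a hypothetical restore helper, assuming s3cmd is already configured and using the same placeholder variable names as mysqltos3.sh; the function is only defined, not run, and every value in it is an assumption you would edit first:

```shell
#!/bin/sh
# Hypothetical restore sketch -- variable names mirror mysqltos3.sh,
# and all values are placeholders you would edit before using it.
restore_backup() {
	S3BUCKET=bucketname
	S3PATH=mysql_backup/
	FILENAME=filename
	TMP_PATH=~/
	PERIOD=${1-day}

	# fetch the current backup for the chosen period
	s3cmd get --recursive --force s3://${S3BUCKET}/${S3PATH}${PERIOD}/ ${TMP_PATH} || return 1
	# unpack the dump next to the tarball (adjust -C if your archive stores full paths)
	tar xzf ${TMP_PATH}${FILENAME}*.tar.gz -C ${TMP_PATH}
	# feed the dump back into MySQL
	mysql --user=root --password=password < ${TMP_PATH}${FILENAME}.sql
}

# Example (not executed here):
# restore_backup week
```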