
My amazing web server backup script

For my own server setup, I’ve just finished an improved version of my automated backup script.

It backs up everything in the Apache document root and all my virtual host directories, and also takes a MySQL dump, then saves all three files (with the date in the filename) in a specified directory. But it gets better: it also copies the backups to another location on a different physical disk for redundancy, and it automatically prunes old backups, keeping only a specified number in each folder and deleting the rest.

It runs daily automatically via cron on my machine.
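
For reference, a crontab entry along these lines handles the scheduling; the script path and the 3 am run time here are just examples rather than anything from my actual setup:

# run the backup every night at 3:00 (example path)
0 3 * * * /root/Scripts/backup.sh >/dev/null 2>&1

Redirecting the output to /dev/null stops cron from emailing you the full tar file listing every night.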

I think it’s quite good, and it might come in handy as a base for someone else’s script.

#!/bin/bash

# build a date stamp like 21Feb2007 for the backup filenames
THEDATE=$(date +%d%b%Y)

########################### CONFIGURATION START ##################################

KEEPORIG=6			# how many backups to keep in original dir?
KEEPSECOND=30			# how many backups to keep in second dir?

KEEPSQLORIG=15			# how many MySQL dumps to keep in original dir?
KEEPSQLSECOND=50		# how many MySQL dumps to keep in second dir?

ORIGDIR="/root/Backups/apache"	# original directory to put backups
SECONDDIR="/secondplace/"	# secondary directory to copy backups
APACHEDIR="/var/www/html"	# apache docroot
VHOSTSDIR="/var/www/vhosts"	# apache virtual hosts directory

MYSQLUSER=root			# MySQL user for dump
MYSQLPASS=password		# MySQL password for dump

########################### CONFIGURATION DONE ###################################

cd "${ORIGDIR}" || exit 1 # chdir to the backup dir, bail out if it isn't there
tar -cjvf "./Webdev_${THEDATE}.tar.bz2" "${APACHEDIR}" # tar up apache docroot
tar -cjvf "./Vhosts_${THEDATE}.tar.bz2" "${VHOSTSDIR}" # tar up vhosts dir

mysqldump -u ${MYSQLUSER} -p${MYSQLPASS} -A > "./MySQL_${THEDATE}.sql" # dump db
bzip2 "./MySQL_${THEDATE}.sql" # compress db

cp -v "./Webdev_${THEDATE}.tar.bz2" ${SECONDDIR} # copy
cp -v "./Vhosts_${THEDATE}.tar.bz2" ${SECONDDIR} # copy
cp -v "./MySQL_${THEDATE}.sql.bz2" ${SECONDDIR} # copy

# prune .tar.bz2 in original folder
if [ `ls -1 "${ORIGDIR}" | grep .tar.bz2 | wc -l` -gt $KEEPORIG ]; then
   i=1
   for each in `ls -1t "${ORIGDIR}" | grep .tar.bz2`; do
       if [ $i -gt $KEEPORIG ]; then
           echo Removing "${ORIGDIR}/${each}"
           rm -fv -- "${ORIGDIR}/${each}"
       fi
       let "i = i + 1"
   done
fi

# prune .tar.bz2 in second folder
if [ `ls -1 "${SECONDDIR}" | grep '\.tar\.bz2$' | wc -l` -gt $KEEPSECOND ]; then
   i=1
   for each in `ls -1t "${SECONDDIR}" | grep '\.tar\.bz2$'`; do # only touch the tarballs, not the SQL dumps
       if [ $i -gt $KEEPSECOND ]; then
           echo Removing "${SECONDDIR}/${each}"
           rm -fv -- "${SECONDDIR}/${each}"
       fi
       let "i = i + 1"
   done
fi

# prune db dumps in original folder
if [ `ls -1 "${ORIGDIR}" | grep .sql.bz2 | wc -l` -gt $KEEPSQLORIG ]; then
   i=1
   for each in `ls -1t "${ORIGDIR}" | grep .sql.bz2`; do
       if [ $i -gt $KEEPSQLORIG ]; then
           echo Removing "${ORIGDIR}/${each}"
           rm -fv -- "${ORIGDIR}/${each}"
       fi
       let "i = i + 1"
   done
fi

# prune db dumps in second folder
if [ `ls -1 "${SECONDDIR}" | grep .sql.bz2 | wc -l` -gt $KEEPSQLSECOND ]; then
   i=1
   for each in `ls -1t "${SECONDDIR}" | grep .sql.bz2`; do
       if [ $i -gt $KEEPSQLSECOND ]; then
           echo Removing "${SECONDDIR}/${each}"
           rm -fv -- "${SECONDDIR}/${each}"
       fi
       let "i = i + 1"
   done
fi

You can also download it here, since I don’t know how reliable the copy/paste will be.


