
Backing up Files using crontab

by Michael Greifenkamp (June 3rd, 2004)

Okay, now that we have figured out how to connect between servers without having to enter a password each time, it is time to write a script to automate backing up our important directories.
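
Before going any further, it is worth a ten-second test to confirm that the passwordless login really works; copying a throwaway file over should not prompt for a password (192.168.1.12 is the backup server I use throughout this article):

$ touch testfile
$ scp testfile 192.168.1.12:.
$ rm testfile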

For purposes of this demonstration, the "important directories" that I need to back up will be an entire website, located at /web/mywebsite, and a set of database tables, located at /usr/local/mysql/data/mydatabase. I am going to make a tape archive (.tar) of each of those directories and copy that archive to a remote computer for safekeeping. In fact, I would like to make a backup every day and keep the last seven days' worth. That way, if I leave to go fishing on Thursday, and the database gets screwed up Friday, and I get back to work Monday, I can go all the way back to the Wednesday or Thursday night backup and restore from there. Depending on how worried you are about losing stuff, and how much hard drive space you feel like taking up, this strategy can easily be modified to keep more than a week's worth of backups.

In the interest of simplicity, as well as security, I created a new user on both the production machine and the machine that will receive the backup archives. The most important thing to note about this setup is that the new user must have at least group read permission for every file that is going to be backed up. For the databases, that was easy: I made my new user a member of the mysql group, then changed the permissions so that the group has read and write on the data directory and read access to everything contained therein.
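
Adding the user to the mysql group can be done with gpasswd, if your system has it (you could also edit /etc/group by hand; callimachus is the backup user I introduce below):

# gpasswd -a callimachus mysql

As for the permission changes themselves: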

# su -l
(enter password)
# cd /usr/local/mysql
# chmod 770 data
# cd data
# chmod 750 mydatabase
# cd mydatabase
# chmod 660 *
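
If those chmods do not seem to do the trick, check that the files really are group-owned by mysql (they normally are, since the MySQL server runs as that user); you should see mysql in the group column here:

# ls -l /usr/local/mysql/data/mydatabase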

For the website directory, I already had a group named "web" that contains everyone who works on, or adds to, our website. And since all of the web files have to be readable by everyone anyway, so that they can be viewed in a browser, read access is almost a moot point. Just make sure that the new user has write permission in the main web directory (in my case, /web).

The write access in both /usr/local/mysql/data and /web is necessary so that the actual .tar file can be written. Even if the archiving user can read every file inside, the .tar file cannot be saved without write permission in the parent directory.
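
Incidentally, if you would rather not hand out write access in those parent directories, GNU tar's -C option lets the script write the archive to the backup user's home directory instead (just a variation; it is not what I did):

tar --create --preserve --file=/home/callimachus/mydatabase_monday.tar -C /usr/local/mysql/data mydatabase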

Now it is time to create the backup script. I created it in the home directory of my backup user, callimachus.

# exit (to get out of superuser mode)
$ cd /home/callimachus
$ vi mondaybackup

And here's what I entered into that file:

#!/bin/sh
cd /usr/local/mysql/data
tar --create --preserve --file=mydatabase_monday.tar mydatabase
scp mydatabase_monday.tar 192.168.1.12:.
rm mydatabase_monday.tar
cd /web
tar --create --preserve --file=mywebsite_monday.tar mywebsite
scp mywebsite_monday.tar 192.168.1.12:.
rm mywebsite_monday.tar

Here is what this script does, in detail. First, it goes into the mysql data directory, makes a .tar image of the entire mydatabase directory, and saves it as mydatabase_monday.tar. Then it secure copies (scp) that .tar file over to the backup server, and finally cleans up by deleting the tar file from the production server. It then follows the same steps to back up the website directory as well.
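
Since cron is going to execute this file directly, it needs to be executable (mode 700 also keeps it private to the backup user):

$ chmod 700 mondaybackup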

For multiple backups, make a script like this for each day of the week: mondaybackup, tuesdaybackup, and so on. I simply copied mondaybackup to my OS X machine, did a find/replace with BBEdit Light for the day of the week, saved each file, and put all seven back on the server (find monday, replace with tuesday, save as tuesdaybackup; find tuesday, replace with wednesday, save as wednesdaybackup; and so on).
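
If you do not have BBEdit handy, the same find and replace can be done right on the server with a short shell loop (a quick sketch using sed):

$ cd /home/callimachus
$ for day in sunday tuesday wednesday thursday friday saturday
> do
>   sed "s/monday/$day/g" mondaybackup > ${day}backup
> done
$ chmod 700 *backup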

The final step is to create a cronjob that will execute the correct script each day. I had no other cronjobs scheduled for this user, so I had to create one. I want the correct backup to run each day at 1:15 in the morning. Here is what I did to create that cronjob.

$ crontab -e
(vi editor opens)
15 1 * * 0 /home/callimachus/sundaybackup
15 1 * * 1 /home/callimachus/mondaybackup
15 1 * * 2 /home/callimachus/tuesdaybackup
15 1 * * 3 /home/callimachus/wednesdaybackup
15 1 * * 4 /home/callimachus/thursdaybackup
15 1 * * 5 /home/callimachus/fridaybackup
15 1 * * 6 /home/callimachus/saturdaybackup

(save the file and quit vi)

So, every Sunday (the 0), no matter what month (*), or what day of the month it is (*), at 1:15 a.m. (the 15 is the minutes and the 1 is the hour, of course), cron executes the sundaybackup file.
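
As an aside, you could probably get away with a single script and a single crontab line by having the script ask date for the day of the week (a sketch of the idea; I stuck with the seven separate scripts myself):

15 1 * * * /home/callimachus/dailybackup

where dailybackup figures out the day name on its own:

#!/bin/sh
day=`date +%A | tr A-Z a-z`
cd /usr/local/mysql/data
tar --create --preserve --file=mydatabase_$day.tar mydatabase
scp mydatabase_$day.tar 192.168.1.12:.
rm mydatabase_$day.tar
cd /web
tar --create --preserve --file=mywebsite_$day.tar mywebsite
scp mywebsite_$day.tar 192.168.1.12:.
rm mywebsite_$day.tar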

The beauty of the scp part is that it simply overwrites the old file from seven days ago as it copies the new one over (so you do not have to delete last Sunday's archive before writing the current one).
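
And when the day comes that you actually need one of those archives, restoring is just tar in the other direction; on the backup server (or after copying the .tar file back to the production machine), extracting it gives you a fresh copy of the directory:

$ tar --extract --file=mydatabase_wednesday.tar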

If you have any questions, feel free to email me, of course. Or if you think I am a bonehead for doing this the way that I did, feel free to let me know that too. :)

--Michael