Easy Automated Snapshot-Style Backups with Rsync

Have rsync, have RAID volume, have ssh connection to server. What’s the best way to back it up? Here’s one way. Use hard links to make “duplicate” archives of remote content with the minimum of wasted space.
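The trick is that hard links are nearly free: two directory entries share a single inode, so a file that hasn't changed between snapshots is stored on disk only once. A quick throwaway illustration (any scratch directory will do):

echo hello > original
cp -l original linked    # a hard link, not a second copy of the data
ls -li original linked   # same inode number, and a link count of 2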
I’ve adapted the idea with the following script:

for t in `ls remote/`                # one directory per remote host
do
    for i in `ls remote/$t/`         # one directory per backed-up path
    do
        older=7
        rm -fr remote/$t/$i/7        # discard the oldest snapshot
        for n in `seq 6 -1 1`        # count down from 6 to 1
        do
            if [ ! -d remote/$t/$i/$n ]; then
                mkdir -vp remote/$t/$i/$n
            fi
            mv remote/$t/$i/$n remote/$t/$i/$older
            older=$n
        done
        cp -al remote/$t/$i/0/. remote/$t/$i/1    # hard-link copy of the newest snapshot
    done
done
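Once the rotation has run a few times, each backed-up directory holds numbered snapshots, 0 being the newest and 7 the oldest (the one discarded on the next pass). Something like:

$ ls remote/www/home/
0  1  2  3  4  5  6  7

The rsync commands then refresh snapshot 0 on every run: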

rsync -v -v --progress -az --delete --exclude-from=exclude.txt -e 'ssh' www.example.com:/home/ remote/www/home/0/
rsync -v -v --progress -az --delete --exclude '*logs/*' -e 'ssh' www.example.com:/usr/local/apache/ remote/www/apache/0/
rsync -v -v --progress -az --delete --exclude-from=exclude.txt -e 'ssh' mail.example.com:/home/ remote/mail/home/0/

Using this script you can back up multiple hosts (here, www.example.com and mail.example.com) and multiple directories (/home and /usr/local/apache) on each host.
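To run the whole thing unattended, a crontab entry along these lines would do (the script path and schedule here are only placeholders):

# rotate the snapshots and pull fresh copies at 4am every day
0 4 * * * /usr/local/bin/snapshot-backup.sh >> /var/log/snapshot-backup.log 2>&1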
This script and post will be updated as I refine the backup procedure.
Later… After reading through the linked article above, I see that the author uses a slightly different way of rotating the archive snapshots. Instead of re-using the oldest one, he creates a hard-link copy of the latest snapshot, "0", in the "1" snapshot. When rsync downloads a changed file it writes the new version to a temporary file and renames it over the old one, so the old inode survives, still referenced from "1", while the new file lands in "0". Read the "hard links" section of the above for more on how that works!
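You can watch that happen with a couple of commands (a minimal sketch using made-up file names):

mkdir -p demo/0 demo/1
echo "version 1" > demo/0/file
cp -al demo/0/. demo/1/          # "1" now hard links the same inodes as "0"
ls -i demo/0/file demo/1/file    # same inode number for both

echo "version 2" > newfile
rsync newfile demo/0/file        # rsync writes a temp file and renames it into place
ls -i demo/0/file demo/1/file    # different inodes now
cat demo/1/file                  # still "version 1"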
Here’s a diff of what I changed:

7,12c7,8
<         if [ ! -d remote/$t/$i/7 ]; then
<             mkdir -vp remote/$t/$i/7
<         fi
<     
<         mv remote/$t/$i/7 remote/$t/$i/7.tmp
<         for n in `seq 6 -1 0`
---
>         rm -fr remote/$t/$i/7
>         for n in `seq 6 -1 1`
20,21c16
<         mv remote/$t/$i/7.tmp remote/$t/$i/0
<         cp -al remote/$t/$i/1/. remote/$t/$i/0
---
>         cp -al remote/$t/$i/0/. remote/$t/$i/1

Later Still… I found and installed BackupPC via delicious this morning and I’ve got it working. It uses the same idea of space-saving hard links but also provides a web-based interface to the backup and restore procedures. I think it can be automated, and there are loads of other features I couldn’t possibly hope to duplicate in a timely manner!
Installation is relatively well explained, even down to installing the required Perl modules from CPAN, but configuration is slightly harder. Just make sure to override the defaults in the main config.pl with a per-host config.pl at $backupdir/pc/$host/config.pl. It took some figuring out to discover where that file lives.
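For what it’s worth, the per-host file only needs the settings you want to change; everything else falls through to the defaults. A hedged sketch, since the exact paths and values depend on your install:

# "myhost" and $backupdir are placeholders for your own install
cat > $backupdir/pc/myhost/config.pl <<'EOF'
# per-host overrides only; all other settings come from the main config.pl
$Conf{XferMethod} = 'rsync';
EOF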
It’s already backed up two Windows machines and it’s working on a Linux box. Backup contents can be examined over the web, and because it uses a “pool” mechanism it can find duplicate files, even among different backups! That should save a lot of disk space and network bandwidth as time goes by!