Subversion backup script
The following script backs up a group of SVN repositories. For each repository it creates an SVN dump and compresses the result with bzip2. During cleanup, all but the three most recent dumps per repository are removed automatically (with a daily cron run, that amounts to keeping roughly three days of backups).
#!/bin/sh
set -e
rootdir="/srv/svn"
dump_dir="$rootdir/dump"
repo_dir="$rootdir/repos"
svnadmin=/usr/bin/svnadmin
# Remove all but the three newest dumps for the given repository.
# The ISO-dated file names sort chronologically, so a plain sort
# puts the oldest dumps first.
delete_old_files() {
    name="$1"
    find "$dump_dir" -name "$name-*.dump.bz2" \
        | sort | head -n -3 \
        | xargs --no-run-if-empty rm -f
}
date=`date --iso`
for repo in "$repo_dir"/*; do
    name=`basename "$repo"`
    delete_old_files "$name"
    # Dump the repository and compress the result on the fly.
    $svnadmin dump --deltas --quiet "$repo" | \
        bzip2 --compress --stdout > "$dump_dir/$name-$date.dump.bz2"
done
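To run this unattended, a crontab entry along the following lines will do. The installation path /usr/local/bin/svn-backup.sh is only an assumed location, not something the script mandates.
# Hypothetical crontab entry: run the backup script every night at 02:30.
30 2 * * * /usr/local/bin/svn-backup.sh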
Update with a more resource-friendly script
Update 28 April 2013: The script below is an improvement on the one above. It dumps a repository only if the repository has been updated since the last backup was taken. To find out if a repository has changed since the last backup, it takes the timestamp of the most recent file in the repository and compares it with the timestamp of the most recent SVN dump file. This saves a lot of unnecessary CPU cycles, especially on big repositories.
#!/bin/sh
set -e
rootdir="/srv/svn"
dump_dir="$rootdir/dump"
repo_dir="$rootdir/repository"
svnadmin=/usr/bin/svnadmin
# Remove all but the three newest dumps for the given repository.
delete_old_files() {
    name="$1"
    find "$dump_dir" -name "$name-*.dump.bz2" \
        | sort | head -n -3 \
        | xargs --no-run-if-empty rm -f
}
# Print the mtime (seconds since the epoch) of the newest file under
# the given paths, or nothing at all if no files were found.
newest_timestamp() {
    for path in "$@"; do
        find "$path" -type f -printf "%T@\n"
    done | awk '{ time = $0
            if (time > newest)
                newest = time
        }
        END { if (NR > 0) printf("%d\n", newest) }'
}
date=`date --iso`
for repo in "$repo_dir"/*; do
    name=`basename "$repo"`
    newest_repo_ts=`newest_timestamp "$repo" 2>/dev/null` || true
    newest_backup_ts=`newest_timestamp "$dump_dir/$name"-* 2>/dev/null` || true
    # Dump only if no backup exists yet, or if the repository has
    # changed since the newest backup was written.
    if [ -z "$newest_backup_ts" ] || \
       { [ -n "$newest_repo_ts" ] && [ "$newest_repo_ts" -gt "$newest_backup_ts" ]; }; then
        nice $svnadmin dump --deltas --quiet "$repo" | \
            bzip2 --compress --stdout > "$dump_dir/$name-$date.dump.bz2"
    fi
    delete_old_files "$name"
done
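For completeness: a dump produced by either script can be loaded back into a freshly created repository with svnadmin load. This is a minimal sketch; the repository name myproject and the dump date are placeholders.
# Restore one compressed dump into a new repository (names are examples).
svnadmin create /srv/svn/repos/myproject
bzcat /srv/svn/dump/myproject-2013-04-28.dump.bz2 | \
    svnadmin load --quiet /srv/svn/repos/myproject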