[Techtalk] Backup solutions
Terri Oda
terri at zone12.com
Fri Apr 26 14:15:21 EST 2002
At 09:47 AM 25/04/02 -0400, Davis, Jennifer wrote:
>/home/
>/usr/local/
>/var/spool/
>/var/www/
>/etc/rc.d/
>/usr/src/linux/.config
>(at least for my distro (Slackware 8.0))
I would back up all of /etc and anywhere else config files are
stored. I've found this helpful -- if a box is compromised, I may not want
to copy the files back over, but having them there to do diffs against is
very useful. And although it's not too hard to reconfigure stuff once
you've done it once, if anything goes wrong you'll be kicking yourself for
not having a record of "what worked last time".
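That diff workflow is simple to sketch. (The sshd_config lines and temp
directories below are just stand-ins for a saved copy of /etc and the live
one.)

```shell
#!/bin/sh
# Sketch: compare a saved copy of config files against the live ones.
# The two temp directories stand in for a backup of /etc and the real /etc.
saved=$(mktemp -d); live=$(mktemp -d)
echo "PermitRootLogin no"  > "$saved/sshd_config"
echo "PermitRootLogin yes" > "$live/sshd_config"
# diff -r walks both trees and reports every file that changed
diff -r "$saved" "$live" || echo "configs differ"
rm -rf "$saved" "$live"
```

On a real box this would be `diff -r /backup/etc-copy /etc`, which shows at
a glance what an intruder (or you, at 3am) touched.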
If you have any databases of note on your machine, make sure they're backed
up too. I back up my MySQL databases with mysqldump
(it handles the locking, and it's really easy to recreate a database from
the text file it generates -- I've used these backups to make a duplicate
copy for testing purposes too.)
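A minimal cron-able version of that dump might look like this (the database
name "mydb" and the /backup path are placeholders; credentials normally come
from ~/.my.cnf):

```shell
#!/bin/sh
# Sketch: nightly dump of one MySQL database to a dated SQL file.
# "mydb" and /backup are stand-ins -- adjust to your setup.
db=mydb
outdir=/backup
outfile="$outdir/$db-$(date +%Y-%m-%d).sql"
# --opt locks tables and adds DROP/CREATE statements so the file
# can recreate the database from scratch.
# (Guarded so the sketch is harmless on a box without MySQL.)
if command -v mysqldump >/dev/null && [ -d "$outdir" ]; then
    mysqldump --opt "$db" > "$outfile"
fi
# To rebuild it later, or clone it for testing:
#   mysqladmin create mydb_test && mysql mydb_test < "$outfile"
```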
>What I was thinking of doing was a cron job to put these into a tgz file and
>keep as many of these files as the backup computer will hold. The only
>thing I have not figured out is how do I stop the tar program from following
>the links, ie from my home directory to the /var/www/htdocs directory and
>how to give each tgz file a unique name, like the date.
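On the symlink half of that question: GNU tar does *not* follow symbolic
links by default -- it stores the link itself, not the htdocs tree behind
it -- so plain tar already does what you want. It's the -h (--dereference)
flag that makes it archive the linked files' contents instead. A quick
demonstration with throwaway files:

```shell
#!/bin/sh
# GNU tar stores a symlink as a link by default; -h follows it instead.
tmp=$(mktemp -d) && cd "$tmp"
mkdir real && echo "page" > real/index.html
ln -s real/index.html link.html
tar -czf aslink.tgz  link.html     # archives the symlink itself
tar -czhf follow.tgz link.html     # archives the file it points to
tar -tvzf aslink.tgz               # entry starts with 'l' (a link)
tar -tvzf follow.tgz               # entry starts with '-' (a regular file)
```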
The unique file name with a date is actually pretty simple. I use "date"
something like this:
tarball=$backupdir$1-`date +%Y-%m-%d`.tar
(This line is from a little shell script I have which backs up a bunch of
websites hosted on our server.)
Look up "date" in man to figure out what syntax you want and away you go!
tar -czf /backup/MyBackup-`date +%Y-%m-%d`.tar.gz FilesToBackup
if you want the one-line wonder to put in cron. :)
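For the "keep as many files as the backup computer will hold" part, one
common trick (a sketch -- the 30-day cutoff and the filename pattern are
arbitrary, and `touch -d` here is GNU-specific, used only to fake an old
backup for the demo) is to prune old archives with find after each run:

```shell
#!/bin/sh
# Delete dated backups older than 30 days so the disk doesn't fill up.
# Demonstrated against a temp directory standing in for /backup.
backupdir=$(mktemp -d)
touch -d "40 days ago" "$backupdir/MyBackup-old.tar.gz"   # pretend it's old
touch "$backupdir/MyBackup-new.tar.gz"                    # made just now
# -mtime +30 matches files last modified more than 30 days ago
find "$backupdir" -name 'MyBackup-*.tar.gz' -mtime +30 -exec rm {} \;
ls "$backupdir"     # only the new backup remains
```

In cron you'd point find at the real /backup directory, right after the tar
line.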
Terri