Backup routine for ISPConfig-equipped server

Discussion in 'General' started by IntnsRed, Nov 9, 2005.

  1. IntnsRed

    IntnsRed Member

    Yes, I know this is a basic, sort of silly question. But it may prove useful for some (not to mention me ;) ).

    I'm in the process of working up my shell script to do semi-occasional off-server backups. This is done by tarring various subdirectories, renaming and date/time-stamping them, and then scp'ing them off the server to a backup machine.

    My mindset for this type of backup is that in a worst-case disaster (read: the server is fried with dead hard drives), I can quickly install a new Debian copy with the same software packages and "barebones" configuration as before; then I can untar all of my backed-up tarballs and have a fully working ISPConfig-equipped server again. Since these types of backups are not done daily, I realize there could be some data loss -- the idea is to get a "pretty close" type of restore for a worst-case disaster scenario.
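
    For the restore side I'm picturing something along these lines (untested, obviously; the filenames are just placeholders for whatever the backup script produced):

    Code:
    # after a fresh Debian install with the same package set:
    cd /
    # GNU tar strips the leading "/" when creating the archive, so
    # extracting from / drops everything back into place
    tar -xjpf /pub/backups/etc-051109.tar.bz2
    tar -xjpf /pub/backups/home-www-051109.tar.bz2
    
    # reload a database from its dump
    bzcat /pub/backups/ispconfig-051109.sql.bz2 | mysql -u root -p ispconfig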

    With that in mind, and thinking of ISPConfig and its user/web data, what I'm considering backing up is this (a rough sketch of the commands follows the list):

    • /etc -- recursively thrown into a tarball; this should also get my DNS data, which is critical
    • /home/www
    • Running "mysqldump --add-drop-table" on various databases (including "ispconfig" and "mysql") and bzipping them
    • /root/ispconfig
    • /home/admispconfig
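
    Roughly, I'm picturing the guts of the script as something like this (the MySQL password and filenames are placeholders):

    Code:
    #!/bin/sh
    cd /home/backups
    STAMP=`date +%y%m%d`
    
    # tar up each directory on the list above
    for DIR in /etc /home/www /root/ispconfig /home/admispconfig; do
        NAME=`echo $DIR | sed 's#^/##; s#/#-#g'`
        tar -cjf $NAME-$STAMP.tar.bz2 $DIR
    done
    
    # dump and compress the critical databases
    for DB in ispconfig mysql; do
        mysqldump --user=root --password=secret --add-drop-table $DB > $DB-$STAMP.sql
        bzip2 $DB-$STAMP.sql
    done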

    So, you have the concept of this particular backup technique. Two questions come to mind:

    (1) Will this strategy work with ISPConfig -- is the concept itself flawed?

    (2) Am I forgetting some component or subdirectory? (e.g. Does ISPConfig drop something in a subdir I forgot?)
     
    Last edited: Nov 9, 2005
  2. falko

    falko Super Moderator Howtoforge Staff

    No, this should work.

    You might also want to back up the named directory (usually under /var/lib/named, /var/named, ...).
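
    A single tar line would catch it; adjust the path to wherever your distribution keeps the zone files:

    Code:
    tar -cjf named-`date +%y%m%d`.tar.bz2 /var/lib/named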


    Please also have a look at these howtos:
    http://www.howtoforge.com/linux_rdiff_backup
    http://www.howtoforge.com/howto_linux_systemimager
    http://www.howtoforge.com/dedicated_server_backup_restore_systemimager
     
  3. SleeperZ

    SleeperZ New Member

    I don't think there's any reason why the backups shouldn't work. From my understanding, the information is in a database, which in turn builds the config files etc. So I would back up the database daily. I would also back up the /etc/ directory daily.

    I would back up the /home/ partition daily, and if offsite storage/bandwidth is an issue, only push it offsite every couple of days. Make sure you're doing an offsite backup though -- it's very, very important :)
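
    A couple of crontab lines would take care of the daily runs, e.g. (the script name and backup host are just examples):

    Code:
    # /etc/crontab -- dump the database and tar /etc and /home every night at 3:30
    30 3 * * *    root    /usr/local/bin/backup.sh
    # push the tarballs offsite every third day at 4:15
    15 4 */3 * *  root    scp /home/backups/*bz2 user@backuphost:/pub/backups/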

    A good little program to use in place of scp is rsync -- you can use it to compare your local files against your remote files and transmit only the ones that have changed. The other advantage, of course, is that it can be tunneled through ssh (secure).
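
    For example (-a keeps permissions and timestamps, -z compresses over the wire, and -e ssh gives you the encrypted tunnel; the backup host is a placeholder):

    Code:
    # the trailing slash on the source syncs the directory's contents,
    # not the directory itself
    rsync -avz -e ssh /home/backups/ user@backuphost:/pub/backups/websitebackups/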
     
  4. IntnsRed

    IntnsRed Member

    I've since opted for redundant backup techniques, which is what I usually do.

    I like the ISPConfig stuff since it's easy to tar up and has a sane layout.

    To date/time stamp various databases and subdirectories, all I did was work up a script containing routines similar to the one below for each subdir/database I want to back up:

    Code:
    #!/bin/sh
    cd /home/backups
    
    # Backup Debianhelp.Org's /home/www/web7/web subdirectory
    nice -n 17 tar -cjvvf /home/backups/nexo-www-debianhelp-org_home-www-web7-web-`date +%y%m%d`.tar.bz2 /home/www/web7/web
    
    # Do debianHELP.org's PostNuke database
    nice -n 17 mysqldump --user=web7_u3 --password=secret --add-drop-table web7_db1 > nexo-www-debianhelp-org-`date +%y%m%d`.sql
    nice -n 17 bzip2 nexo-www-debianhelp-org-`date +%y%m%d`.sql
    
    
    I nice the routines just because they're not high priority. I'll use routines similar to the above for each web site, and for various other system subdirectories and databases. As you can see above, all of these are collected into the /home/backups subdirectory.

    The script above is invoked remotely by simply running it via ssh:

    Code:
    ssh user@webserver /home/user/bin/RoutineFileTarballBackup.sh
    then copy the files off the remote server:

    Code:
    scp user@webserver:/home/backups/nexo*bz2 /pub/backups/websitebackups
    and then I clean up /home/backups on the server by simply doing a:

    Code:
    ssh user@webserver rm /home/backups/nexo*bz2
    As described, the script is very simple -- just a series of in-line commands, no error checking, nothing fancy. But since this is one backup technique of three I use, and since I keep files for many days, it doesn't have to be bulletproof.
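
    If you did want a bit of insurance without getting fancy, a set -e near the top would at least stop the script at the first command that fails:

    Code:
    #!/bin/sh
    set -e    # abort the whole run if any command returns non-zero
    cd /home/backups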

    Of course, as described above, you'll have to log in for each ssh/scp call. But by exchanging keys this can easily be made password-prompt-less.
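
    The key exchange is quick (ssh-copy-id ships with OpenSSH; appending ~/.ssh/id_rsa.pub to the remote ~/.ssh/authorized_keys by hand works just as well):

    Code:
    # generate a key pair; an empty passphrase keeps it prompt-less
    ssh-keygen -t rsa
    # install the public key on the server being backed up
    ssh-copy-id user@webserver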
     
    Last edited: May 5, 2006
  5. till

    till Super Moderator Staff Member ISPConfig Developer

    That's correct, but not all of the information is in the database. The passwords are stored only in /etc/shadow for security reasons, so you will have to back up at least /etc/passwd, /etc/shadow and /etc/group too.
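
    A full /etc tarball (as planned above) already includes them, but if you ever want just the user data, one line does it:

    Code:
    tar -cjf etc-users-`date +%y%m%d`.tar.bz2 /etc/passwd /etc/shadow /etc/group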
     
