Transfer Backup via Rsync

Discussion in 'Tips/Tricks/Mods' started by ZeroEnna, Feb 28, 2016.

  1. ZeroEnna

    ZeroEnna Member

    Hello everyone,

    I'm running two servers with Debian 8 and ISPConfig 3. Currently, I use a cron job to back up databases and websites manually. The built-in backup function can copy the backed-up files into the website's backup folder, but what I want is to transfer the files to external storage via rsync (as I do right now).

    Can anyone help me with this? As a starting point, it would be great to know where ISPConfig saves the backups, so I can run an rsync script via a root cron job.

    Thank you very much in advance.

    Kind Regards

    Zero
     
  2. ztk.me

    ztk.me ISPConfig Developer

    The backups are in the domain's "backup" subfolder.
     
  3. florian030

    florian030 ISPConfig Developer

    The backup function stores your backups in the path defined on the server (not the domain's subfolder), e.g. /var/backup.
    You can rsync /var/backup to any destination.
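A minimal root cron job for this could look like the sketch below. The destination host, user, and remote path are placeholders, and /var/backup must match the backup path configured on your server:

```
# /etc/cron.d/ispconfig-offsite (hypothetical file) - mirror the backup dir nightly
30 2 * * * root rsync -az --delete /var/backup/ backupuser@storage.example.com:/srv/ispconfig-mirror/
```

The `--delete` flag keeps the remote mirror identical to the local directory; drop it if you want the remote side to accumulate old backups instead.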
     
  4. ztk.me

    ztk.me ISPConfig Developer

    Indeed, I was wrong. Did I just mix something up, or was I in a different universe? :( :D
     
  5. ZeroEnna

    ZeroEnna Member

    Is there any way to "get them back"?
    I mean, if you have a website that is 5 GB in size and you keep 7 backups, that's 35 GB...
    So, is there any way to transfer them to remote storage and then re-transfer a file if the customer requests it, without having to do anything manually?

    Like this: the server is set to keep ONE backup, and a cron job automatically scp's the file to remote storage. The customer asks for a backup that is 5 days old. The admin transfers the file back to the server, and ISPConfig automatically detects the backup and allows the customer to download it.

    That would be great and enough for my needs.
     
  6. Jesse Norell

    Jesse Norell ISPConfig Developer, Staff Member

    Can you just mount the remote storage at /var/backups?
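One way to do that is an SSHFS mount; a hypothetical /etc/fstab line might look like this (host, user, and paths are placeholders, and the mount point must match the backup path configured in ISPConfig):

```
# /etc/fstab (hypothetical): mount remote storage over SSH at the backup path
backupuser@storage.example.com:/srv/backups  /var/backup  fuse.sshfs  defaults,_netdev,IdentityFile=/root/.ssh/id_rsa  0  0
```

With this approach, ISPConfig writes backups directly to the remote storage, so nothing ever has to be transferred back. NFS would work the same way if the storage box supports it.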
     
  7. DDArt

    DDArt Member

    I use s3cmd to sync to an Amazon S3 bucket, and it is fairly cheap. I run it nightly at midnight or shortly after. Each night I also back up all my MySQL databases, compressed each into its own *.tar.gz, as well as emails, compressed per domain, into /var/backups; afterwards my entire /var/backup(s) gets sent to Amazon. The bill is so low it's little more than the price of a cup of mocha per month for your 30+ GB.
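A nightly cron entry for this setup could look like the following sketch; the bucket name and prefix are placeholders, and s3cmd must be set up beforehand with `s3cmd --configure`:

```
# /etc/cron.d/s3-offsite (hypothetical) - push the backup dir to an S3 bucket nightly
0 0 * * * root s3cmd sync --delete-removed /var/backup/ s3://my-backup-bucket/s1/
```

`--delete-removed` keeps the bucket in step with the local directory; omit it to retain old backups in S3 after they are rotated out locally.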
     
  8. 30uke

    30uke Active Member HowtoForge Supporter

    This might be an old(er) post, but I am struggling with the same issue and decided to use Duplicity and rsync. I am running just one server at the moment. There is SSD storage (150 GB) which hosts sites and mail, and a big disk (2 TB) for additional storage. At home I am running OpenMediaVault as the backup destination.
    My script is not finished yet, but at least it backs up files and databases. I still have to add some more variables (a few items are hard-coded), as well as some checks and mail reports. Although it is not ready yet, I hope it might be helpful.
    Code:
    #!/bin/bash
    ###############################################################
    # Vars
    ###############################################################
    # timestamp
        tstamp=$(date +%Y%m%d_%H%M%S)
    # logging :: dirs
        parentlogdir=/var/log/duplicity
        logdir=/var/log/duplicity/$tstamp
    # Duplicity
        dpdir=file:///mnt/storage/s1.backup
        dpencryption=--no-encryption
    # MySQL
        sqluser=root
        sqlpass=$(cat /root/nova-01/mysql_pass)
        sqldir=/mnt/storage/s1.backup/databases
    # RSync
        rspwfile=/root/nova-01/rsync_pass
    
    ###############################################################
    # Pre Tasks
    ###############################################################
    # Create dirs
        mkdir -p $parentlogdir
        mkdir -p $logdir
        mkdir -p $sqldir
        mkdir -p $sqldir/$tstamp
    # Read databases (skip the MySQL system schemas, per the Stack Overflow answer referenced below)
        DATABASES="$(/usr/bin/mysql --user=$sqluser --password=$sqlpass -Bse 'show databases' | grep -Ev '^(information_schema|performance_schema|mysql|sys)$')"
    
    ###############################################################
    # Run Duplicity
    ###############################################################
      echo R::rsync
        rsync -vrzh --password-file=$rspwfile /mnt/storage/s1.backup [email protected]::vps-data/s1.myserver.com
      echo Duplicity :: etc
        duplicity --log-file $logdir/etc.log $dpencryption --full-if-older-than 1M /etc $dpdir/etc
      echo Duplicity :: var
        duplicity --log-file $logdir/var.log $dpencryption --full-if-older-than 1M /var $dpdir/var
      echo Duplicity :: usr
        duplicity --log-file $logdir/usr.log $dpencryption --full-if-older-than 1M /usr $dpdir/usr
      echo Duplicity :: home
        duplicity --log-file $logdir/home.log $dpencryption --full-if-older-than 1M /home $dpdir/home
      echo Duplicity :: root
        duplicity --log-file $logdir/root.log $dpencryption --full-if-older-than 1M /root $dpdir/root
      echo Duplicity :: boot
        duplicity --log-file $logdir/boot.log $dpencryption --full-if-older-than 1M /boot $dpdir/boot
      echo Duplicity :: bin
        duplicity --log-file $logdir/bin.log $dpencryption --full-if-older-than 1M /bin $dpdir/bin
      echo Duplicity :: sbin
        duplicity --log-file $logdir/sbin.log $dpencryption --full-if-older-than 1M /sbin $dpdir/sbin
      echo Duplicity :: opt
        duplicity --log-file $logdir/opt.log $dpencryption --full-if-older-than 1M /opt $dpdir/opt
      echo Duplicity :: lib
        duplicity --log-file $logdir/lib.log $dpencryption --full-if-older-than 1M /lib $dpdir/lib
      echo Duplicity :: lib64
        duplicity --log-file $logdir/lib64.log $dpencryption --full-if-older-than 1M /lib64 $dpdir/lib64
    
    ###############################################################
    # Run mysqldump
    #
    # ## Thanks to John Coates
    # ## Ref: https://stackoverflow.com/users/332240/john-coates
    # ## Ref: https://stackoverflow.com/questions/2773199/is-there-a-way-to-dump-all-mysql-databases-except-for-system-databases
    ###############################################################
      echo Dump databases
        for db in $DATABASES
          do
            echo ${db}-$tstamp.sql.bz2 is being saved in $sqldir/$tstamp
            /usr/bin/mysqldump --user=$sqluser --password=$sqlpass $db --single-transaction -R | bzip2 -c > $sqldir/$tstamp/${db}-$tstamp.sql.bz2
          done
      unset sqlpass
    ###############################################################
    # Run RSync
    ###############################################################
      echo R::rsync
        rsync -vrzh --password-file=$rspwfile /mnt/storage/s1.backup [email protected]::vps-data/s1.myserver.com
    
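A matching cron entry to run the script above nightly might look like this (the script path is hypothetical; adjust it to wherever you save the script):

```
# /etc/cron.d/duplicity-backup (hypothetical)
30 1 * * * root /root/nova-01/backup.sh >> /var/log/duplicity/cron.log 2>&1
```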
     
