Doesn't ISPConfig automatically back up the webs and databases to a location available via FTP? I can't seem to find it anywhere. If not, is there a way to do this? All I want to back up is the webs and databases daily, and possibly retain the current backups for a week or two. Thanks
No, there is no built-in function you can just turn on. This is what I do on our system:

Code:
#!/bin/sh

### My Vars
Email_To=[email protected]
Email_From=[email protected]
Email_Sub="Backup status"

# I start this shortly after midnight, so I give it yesterday's date.
# This also gives me 01, 02 ... 31, which means created files will be overwritten monthly.
# You can choose Mo, Tu ... instead if you need fewer backups.
yesterday=$(date --date=-1day +%d)

### My Dirs & Files
sqlDump_Dir="/var/www/__all_databases.sql"   # will go into the whole backup in the next step
bakFile="/root/www-Day__$yesterday.tgz"
bakDir="/var/www/"

# OPTION:
# I like keeping a file of the backup steps on the server
# and having it sent in an email
bakStatus=/var/log/myLogs/bakStatus_Day__$yesterday.log

# OPTION:
# I add the time [ $(date +%T) ] to every line,
# which tells me how long each step takes, in order to rethink the script
# when it starts to put too high a load on the server

### My Header
echo "$(date +%T ) _This is your faithful daily BackUp Script_ " > $bakStatus

### SQL BackUp
echo " $(date +%T ) *SQL Backup start* " >> $bakStatus
# A password should be set and a different host may be used,
# so change the following values in the "SQL_Statment" line:
# +++ HOST     = localhost (most likely)
# +++ User     = root (most likely)
# +++ Password = ?????? (no space after the -p!!!)
SQL_Statment="-h localhost -u root -p?????? -c --add-drop-table --add-locks --all --quick --lock-tables --all-databases"
mysqldump $SQL_Statment > $sqlDump_Dir
echo "$(date +%T ) _SQL Backup done_" >> $bakStatus

### WWW BackUp
# this includes all emails if you use IMAP
echo " $(date +%T ) *WWW Backup start -incl. Mails* " >> $bakStatus
tar -pczf $bakFile $bakDir
echo "$(date +%T ) _WWW Backup done_" >> $bakStatus

### Clean up
# removing the SQL dump, as it has been put into the "WWW BackUp" by now
rm $sqlDump_Dir

### Add the LOAD
# just checking how stressful it was
myLoad="*Load*: $(uptime | awk '{ print $(NF-2),$(NF-1),$NF }')"
echo " $myLoad" >> $bakStatus

### Mailing BackUpStatus
# the empty command in front of the pipe just gives mail an empty stdin,
# so it sends right away; the log file itself goes into the body via -q
echo " Mailing BackUpStatus" >> $bakStatus
>&1 | mail -r "$Email_From" -q "$bakStatus" -s "$Email_Sub" "$Email_To"

The emails will look somewhat like this:

Code:
03:10:02 _This is your faithful daily BackUp Script_
03:10:02 *SQL Backup start*
03:10:23 _SQL Backup done_
03:10:23 *WWW Backup start -incl. Mails*
03:15:29 _WWW Backup done_
*Load*: 0.63, 1.05, 0.20
Mailing BackUpStatus

It may not be ideal to save the SQL dump to the web root, but I'm not too concerned. Also, if you have a lot of webs this may not be ideal either, as I'm doing a full web backup every night, but again I'm not too concerned there... so it's all up to you. By the way, from here I use PHP and cURL to move the files to another server in the local network.

Cheers
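(Side note on that last step: the PHP/cURL copy isn't shown above. If you would rather stay in the shell, a single curl call added after the tar step can push the tarball to another machine; this is only a rough sketch, and the host, credentials, and target directory below are made-up placeholders, not values from the script.)

Code:
# hypothetical example: push the nightly tarball to a second server via FTP
# host, user, password and path are placeholders - adjust to your own setup
curl -T "$bakFile" --user backupuser:secret "ftp://192.168.0.2/backups/"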
Hi gjcomputer,

the part:

Code:
# OPTION:
# I like keeping a file of the BackUp steps on the server
# and have it sent in an email
bakStatus=/var/log/myLogs/bakStatus_Day__$yesterday.log

needs adjustment: where it says "myLogs", this is not an existing dir! I just like to have it, in order to keep the logs I generate in a separate folder. So if you like the name "myLogs", just do a

Code:
mkdir /var/log/myLogs

Also, it may be better to remove the "__all_databases.sql" from "/var/www/" by adding one line after the "WWW BackUp" part of the script (I have edited this into the script above):

Code:
### Clean up
# removing the SQL Dump, as it has been put into the "WWW BackUp" by now.
rm $sqlDump_Dir

As for "mail: invalid option -- r": this is due to the mail system. Very often the mail program "nail" is installed and a symlink is added to the system, so that the mail command is actually running nail. If you have "nail" installed, you'll find with

Code:
man nail

DESCRIPTION
    Nail is an intelligent mail processing system, which has a command syntax
    reminiscent of ed(1) with lines replaced by messages. It is based on
    Berkeley Mail 8.1, is intended to provide the functionality of the POSIX
    mailx command, and offers extensions for MIME, IMAP, POP3, SMTP, and S/MIME. ...

    -r address
        Sets the From address. Overrides any from variable specified in
        environment or startup files. Tilde escapes are disabled. The -r
        address options are passed to the mail transfer agent unless SMTP is
        used. This option exists for compatibility only; it is recommended to
        set the from variable directly instead.

So if you have nail, but no symlink to it, simply use:

Code:
>&1 | nail -r "$Email_From" -q "$bakStatus" -s "$Email_Sub" "$Email_To"

Or, if you have to use "mail", just leave out the -r "$Email_From" part, as this is sort of cosmetic, allowing you a sender address different from your system user name... Reading a bit about the whole thing may be a good idea, too.

Cheers
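If you are not sure whether your mail command is really nail behind the scenes, a quick check like the following (assuming which and readlink are available, as on most Linux systems) shows what it points to:

Code:
which mail                  # where the mail command lives
ls -l $(which mail)         # a symlink would show up here, e.g. -> /usr/bin/nail
readlink -f $(which mail)   # resolves the full symlink chain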
I know I am resurrecting an old thread, but I wanted to thank you for this script. I came across it yesterday as I wanted to come up with a script I could use to back up the websites and databases each night. I modified the script to connect to an off-site FTP server each time it runs and send the tarball via FTP. When that finishes, it sends the email confirmation and then deletes the original tarball from the /root directory. Since I am running this via cron at 3 AM, I decided to have it shut down httpd and postfix while it is creating the tarball. On my system this process takes about three minutes, so I don't see that as a problem. It restarts httpd and postfix as soon as the tarball is created. I put the script in the /root directory and created a cron entry pointing to /root/backup.sh

Code:
#!/bin/sh

### My Vars
Email_To=whoever
Email_From=whoever
Email_Sub="Backup status"

# I start this shortly after midnight, so I give it yesterday's date.
# This also gives 01, 02 ... 31, which means created files will be overwritten monthly.
# You can choose Mo, Tu ... instead if you need fewer backups.
yesterday=$(date --date=-1day +%d)

### My Dirs & Files
sqlDump_Dir="/var/www/__all_databases.sql"   # will go into the whole backup in the next step
bakFile="/root/www-Day__$yesterday.tgz"
bakDir="/var/www/"
fileName="www-Day__$yesterday.tgz"           # must match $bakFile; cron normally runs the job
                                             # from root's home dir, so the relative name works

# OPTION:
# I like keeping a file of the backup steps on the server
# and having it sent in an email
bakStatus=/var/log/myLogs/bakStatus_Day__$yesterday.log

# OPTION:
# I add the time [ $(date +%T) ] to every line,
# which tells me how long each step takes, in order to rethink the script
# when it starts to put too high a load on the server

### My Header
echo "$(date +%T ) _This is your faithful daily BackUp Script_ " > $bakStatus

### SQL BackUp
echo " $(date +%T ) *SQL Backup start* " >> $bakStatus
# A password should be set and a different host may be used,
# so change the following values in the "SQL_Statment" line:
# +++ HOST     = localhost
# +++ User     = user
# +++ Password = password
SQL_Statment="-h localhost -u (user) -p(password) -c --add-drop-table --add-locks --all --quick --lock-tables --all-databases"
mysqldump $SQL_Statment > $sqlDump_Dir
echo "$(date +%T ) _SQL Backup done_" >> $bakStatus

### WWW BackUp
# this includes all emails if you use IMAP
echo " $(date +%T ) *WWW Backup start -incl. Mails* " >> $bakStatus
service httpd stop
service postfix stop
tar -pczf $bakFile $bakDir
service httpd start
service postfix start
echo "$(date +%T ) _WWW Backup done_" >> $bakStatus

### Clean up
# removing the SQL dump, as it has been put into the "WWW BackUp" by now
rm $sqlDump_Dir

### FTP to FTP server
# set up the ftp connection and send the .tgz file
echo " $(date +%T ) *FTP start* " >> $bakStatus
HOST='ftp-host'
USER='username'
PASSWD='password'
ftp -nv <<EOF
open $HOST
user $USER $PASSWD
put $fileName
EOF
echo "$(date +%T ) _FTP transmission done_" >> $bakStatus

### Clean up
# removing the .tgz file, as it has been sent to the backup server
rm $fileName

### Add the LOAD
# just checking how stressful it was
myLoad="*Load*: $(uptime | awk '{ print $(NF-2),$(NF-1),$NF }')"
echo " $myLoad" >> $bakStatus

### Mailing BackUpStatus
echo " Mailing BackUpStatus" >> $bakStatus
>&1 | mail -r "$Email_From" -q "$bakStatus" -s "$Email_Sub" "$Email_To"
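In case it helps anyone setting this up: the cron entry itself could look roughly like the line below. This assumes root's crontab (edited with crontab -e) and the 3 AM time mentioned above; adjust to taste.

Code:
# run the backup script every night at 3:00 AM
0 3 * * * /bin/sh /root/backup.sh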
Ok, so I've copied this script and created a file called backup.sh. When I run backup.sh from the prompt I get "command not found". Any ideas?

Chris

Sorry, got it now: sh /root/backup.sh
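For anyone hitting the same thing: a freshly created script is not on the PATH and is not executable by default, so either call it through sh as above, or (assuming the script lives at /root/backup.sh) make it executable and call it with its full path:

Code:
chmod +x /root/backup.sh
/root/backup.sh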