Extending Perfect Server - Debian Squeeze [ISPConfig 3]

Discussion in 'HOWTO-Related Questions' started by 8omas, Mar 20, 2011.

  1. 8omas

    8omas Member HowtoForge Supporter

I see that my tutorial has been posted. Please, whoever follows it MUST be very careful, especially with page 5 (the firewall).

Page 6 is just a suggestion for the ISPConfig developers, not a final solution. The idea is to give the client a full backup of his data IN his own directory. I know this means that his disk space must be bigger, but I think it is worth it.
     
  2. till

    till Super Moderator Staff Member ISPConfig Developer

Personally, I consider installing Webmin on an ISPConfig server very critical, because you must not use Webmin to configure any service that is already configured by ISPConfig (Apache, Postfix, IMAP, POP3, DNS). Otherwise you might destroy the setup, or your changes will conflict with ISPConfig.

The "client" directory is not accessible by the client, and it should not be accessible, as access in ISPConfig is handled at the website level. For that reason, the backups are created per website rather than per client, and they are made available inside the websites so that the client can access them.
     
  3. 8omas

    8omas Member HowtoForge Supporter

I understand. That was my first thought, but I saw duplicate backups in the web folders (in cases of aliases etc.). At that time I didn't check in the SQL query whether the domain was an alias or a 'real' domain. Now I think I can prevent that.

    I will try to follow your advice and change the storing folder. I will post here the results.

As for Webmin, I totally agree with you. I only install it for web-based access in case of problems (e.g. I once managed to lock myself out of SSH. Thank God I had installed Webmin, and I unlocked myself.) I should probably mention this in the tutorial.

    P.S.
Could you move this thread to the right section of the forum? I accidentally put it here, when it should be placed in 'HOWTO-Related Questions'.
     
    Last edited: Mar 24, 2011
  4. 8omas

    8omas Member HowtoForge Supporter

    Why do you say that the 'client' directory is not accessible?

I have a client webxy who can access the '/var/www/clients/clientz/' directory (through SFTP).
I followed the instructions in the manual and in the Perfect Server guide, and the client can download his backup as long as the file permissions are right for him (660, webxy:clientz).

Am I missing something here?

I do not have the jail enabled, as I haven't solved the problem with the Greek characters yet ( http://www.howtoforge.com/forums/showthread.php?t=51047 )
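As a side note, the 660/webxy:clientz scheme mentioned above can be sketched in a throwaway directory. This is illustrative only: the real files live under /var/www/clients/clientz/, and the chown needs root plus the real webxy user and clientz group, so it is shown only as a comment:

```shell
#!/bin/sh
# Sketch of the 660 permission scheme described above, in a temp dir
# instead of the real /var/www/clients/clientz/ folder.
tmp=$(mktemp -d)
touch "$tmp/web-BU.tar.gz"
chmod 660 "$tmp/web-BU.tar.gz"     # rw for owner and group, nothing for others
# chown webxy:clientz "$tmp/web-BU.tar.gz"   # on the live server, as root
stat -c '%a' "$tmp/web-BU.tar.gz"  # prints 660 (GNU stat)
rm -rf "$tmp"
```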
     
    Last edited: Mar 23, 2011
  5. 8omas

    8omas Member HowtoForge Supporter

Now I know what you mean :(
I always use SSH (SFTP) to access my files, and of course (assuming the jail isn't enabled) I can access the whole filesystem.
With FTP, my root dir is the clientz/webxy folder.
    I will change the script accordingly.
    Thanks.
     
    Last edited: Mar 24, 2011
  6. 8omas

    8omas Member HowtoForge Supporter

OK, I did it.
Now the backups (websites and DBs) are stored in the webxy dir.
After a few tests, I see that there are no duplicates (concerning clients with alias domains etc.).
I can also access them through FTP, as you mentioned.

    I updated the tutorial, too.

    Thanks again Till.
     
  7. tazmon95

    tazmon95 New Member

    8omas,

Thanks for this how-to. There are a lot of great things in it. I've been able to get most of it working, but there are a couple of parts where I'm getting error messages I haven't been able to troubleshoot.

    When running /root/scripts/mytail I get:
    and when I try /root/scripts/mybackup.sh I get:
     
  8. 8omas

    8omas Member HowtoForge Supporter

As for mytail, you probably forgot the \ (backslash) at the end of the line.
I updated the tutorial so it is more readable:
    Code:
    #!/bin/bash
    multitail -ci yellow -e "ailed" -n 1000 /var/log/auth.log  \
    -ci red -e "Ban" -n 1000 -I /var/log/fail2ban.log \
-ci red -e "fw" -n 1000 -I /var/log/messages \
    -ci green -e "Unban" -n 1000 -I /var/log/messages \
    -ci blue -e "fail" -n 1000 -I /var/log/syslog
Please replace the word iopen with fw.

In mybackup.sh, there is a mistake:
    Code:
    QRY="use dbispconfig; SELECT web_domain.system_user, web_domain.system_group, \
    web_domain.document_root, web_domain.domain FROM web_domain WHERE  \
    web_domain.type!='alias' AND web_domain.system_user IS NOT NULL AND LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) ;"
    must be written as:
    Code:
    QRY="use dbispconfig; SELECT web_domain.system_user, web_domain.system_group, \
    web_domain.document_root, web_domain.domain FROM web_domain WHERE  \
    web_domain.type!='alias' AND web_domain.system_user IS NOT NULL AND (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) ;"
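The reason the parentheses matter: in SQL, AND binds tighter than OR, so without them the final `IS NULL` term matches rows all on its own, alias domains included. A tiny shell sketch of the same precedence rule (C-style arithmetic, with made-up values standing in for the three conditions):

```shell
#!/bin/sh
# Illustrative only: SQL's AND binds tighter than OR, exactly like && vs ||
# in C-style arithmetic. Let a stand for the alias/user checks, b for
# "redirect_path is short", c for "redirect_path IS NULL".
a=0 b=1 c=1
echo $(( a && b || c ))     # prints 1: without grouping, c alone matches the row
echo $(( a && (b || c) ))   # prints 0: grouping keeps the a-checks mandatory
```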
    I corrected this.
    Thanks
     
    Last edited: Mar 29, 2011
  9. 8omas

    8omas Member HowtoForge Supporter

Sorry, I just saw that this is a copy-paste mistake.
The same mistake appears in all queries.
Just change the NULL) to NULL and you'll be OK.

    Not correct. See the next post
     
    Last edited: Mar 29, 2011
  10. 8omas

    8omas Member HowtoForge Supporter

The mistake is the missing ( before LENGTH -- NOT what I mentioned above. So you must have this:
    Code:
    (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL)
I updated the tutorial again.

    sorry for the inconvenience :(
     
  11. tazmon95

    tazmon95 New Member

    8omas,

    Thanks for the quick updates, they work great now.

There is one behavior of the backup script that I'm not 100% happy with, though. Say I have one client with 4 databases and 10 sites. When the backup script runs, I get the 4 database backup files in each of the 10 site folders, along with the backup files for that site. So instead of 4 backup files for the 4 databases, there are 40.

I think it would work better if the files were stored at the client level, in a backup folder, and not at the site level. I just don't know what to think about the permissions and FTP access to them...

    ~Taz
     
  12. 8omas

    8omas Member HowtoForge Supporter

As I see, there are still duplicates. Unfortunately, I cannot find any solution for this right now, because there is no data in the ISPConfig database that associates databases with sites.

Till, Falko (or any other developer), you made a schema that is website-centric, but the databases are owned by clients and are not associated with websites. This means that the 'system' has no knowledge of which database is used by which website. I know that this gives the client access to all his databases, but shouldn't it be the same for his websites? Doesn't this mean that the DB logic is client-centric and not website-centric?
Am I right?

If I am, would it be possible to have a field (for every database) to optionally (or even mandatorily) associate it with a website?

If the previous suggestion is not possible, would it be logical to create an FTP user for the client with access to /var/www/clients/clientxy through the ISPConfig interface? (From the advanced settings of an FTP user, I see that we can change his path so that he can access all his websites.)
     
    Last edited: Mar 29, 2011
  13. 8omas

    8omas Member HowtoForge Supporter

I am not happy with it either. As you can see, I tried to change this, but it is not possible, since there is no data that associates databases with websites (I hope I am mistaken, but as much as I searched the ISPConfig DB, I couldn't find anything).

My first approach was to store the files in the client directory, but Till suggested not doing this.

My suggestion for now is to improve the script so that it backs up websites to the web folders and databases to the client folders.

This means one of two things:
1) you have to log in twice, with two different FTP accounts (one for the client and one for the website owner, the ftpuser; see my previous post), or
2) you have to log in through SFTP (SSH) and download the files.

In either case I must change the script, and in either case someone has to have access to /var/www/clients/clientXY.

    I will try with a new script in the afternoon.

    Thank you very much for your comments
     
    Last edited: Mar 29, 2011
  14. till

    till Super Moderator Staff Member ISPConfig Developer

Why don't you use the backup function which is already part of ISPConfig? It makes a backup of the websites and places them in a directory which is accessible by the clients when they log in to their site. The databases are not included yet, for the reason that you found out yourself: as databases are not attached to a web, you cannot back them up in a web, since you don't know to which web they belong. This will be changed in a future version, as explained in the dev forum some time ago.

When it comes to database backups, the function to back up and restore databases is built into phpMyAdmin and can be accessed by the user with his phpMyAdmin login, and the regular database backups are done by the system admin anyway.
     
    Last edited: Mar 29, 2011
  15. 8omas

    8omas Member HowtoForge Supporter

Because the backups are not accessible through FTP.

How are they accessible when clients log in? Through the panel?

    We have already talked about this in those threads:
    http://www.howtoforge.com/forums/showthread.php?t=51723
    http://www.howtoforge.com/forums/showthread.php?t=51586

I think that the only viable solution is to create an FTP user with access to all of the client's websites (by changing his path). Why is this bad? The sites and the DBs belong to him, and he is not a reseller. I did a small test, and I think it works.
     
  16. till

    till Super Moderator Staff Member ISPConfig Developer

You might get a lot of support requests from your clients then, because the user can see his sites but uses the wrong UID to access the files in the sites correctly. Or when he uploads files, they belong to the wrong user, and he might get problems with e.g. suPHP or suExec.

The solution for the database backup problem is quite simple, and I explained it already in the dev forum: a website select field will be added to the database settings, and as soon as this has been implemented, the database backups will be part of the ISPConfig backup script.

Regarding accessibility of the files with FTP, this has already been changed in 3.0.3.3.rc1. Also, you could just modify the existing backup scripts to back up to a backup folder in the website instead of a symlink, if that was the problem for you.
     
  17. 8omas

    8omas Member HowtoForge Supporter

If an FTP user has access to /var/www/clients/clientXY, he cannot store (upload) files. But as I tested, he can download them. So...

I could do two things until the new release:
1) Revert the script to the previous version and give one or all FTP accounts of the client access to the client folder (keeping in mind that he cannot upload; I think this has to do with the FTP server). With this he could do whatever he needs and have no problems (I think).
2) Keep the script almost as is. I will remove the DB part and maybe just store all DBs in one site.

Any suggestions?
I tend toward the second one.

One more thing: a lot of clients tend to use the same DB for several sites. The association isn't the best solution, even though a lot of panels use this approach. This looks like the second option above, with the only difference that all the DBs are in one site.

Lastly: is it worth trying to solve this one, or will the new version be released soon?
     
  18. till

    till Super Moderator Staff Member ISPConfig Developer

I do not mean that he can upload files to /var/www/clients/clientXY, but he might be able to upload files to /var/www/clients/clientXY/web3/web/..., which might cause problems later.

I think that 2) might be the better option. But you might want to add a notice that the FTP user that you add must not be used for anything else except downloading the backups, so as not to mess up the sites.

That's the reason why the databases were not assigned to a site yet.

I expect that it will be released in 3.0.4, as there are database changes necessary which we don't do in minor releases in the stable branch. So it will take some time until it gets released.
     
  19. 8omas

    8omas Member HowtoForge Supporter

    Thank you all for your comments.

The solution I followed is the one where all the databases are stored in one site. If a client has more than one database, all the databases will be backed up in his first site (based on webID).

I updated the tutorial, so you can empty your local script file (mybackup.sh) and paste in all the lines from the manual.

If you want, you can just replace the second query with the following (you just lose some comments):

    Code:
    QRY="use dbispconfig; SELECT web_database.database_name , web_database.database_user ,\
     min(web_domain.system_user) as muser, web_domain.system_group, min(web_domain.document_root) as mpath, \
    web_domain.domain FROM web_database, web_domain WHERE web_database.sys_userid=web_domain.sys_userid \
    AND web_database.sys_groupid=web_domain.sys_groupid AND web_domain.type='vhost' \
    AND web_domain.system_user IS NOT NULL AND (LENGTH(web_domain.redirect_path)<5 OR web_domain.redirect_path IS NULL) \
     GROUP BY web_database.database_name , web_database.database_user,  web_domain.system_group;"
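For anyone adapting the script, this is roughly how the output of such a query can be consumed in shell. The tab-separated sample row below is made up for illustration; a real run would pipe the output of `mysql -N -B` instead of the printf:

```shell
#!/bin/sh
# Hedged sketch: read tab-separated rows (as `mysql -N -B` would emit them)
# into named fields matching the SELECT column order. The sample row is fake.
tab=$(printf '\t')
printf 'db1\tc1u1\tweb1\tclient1\t/var/www/clients/client1/web1\texample.com\n' |
while IFS="$tab" read -r dbname dbuser muser sysgroup mpath domain; do
    echo "dumping $dbname into $mpath"
done
```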
Don't worry about the duplicated DBs in the websites' folders; they will be removed at the next run. In your main folder you don't have duplicated DBs.

    Please let me know, if everything is ok now.
     
    Last edited: Mar 29, 2011
  20. 8omas

    8omas Member HowtoForge Supporter

Hmmmm ... finally you had to worry about duplicate files :eek:.
I made some mistakes with rm and the wildcards.

    instead of
    Code:
                    [ -f ${col[2]}/*BU.tar.gz ] && rm ${col[2]}/*BU.tar.gz 
                    [ -f ${col[2]}/*BU.gz ] && rm ${col[2]}/*BU.gz 
    you must have
    Code:
                    for delfile in ${col[2]}/*BU*gz ; 
                    do [ -f $delfile ] && rm $delfile; 
                    done
    The same happens in one more place :
    Code:
                [ -f $SITES/*.$BACK22.gz ] && rm $SITES/*.$BACK22.gz
                [ -f $SITES/*.$BACK22.tar.gz ] && rm $SITES/*.$BACK22.tar.gz
    where instead should be:
    Code:
                    for delfile in $SITES/*$BACK22*gz ; 
                    do [ -f $delfile ] && rm $delfile;  
                    done
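To see why the old one-liner breaks, here is a minimal demonstration in a temp dir (not the real web folders): once the glob matches two files, `test` receives too many arguments and errors out, while the loop form handles any number of matches, including none:

```shell
#!/bin/sh
# Demo of the wildcard bug fixed above, in a throwaway directory.
tmp=$(mktemp -d)
touch "$tmp/a-BU.tar.gz" "$tmp/b-BU.tar.gz"
# With two matches the glob expands to two words, so `[ -f ... ]` gets
# too many arguments and fails instead of detecting the files:
[ -f "$tmp"/*BU.tar.gz ] 2>/dev/null || echo "old test fails with two matches"
# The loop checks and removes each match individually:
for delfile in "$tmp"/*BU*gz; do
    [ -f "$delfile" ] && rm "$delfile"
done
[ -z "$(ls -A "$tmp")" ] && echo "all backups removed"
rm -rf "$tmp"
```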
Please copy all the new code from page 6 of the manual.
I am talking about mybackup.sh.
     
    Last edited: Mar 29, 2011
