Hello, I have a few questions about an ISPConfig multiserver setup... web1 - master server, web2 - secondary server (it has joined the existing installation). 1) I cannot add any sites, mail or FTP accounts etc. from the secondary server. Is this a bug or a feature (and if it's a feature, why is it like this)? 2) The monitoring module doesn't work on the secondary server: I'm not able to view monitor data there, but I can view the secondary server's data from the master server. Is this a bug or a feature (and if it's a feature, why is it like this)? Thanks
Did you set up web1 and web2 as subdomains (aliases) like web1.mydomain.com and web2.mydomain.com, or how did you do that?
Hello Quaxth, I have two servers: web1.example.com and web2.example.com. I installed web1.example.com a while ago and have now added a new server, web2.example.com, to web1.example.com. So now web1.example.com is the primary and web2.example.com is the secondary.
That seems to be OK. What about the ports for Apache and ISPConfig? If web1 has 8080 as its port, then web2 should have a different one, like port 8090, I think. Also, port 8081 is already in use on web1. Check for other ports; you can't have duplicate ports running on the same LAN, as far as I know. And just to mention, I'm new to this system, as well as to Debian and Linux in general. So before you do anything you're not 100% sure about, wait until Monday and ask either Till or Falko.
It has nothing to do with the ISPConfig port. If you look into monitor_core_module.php, you will see that the slave server sends its monitor data to the master server. But when you try to view the monitor data, ISPConfig tries to load the data from the local server (so not from the master).
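To make the asymmetry concrete, here is a minimal sketch (not ISPConfig code; the names `master_db`, `slave_db`, and the functions are illustrative only) of the data flow described above: the slave writes its monitor data to the master's database, while the monitor view always reads from the local database of whatever host it runs on.

```python
# Hedged illustration of the described flow, not the actual ISPConfig logic.

master_db = []  # monitor records stored on web1 (master)
slave_db = []   # monitor records stored on web2 (slave)

def slave_collect_and_send(record):
    # The slave pushes its collected monitor data to the MASTER's database,
    # not to its own local one.
    master_db.append(record)

def interface_view(local_db):
    # The monitor view only ever reads the LOCAL database of the host
    # the interface runs on.
    return list(local_db)

slave_collect_and_send({"server": "web2", "load": 0.4})

print(interface_view(master_db))  # viewed on web1: the slave's data is there
print(interface_view(slave_db))   # viewed on web2: empty, hence no monitor data
```

This is why the secondary server shows nothing: its data exists, but only in the master's database.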
Sorry, maybe I didn't understand it the right way. I don't run a multiserver ISPConfig setup, just a single server. I do run multi-server setups on Windows networks. That said, I can't see what you see in monitor_core_module.php. Reading your last post, you talk about slave, master and local servers. Does that mean that the master is not inside your LAN but on a WAN connection, like in a data center, and the slave is inside your LAN? I'm just trying to understand your environment. Thanks.
I have two servers in the rack: web1.example.com and web2.example.com. That's my environment. It has nothing to do with my installation or my environment; I did some debugging. The slave server's monitor works fine, but it sends the data to the master.
That's correct behaviour, as the master server is the only server in a multiserver environment that runs an interface. The purpose of a multiserver setup is to provide a single login and configuration interface from which you can manage additional servers, or even additional server clusters which mirror other slaves. So there is only one interface server in a multiserver setup, and for that reason the interface has to connect only to the local database. Please see the multiserver setup guides; the setup is described there in detail. If you installed an interface on a slave server, then all changes made in that interface would cause conflicts with the master and destroy the setup.
Okay, but if the master server goes down, then the whole cluster will be down, right? You would not be able to see any monitor data, or add any sites/databases/FTP accounts etc. And my first question is still open: 1) I cannot add any sites, mail or FTP accounts etc. from the secondary server. Is this a bug or a feature (and if it's a feature, why is it like this)?
No, the cluster will not be down. None of the slaves is affected by this, as the master is only the control node; it is not required for running the sites, mail accounts etc. on the slaves. If you want to mirror the master as well, to avoid a single point of failure, see the mirror/cluster guides at ispconfig.org, which cover MySQL mirroring of the master database across two nodes. I answered that above already: 1) Your slave must not run an interface that is connected to the "local" MySQL database, so there is no place where you could have logged in to add a site on the slave. 2) If you installed an interface on the slave without using the special configuration and MySQL master/master replication described in the cluster guide, then this broke your setup.
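For reference, MySQL master/master replication of the kind the cluster guide relies on is typically configured with settings along these lines (a hedged sketch of generic MySQL options, not the guide's exact configuration; server names and file paths are assumptions, and you should follow the ispconfig.org guide for the authoritative steps):

```ini
# /etc/mysql/my.cnf on node 1 (e.g. web1) -- illustrative values only
[mysqld]
server-id                = 1
log-bin                  = mysql-bin
# Interleave auto-increment IDs so the two masters never collide:
auto_increment_increment = 2
auto_increment_offset    = 1

# /etc/mysql/my.cnf on node 2 (e.g. web2) -- illustrative values only
# [mysqld]
# server-id                = 2
# log-bin                  = mysql-bin
# auto_increment_increment = 2
# auto_increment_offset    = 2
```

Each node then replicates from the other, so the ISPConfig master database (and hence the interface) survives the loss of either node.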