I had an issue a few months back where a user of mine could not make use of the WebFTP built into ISPConfig. I have finally tracked down the cause. This user has 3.5 GB of data in his web directory, along with an extensive network of folders and subfolders. I am assuming that the WebFTP application parses through the whole directory tree before displaying anything. I am curious what the breaking point is for this part of the application. Building a huge directory structure can take a lot of time in a web application, and I am not sure increasing the timeout settings in Apache/PHP would help. Please look into this if you get a chance. Sincerely, TeleRiddler
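To illustrate the suspected problem (this is my own sketch, not WebFTP's actual code, and in Python rather than PHP purely for brevity): an "eager" listing that walks the entire tree before rendering touches every entry, while a "lazy" listing only touches the directory the user is actually viewing. The tree sizes here are made up.

```python
import os
import tempfile

# Build a small sample tree: 5 files per directory, 5 subdirectories per
# level, 3 levels deep (156 directories, 780 files in total).
root = tempfile.mkdtemp()

def build(path, depth):
    for i in range(5):
        open(os.path.join(path, f"file{i}.txt"), "w").close()
    if depth:
        for i in range(5):
            sub = os.path.join(path, f"dir{i}")
            os.mkdir(sub)
            build(sub, depth - 1)

build(root, 3)

# Eager: visit every entry in the whole tree before showing anything.
eager = sum(len(dirs) + len(files) for _, dirs, files in os.walk(root))

# Lazy: list only the directory the user requested.
lazy = len(os.listdir(root))

print(eager, lazy)  # 935 10
```

The eager count grows with the whole tree, the lazy count only with one level, which is why a deep 3.5 GB site can blow past a script timeout that a small site never notices.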
Changing PHP Values

Falko,

No dice. I changed my php.ini settings to the following:

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 180 ; Maximum execution time of each script, in seconds
max_input_time = 180     ; Maximum amount of time each script may spend parsing request data
memory_limit = 500M      ; Maximum amount of memory a script may consume (8MB)

Still getting an error. Any other ideas? Sincerely, TR
You might need a lot of memory for 3.5 GB of files... Anyway, I suggest that your users use normal FTP clients like SmartFTP instead of Web-FTP. Web-FTP is a solution for the technically unaware who don't know what FTP is, or how to install and configure an FTP client...
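For what it's worth, the memory cost probably isn't the 3.5 GB of file contents themselves, since a listing only holds metadata, but the entry count adds up. A rough back-of-the-envelope, using entirely assumed figures (the per-entry overhead and file count below are guesses, not measured WebFTP numbers):

```python
# Rough estimate of memory needed to hold a full listing in a script array.
# Both numbers are assumptions for illustration only.
entries = 100_000        # hypothetical file/folder count for a 3.5 GB site
bytes_per_entry = 1_024  # assumed overhead per entry (name, size, perms, mtime)

total_mb = entries * bytes_per_entry / (1024 * 1024)
print(total_mb)  # 97.65625
```

So even under generous assumptions, a full-tree listing could approach the default PHP memory_limit, which would explain why raising it was the first thing to try.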
That is what I have been telling my users. I just have one user who has a large number of files and is not familiar with FTP and SSH. I have taught him how to upload files and change file permissions with an FTP client. I used to work at a startup where we developed a web-based interface for parsing through files. I remember we ran into this same issue during development and found some interesting ways to speed up directory parsing in PHP. I was just wondering if anyone was aware of this issue, and I understand it is not a high priority. Just wanted to bring it to your attention. Thanks, Falko. TR