I have two servers set up, and one server uploads a list of files to the other. This worked for over a year, but last month it started giving me problems. I know exactly what is happening and where, but I don't know how to resolve the issue.

Server 1 is a web hosting server running pure-ftpd. Server 2 is a script server which connects to the hosting server and uploads a list of files. The upload command looks like this:

`wput /var/www/games/*.json ftp://ftpuser:thepassword@server_IP`

This works fine and uploads all the scripts to the FTP folder on Server 1. The problem is that `pure-ftpd` hangs and uses 80% of the CPU all the time, and on top of that it constantly tries to transfer one file, and I don't understand why.

The `top` command shows pure-ftpd with exactly which user is causing the spike, and it's the one copying the batch of files. I then checked the `pure-ftpd` logs and they don't say anything specific beyond the transferred-files entries. Next I checked the list of all open files and connections of pure-ftpd, and I found an open connection between `Server 1` and `Server 2`, with one specific file open in that connection:

/var/www/games/game_2031.json

I tried shutting down `Server 2`, but the FTP server still shows an open connection to it. I tried killing the pure-ftpd process, but it just starts again after a few minutes. I went to the folder `/var/www/games/` and deleted all the files; `game_2031.json` reappears after a minute. On top of that, the file should be about 7-10 KB, but every time I refresh the folder it is only 0-1 KB and its creation date changes, even though `Server 2` is shut down.

It looks like some internal pure-ftpd issue and it's stuck on that one file. What should I do?
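For reference, this is roughly how the open files and connections can be checked; just a sketch, using the path from above:

```bash
# list files and sockets currently held open by pure-ftpd processes
lsof -c pure-ftpd

# check what still has the problem file open
lsof /var/www/games/game_2031.json

# show established TCP connections owned by pure-ftpd
ss -tnp | grep pure-ftpd
```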
Killing the wput process should stop the transfer attempts. My guess is that wput retries the transfer until it succeeds or some condition triggers that makes it stop. Find out what is wrong with that one file. Can you copy it to some other directory to prove it can be read? Is the disk full on the target host?

For what it is worth, I would not transfer files with wput. I prefer scp or rsync.
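As a rough sketch, assuming an SSH account exists for that user on Server 1 and the destination directory is the same as in your question, the equivalent upload with rsync would look something like:

```bash
# push the JSON files over SSH instead of FTP;
# re-running it only re-sends files that changed or were cut off
rsync -avz /var/www/games/*.json ftpuser@server_IP:/var/www/games/
```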
Thanks for the reply. But if wput retries the transfer, wouldn't it stop once I shut down that server? Even with the server I copy files from shut down, pure-ftpd keeps idling at high CPU. I tried removing and re-uploading all the files manually and it's still the same. There is enough space on the target server; I have more than 20 GB free.

I tried opening the same file on both servers. On the main server, where the file originates, the file is normal, perfectly well organized and working. On the server with pure-ftpd the file is empty.

I killed the wput command and switched to a different approach using bash. This one should exit ftp and make sure the connection isn't left hanging.

Code:

```bash
#!/bin/bash
# upload all JSON files to the pure-ftpd server, then quit so the connection closes
cd /var/www/games/
ftp -n pureftpserver.tld <<EOF
prompt
user username password
mput *.json
quit
EOF
```

I just checked the pure-ftpd transfer.log again and it's filled with that one specific file being uploaded over and over; the log is filling up fast, roughly one new upload every second. The strange thing is that the response is 200, which means the file uploaded successfully.

Edit: Is there maybe some scheduling in pure-ftpd, where it has already built up a list of files to transfer and is moving them from a temp folder or something? I refreshed the directory listing a few times and it shows some .pureftpdsomethingbinarynumbers file. Might it have queued the files, the way a mail server does?
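In case it matters, this is the kind of check I'm running on those hidden files and the repeating log entries (the transfer.log path below is a placeholder, not my actual location):

```bash
# list any hidden pure-ftpd temp files with their sizes and timestamps
ls -la /var/www/games/ | grep pureftpd

# see whether a process still holds anything in that directory open
lsof +D /var/www/games/

# watch the repeating uploads live (adjust the log path to your setup)
tail -f /path/to/pure-ftpd/transfer.log | grep game_2031.json
```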
Is the name of that problem file unusual in any way? Does it contain a space, non-ASCII, or other special characters?
No, it's exactly as I posted it: game_2031.json, and inside it's just JSON data, no special characters or anything. The file works fine when I copy it from the source server to my PC. I don't know what else to do except leave it alone for a while without copying files, and see whether pure-ftpd queued the files in some temp folder and will eventually clean up the list.
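If it helps, this is the kind of comparison I mean when I say the file is fine on one side and empty on the other (run separately on each server):

```bash
# on Server 2 (source): hash and size of the original file
md5sum /var/www/games/game_2031.json
stat -c '%s bytes, modified %y' /var/www/games/game_2031.json

# on Server 1 (pure-ftpd): same commands; here the file shows up as 0-1 KB
md5sum /var/www/games/game_2031.json
stat -c '%s bytes, modified %y' /var/www/games/game_2031.json
```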