Weird FTP Issue

Discussion in 'Server Operation' started by BobGeorge, Aug 22, 2017.

  1. BobGeorge

    BobGeorge Member HowtoForge Supporter

    When I try to FTP files to the server - using FileZilla - everything works, except that file transfers never complete.

    I can log in - with SSL authentication - and I get a directory listing. I can create directories, delete files and all the rest of it just fine. Downloading from the FTP server also works 100%.

    I can also even upload files. Sort of. It starts the transfer and the progress bar goes to 100%. But then it just sits there, never completing the transfer. Eventually, the timeout will expire and it'll reconnect to try again. Then it'll do the same thing. Reach 100% and just sit there until it times out.

    The thing is, if I abort this and then refresh the directory listing, the file is there. Either it's 100% there - so the transfer did complete, but the server is failing to tell the client so it can end the transfer properly - or it's mostly there, about 1KB or so short of 100% when I look at the file size.

    Sometimes, when I abort and refresh, the file's there but its name is ".pureftpd-uploadXXXX" where the "XXXX" is some random code. Usually, when this happens, the file is 100% there, but it's got the wrong name and the transfer never properly completes.

    Now, a slight complication in all this is that I'm using an LVS load balancer in front of the FTP server. But I'm not load balancing the FTP, just passing through packets marked with a firewall mark of "21" to the single FTP server I have in my cluster. And I'm using LVS-DR, so all it does is rewrite the MAC address on the packet, otherwise it's untouched and just routed on to the real server.
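    For reference, the LVS side is roughly this (a minimal sketch - the real-server IP and persistence timeout here are examples, not my actual values):

    ```shell
    # Fwmark-based virtual service: anything carrying fwmark 21 goes to the
    # single FTP real server. Persistence keeps control and data connections
    # on the same real server.
    ipvsadm -A -f 21 -s rr -p 3600
    # -g = gatewaying (LVS-DR): only the destination MAC is rewritten,
    # the packet is otherwise untouched and routed on to the real server.
    ipvsadm -a -f 21 -r 10.0.0.10 -g
    ```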

    I've set up iptables to mark anything coming into the cluster on ports 20, 21 and the passive port range with the firewall mark "21", and I have persistent connections turned on. Pure-FTPd's configuration has this same passive port range and a "ForcePassiveIP" set to the external IP address of the cluster. This works, as I can see that the numbers in the "PASV" response all obey it (i.e. it is reporting the external IP address, and the port numbers are within my passive port range). The firewall server is configured to let these packets through.
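    Something like this, anyway - the passive range 30000-30100 and the external IP 203.0.113.10 below are stand-ins for my real values:

    ```shell
    # Mark FTP control, active-data and passive-data traffic with fwmark 21
    # in the mangle table, before routing, so LVS picks it up.
    iptables -t mangle -A PREROUTING -p tcp --dport 21 -j MARK --set-mark 21
    iptables -t mangle -A PREROUTING -p tcp --dport 20 -j MARK --set-mark 21
    iptables -t mangle -A PREROUTING -p tcp --dport 30000:30100 -j MARK --set-mark 21

    # And the matching Pure-FTPd side (pure-ftpd.conf directives):
    #   PassivePortRange  30000 30100
    #   ForcePassiveIP    203.0.113.10   # the cluster's external IP
    ```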

    And, well, most of this must largely be correct, as everything except the transfers completing works. The file transfers are even making it to the FTP server, as I can abort, refresh and check that the contents are all (or mostly) there and correct. But it just won't properly complete.

    Does anyone have any clue what could be causing this annoying glitch?
  2. BobGeorge

    BobGeorge Member HowtoForge Supporter

    Interestingly, trying to diagnose the problem via the command line FTP tool, I found I couldn't because it works just fine there.

    A shame that I can't realistically recommend that my clients use command-line FTP.

    I also tried it on my Android phone with some random FTP client I found on the store, but that behaves like FileZilla: it all works up until the point where the file transfer should complete, but it doesn't. Eventually, a "java.timeoutexception" or something appears. So, the same problem. But, nevertheless, I checked the PNG file I uploaded from my phone and it did actually transfer to the server 100%.

    I think the problem is that, when a client uploads a file over FTP, the server treats the client closing the data connection as end-of-file. But I suspect that close isn't getting through here. Must investigate further.
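    One way to check that theory would be to watch the passive data ports on the real server and see whether the client's FIN (the close) ever arrives at the end of an upload. A rough sketch - interface name and port range are examples:

    ```shell
    # Show only connection-teardown packets (FIN/RST) on the passive data
    # ports. If uploads stall at 100% but no FIN ever appears here, the
    # client's close of the data connection isn't reaching the server.
    tcpdump -ni eth0 'tcp portrange 30000-30100 and (tcp[tcpflags] & (tcp-fin|tcp-rst) != 0)'
    ```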
