FTP problems
Wednesday, 25 January 2012 22:57
Like every web designer, I run into problems now and then that take some effort to solve. The most recent one had to do with my FTP client, FileZilla in this case, being forcibly disconnected from the server whenever I tried to upload or download a larger number of files. I also tried CuteFTP, which I've had for some time, with the same result.

The host checked into it for me and found that my FTP client was creating far too many connections. Now, I had purposely set FileZilla to use only 4 concurrent connections, but the server reported anywhere from 300 to 650! Apparently this is a known problem with FileZilla--I saw far too many forum posts about it. But it was also killing CuteFTP. Limiting the number of simultaneous connections in both programs wasn't helping.
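To make the idea of a client-side connection cap concrete, here is a minimal sketch of how a transfer queue can bound its concurrency with a semaphore. The file names and the `do_transfer` body are hypothetical stand-ins, not anything from FileZilla or CuteFTP; a real client would open one FTP data connection per worker slot.

```python
# Sketch: cap the number of simultaneous transfers client-side.
# do_transfer() is a hypothetical stand-in for a real upload/download.
import threading
import time

MAX_CONNECTIONS = 4
slots = threading.BoundedSemaphore(MAX_CONNECTIONS)

peak = 0      # highest number of transfers observed running at once
active = 0
lock = threading.Lock()

def do_transfer(name):
    global peak, active
    with slots:                      # blocks while 4 transfers are in flight
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.01)             # stand-in for the actual transfer
        with lock:
            active -= 1

threads = [threading.Thread(target=do_transfer, args=(f"file{i}",))
           for i in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("peak concurrent transfers:", peak)
```

Note that this only caps connections the *client* knows about -- as the quoted explanation below makes clear, stalled half-open connections on the server side are invisible to this kind of limit, which is why capping concurrency in FileZilla and CuteFTP didn't fix things.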

I found a very succinct explanation of why this happens in one of the forums, and I'm reproducing it here. Thank you to "boco" on this page.

"The Too many connections problem is not easy to understand for the normal user; here is my attempt to explain:

-A connection is not as simple as it sounds; there are always (with FTP at least) two endpoints. So you need to know that the error the server shows does NOT concern the connections at your side, but at the server's.
-A fully established connection is called ''open'' and you can transfer data over it. A not yet established or broken connection may be ''half-open'', meaning it exists at one endpoint but not the other. The arbitrary server limitation takes both the open and half-open connections into account.
-As far as connection closure is concerned, the normal TCP procedure is for both endpoints/peers to notify each other and wait for a response from the peer. Then both peers close the connection. Exactly here is the weak point.
Most people imagine such a connection as a straight line. However, nowadays the endpoints are very rarely directly connected. There are firewalls and routers in between forwarding and intercepting the connection.
-What you and others in this topic experience is that the connection closure notification (or the response to it) gets lost. Or a firewall closes the connection itself without notifying the endpoints properly. In both cases this causes the connection to stay half-open at either side.
If that happens at the client's (FileZilla) side, users experience a connection timeout after a file reaches 100% (or 99% because of buffering).
But the user doesn't notice if that problem is at the server side; FileZilla won't display any error because all is fine client-wise. However, that half-open connection at the server side still exists and counts against your limit! Note that it will timeout after some time, but if you manage to accumulate enough stalled server-side connections to saturate your limit, you're effectively locked out. Every new connection attempt will give that confusing error.

So why does FileZilla trigger that problem more often than other clients? Answer: speed. FileZilla is greatly optimized for speed, so it's much easier to reach the critical amount of stalled connections. And the faster an application transfers data over the network, the faster routers and firewalls have to work. Some will simply give up at some point and start to produce errors. FileZilla has already managed to lock up quite a few routers."
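The cooperative closure boco describes -- one side says "I'm done," the other acknowledges, and both close -- can be sketched with plain sockets on localhost. This is a generic TCP demonstration, not FTP itself:

```python
# Sketch of the cooperative TCP close described above, on localhost.
# One peer signals "no more data" (a FIN, via shutdown); the other sees
# EOF and closes in response. If that FIN or its acknowledgement is lost
# in transit, the connection is left half-open at one end.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

events = []

def peer():
    conn, _ = server.accept()
    data = conn.recv(1024)           # b"" means the client sent FIN (EOF)
    while data:
        data = conn.recv(1024)
    events.append("server saw EOF")  # the closure notification arrived
    conn.close()                     # server closes its end in response

t = threading.Thread(target=peer)
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.shutdown(socket.SHUT_WR)      # send FIN: "I'm done sending"
t.join()
client.close()                       # client fully closes its end
server.close()

print(events)                        # ['server saw EOF']
```

On localhost the notification always arrives; the failure mode in the quote is a router or firewall in the middle dropping that FIN or its response, so one peer never reaches its `close()` and the connection sits half-open, counting against the server's limit.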

I also found a suggestion from another user regarding Core FTP, which I decided to install and try to see if it would help. And it has so far. Not that the 421 (too many connections) error has gone away completely, but it's a whole lot better than it was.
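Since boco notes that stalled server-side connections do time out eventually, one client-side mitigation when scripting transfers is to back off and retry on a 421. A minimal sketch using Python's `ftplib` exception types -- the `flaky_upload` function is a hypothetical stand-in for a real call such as `FTP.storbinary`:

```python
# Sketch: retry an operation when the server rejects it with a temporary
# 4xx reply such as "421 Too many connections". The operation here is
# simulated; a real one would call ftplib.FTP methods against a server.
import ftplib
import time

def with_retry(operation, attempts=3, delay=0.01):
    """Run operation(), retrying on temporary (4xx) FTP errors."""
    for attempt in range(attempts):
        try:
            return operation()
        except ftplib.error_temp:
            if attempt == attempts - 1:
                raise                # out of attempts: re-raise the 421
            time.sleep(delay)        # let stalled server-side slots expire

calls = {"n": 0}

def flaky_upload():                  # fails twice with 421, then succeeds
    calls["n"] += 1
    if calls["n"] < 3:
        raise ftplib.error_temp("421 Too many connections")
    return "ok"

result = with_retry(flaky_upload)
print(result, "after", calls["n"], "attempts")
```

A real delay would be seconds or minutes rather than milliseconds, since it has to outlast the server's half-open connection timeout.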