
Rescuing a PC in trouble
Monday, 14 January 2013 18:51
Great article/tutorial on how to use a USB drive to rescue a PC in trouble:
http://www.pcworld.com/article/2021326/turn-your-flash-drive-into-a-portable-pc-survival-kit.html
 
FTP problems
Wednesday, 25 January 2012 22:57
Like every web designer, I run into problems now and then that take some effort to solve. The most recent problem had to do with my FTP client, FileZilla in this case, being forcibly disconnected from the server whenever I tried to upload or download larger numbers of files. I also tried using CuteFTP, which I've had for some time, with the same result.

The host checked into it for me and found that my FTP client was creating way too many connections. Now, I had purposely set FileZilla to use only 4 concurrent connections, but the server stated that I had anywhere from 300 to 650 connections! Apparently this is a known problem with FileZilla--I saw way too many forum posts about it. But it was also killing CuteFTP. Limiting the number of simultaneous connections in both programs wasn't helping.
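For anyone scripting transfers rather than using a GUI client, here is a minimal sketch of the same idea in Python's ftplib--a hard cap of 4 simultaneous connections. The host name, login, and file list are placeholders, not anything from my actual setup:

# Minimal sketch: never more than 4 FTP connections at once,
# mirroring the client setting described above.
# ftp.example.com, USER, PASS, and the file list are placeholders.
from ftplib import FTP
from concurrent.futures import ThreadPoolExecutor

MAX_CONNECTIONS = 4  # the same cap I set in the client

def upload(path):
    # Each worker opens one connection, uses it, and closes it politely.
    ftp = FTP("ftp.example.com")
    try:
        ftp.login("USER", "PASS")
        with open(path, "rb") as f:
            ftp.storbinary("STOR " + path, f)
    finally:
        ftp.quit()  # send QUIT so the server can release the slot

files = ["index.html", "style.css", "logo.png"]
with ThreadPoolExecutor(max_workers=MAX_CONNECTIONS) as pool:
    list(pool.map(upload, files))  # at most 4 workers, so at most 4 connections

As it turned out, though, a cap like this doesn't help when already-closed connections linger half-open on the server, which is what the quote below explains.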

I found a very succinct explanation of why this happens in one of the forums, and I am adding it here. Thank you to "boco" on this page.

"The Too many connections problem is not easy to understand for the normal user, here my attempt to explain:

-A connection is not as simple as it sounds; there are always (with FTP at least) two endpoints. So you need to know that the error the server shows does NOT concern the connections at your side, but those at the server's.
-A fully established connection is called "open" and you can transfer data over it. A not-yet-established or broken connection may be "half-open", meaning it exists at one endpoint but not the other. The arbitrary server limitation takes both the open and half-open connections into account.
-As far as connection closure is concerned, the normal TCP procedure is for both endpoints/peers to notify each other and wait for a response from the peer. Then both peers close the connection. Exactly here is the weak point.
Most people imagine such a connection as a straight line. However, nowadays the endpoints are very rarely directly connected. There are firewalls and routers in between, forwarding and intercepting the connection.
-What you and others in this topic experience is that the connection closure notification (or the response to it) gets lost. Or a firewall closes the connection itself without notifying the endpoints properly. In both cases this causes the connection to stay half-open at either side.
If that happens at the client's (FileZilla's) side, users experience a connection timeout after a file reaches 100% (or 99% because of buffering).
But the user doesn't notice if the problem is at the server side; FileZilla won't display any error because all is fine client-wise. However, that half-open connection at the server side still exists and counts against your limit! Note that it will time out after some time, but if you manage to accumulate enough stalled server-side connections to saturate your limit, you're effectively locked out. Every new connection attempt will give that confusing error.

So why does FileZilla trigger that problem more often than other clients? Answer: speed. FileZilla is greatly optimized for speed, so it's much easier to reach the critical amount of stalled connections. And the faster an application transfers data over the network, the faster routers and firewalls have to work. Some will simply give up at some point and start to produce errors. FileZilla has already managed to lock up quite a few routers."
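In practical terms, about all a client can do is close politely and avoid waiting forever on a reply that got lost. Here is a minimal sketch of that client-side half of the closure in Python's ftplib; the host and credentials are again placeholders:

# Minimal sketch of a polite FTP shutdown.
# ftp.example.com and the credentials are placeholders.
from ftplib import FTP, all_errors

ftp = FTP("ftp.example.com", timeout=30)  # don't hang forever on a dead peer
try:
    ftp.login("USER", "PASS")
    ftp.retrlines("LIST")                 # any transfer work goes here
finally:
    try:
        ftp.quit()   # QUIT plus the server's reply: both sides agree it's over
    except all_errors:
        ftp.close()  # the goodbye got lost; this is exactly the case boco
                     # describes, where the server may keep a half-open entry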


I also found a suggestion from another user regarding Core FTP, which I decided to install and try to see if it would help. And so far it has. Not that the 421 (too many connections) error has gone away completely, but it's a whole lot better than it was.
 
Duplicate Content and Google
Wednesday, 30 March 2011 18:25
Duplicate content has long been a concern for webmasters because Google and other search engines penalize sites showing duplicate content. I know it is often a problem when doing SEO on a site: a particular page can be reached via various URLs, and .htaccess must be adjusted to make sure that only one URL is seen.

I recently came across this page, however, under the Google Webmaster Guidelines, wherein Google states that you can specify the preferred version of a URL. If a page can be reached by several different URLs, as in the examples on that page, one can put a <link> element in the head of the page to let Google know which URL is the preferred one, like so:

<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish">

This code is, of course, placed in the <head> section of the webpage in question.
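To check that every variant of a URL actually declares the same canonical, a short script can fetch each one and report what it finds. This is just a sketch using Python's standard library, with Google's example.com URLs standing in as placeholders:

# Minimal sketch: fetch each URL variant and print the canonical it declares.
# The example.com URLs are placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

variants = [
    "http://www.example.com/product.php?item=swedish-fish",
    "http://www.example.com/product.php?item=swedish-fish&sort=price",
]
for url in variants:
    finder = CanonicalFinder()
    finder.feed(urlopen(url).read().decode("utf-8", "replace"))
    print(url, "->", finder.canonical)  # every variant should report the same URL

If the printed canonical differs between variants, at least one page's head section needs fixing.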

Obviously, duplicate content across different pages is still an issue and should be avoided if one wants to keep a site in the good graces of the search engines.
 
A Flying Car?
Wednesday, 30 June 2010 19:57

Looks like the Jetsons are not far off... This little flying car has now cleared a hurdle with the FAA and is expected to go to market as early as next year:

Terrafugia-flying-car-gets-FAA-clearance

This causes some concerns. Even now we can't seem to stop people from texting and driving at the same time. Just imagine how much more damage they can do when airborne. It gives the phrase "we'll drop by" a whole new meaning....

At least for now the cost of one of these will be out of most people's budgets, so we don't have to worry YET. But like all technology, over time the price eventually goes down. It's bad enough dealing with drivers on the road, but the potential is now there to have them cluttering up the air as well, dropping their trash all over your property instead of just alongside the road (bad enough that they drop it there!) and polluting areas that have so far stayed pristine because one couldn't easily drive there. As nice as the technology could be, I sure hope they get some rules into place before they let these things loose.


 
Get IT Out Of The SEO Business--Link to a Good Article
Friday, 21 May 2010 19:40

It needed to be said...

IT departments do a great job with the things they do, but they can't do it all. Nor should they be asked to. It's tough enough these days dealing with security and keeping everything running smoothly. IT has highly trained people, but they are not web designers, and they are most certainly not schooled in SEO (search engine optimization).

That said, here's the article.

 