Watch Out For Hosting Companies That Block Search Engine Spiders

Be careful that your web hosting company doesn't block search engine spiders. Dreamhost, for example, has been caught blocking Googlebot on some web accounts. Dreamhost claims this is for your benefit, and even recommends blocking Googlebot from your site.

Here is a copy of the email:

This email is to inform you that a few of your sites were getting hammered by Google bot. This was causing a heavy load on the webserver, and in turn affecting other customers on your shared server. In order to maintain stability on the webserver, I was forced to block Google bot via the .htaccess file.

order allow,deny
deny from 66.249
allow from all

You also want to consider making your files unsearchable by robots and crawlers, as that usually contributes to a high number of hits. If they hit a dynamic file, like PHP, it can cause high memory usage and consequently high load…
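If crawl load genuinely were the issue, there are gentler fixes than an IP-level block of Googlebot. A minimal robots.txt sketch (the disallowed paths here are hypothetical examples) keeps crawlers out of expensive dynamic endpoints while leaving the rest of the site indexable:

```
# robots.txt - keep crawlers out of heavy dynamic endpoints only,
# instead of blocking the whole bot at the server level.
User-agent: *
Disallow: /search.php
Disallow: /cgi-bin/

# Crawl-delay is honored by Bing and Yandex, but Googlebot ignores it;
# Google's crawl rate is adjusted through Search Console instead.
User-agent: bingbot
Crawl-delay: 10
```

This slows or redirects crawl activity away from the costly pages without sacrificing your rankings on the rest of the site.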

Huh? Block search engine traffic? That's as crazy as shutting your web server off at 5pm when you leave work and restarting it at 8am the next morning. Blocking Google will cost you money.

You need a reliable host. Netpaths will never throttle or block search engine spiders from indexing your content and ranking your websites.
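Before taking a host's word that Googlebot is "hammering" your site, check your access logs yourself. A quick sketch, assuming a standard Apache combined log format (the sample log lines below are fabricated for illustration):

```shell
# Create a tiny sample access log (fabricated lines, combined log format).
cat > access.log <<'EOF'
66.249.66.1 - - [10/May/2015:06:25:24 +0000] "GET /index.php HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
192.0.2.10 - - [10/May/2015:06:26:01 +0000] "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
66.249.66.1 - - [10/May/2015:07:01:13 +0000] "GET /page.php HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
EOF

# Count how many requests actually came from Googlebot.
grep -c "Googlebot" access.log   # prints 2
```

If the number is modest relative to your total traffic, an outright block is overkill; and note that anyone can fake the Googlebot user agent, so genuine Googlebot traffic should be confirmed by reverse DNS on the requesting IP.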

