If you haven’t checked out your Google Webmaster Tools account lately, you may be surprised to see some new additions next time you log in. Google recently updated their webmaster tools and added a new “Settings” area with several new features for SEOs, including geographic targeting, preferred domain, an option for inclusion in Google Image Labeler, and custom crawl rate.
The new geographic targeting feature is pretty straightforward – it allows you to tell Google which geographic areas your website is targeting. It works similarly to the way a .us or .co.uk domain works, and it can be very helpful for those who have a generic top-level domain but only target specific regions.
The preferred domain option allows you to tell Google whether you prefer “www.” or no “www.” in your URLs. Don’t ignore this – it is the root of canonical indexing problems. A classic example of canonical duplicate content is www.site.com/page.html versus site.com/page.html, which Google can treat as two separate pages. My preference is to use “www”. This setting may not matter in many cases, but it is better to make a decision than to leave it up to chance. The more specific you are with your site configuration, the easier time Googlebot will have indexing and ranking your site.
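Beyond setting the preference in Webmaster Tools, you can enforce it at the server level with a 301 (permanent) redirect. Here is a sketch for Apache using mod_rewrite in an .htaccess file – the domain site.com is just the example from above, so substitute your own:

```
# Redirect bare-domain requests to the www version with a 301 redirect,
# so Google only ever sees one canonical URL per page
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
```

Using a 301 rather than a 302 matters here: a permanent redirect tells Google to consolidate the two versions, passing the link value of the non-www URLs to your preferred domain.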
Another feature is the ability to opt into the Google Image Labeler program. This will allow other people to label your images and help Google index them better. This feature is useful for a wide variety of sites to help propagate images across the web and increase overall ranking in Google image search results. I am amazed at how much traffic our clients receive from Google Image search – it is 50% of all traffic on one site! Never underestimate the search engine optimization power of having relevant images on your site.
The last feature is the custom crawl rate. Google will give you a little input into how fast and how long Googlebot crawls your site. If you’re having problems getting “hard to reach” areas of your site indexed, this could be the answer you’ve been looking for. Another nice thing about this feature is that it automatically resets after 90 days. While it may be beneficial to crank up the juice so Googlebot spiders your entire site, you probably don’t want to give up a huge amount of bandwidth over a long period of time.