Parameter Handling Tool Assists With Duplicate Content Issues
19 October 2009, Jonathan Saipe
For a number of years, the issue of duplicate content has been a hot topic for many SEO professionals.
Google has posted regular guidelines for webmasters, ranging from content advice and the use of 301 redirects to domain canonicalisation and, more recently, the rel=canonical link element (now supported by all three major search engines) for dealing with specific duplicate pages.
This month Google has added a parameter handling tool to Webmaster Tools (WMT) – a feature that enables webmasters or developers to specify up to 15 URL parameters for Googlebot to ignore during its crawl.
For example, your preferred URL for a training course might look something like this: http://www.domain.com/training.php?course=SEO-Training
However, you might also be appending additional parameters such as affiliate tracking codes or session IDs, which means your URL could actually look like this: http://www.domain.com/training.php?course=SEO-Training&affiliate=123&sessionid=abc123
Using Google’s parameter handling tool, you can request that the crawler ignore the parameters “affiliate” and “sessionid”. Not only could this improve crawl efficiency (Googlebot tends to skip subsequent instances of the stated parameters), but it should also remove the duplicate content issues caused by those URL variants.
Equally, you can inform Google which parameters not to ignore if those should indeed be included in the crawl.
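The effect is essentially that every URL variant collapses to one canonical form once the ignored parameters are stripped. A minimal sketch in Python of that idea (the parameter names and URLs here are illustrative, not part of any Google API):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters we have asked Googlebot to ignore (illustrative names).
IGNORED_PARAMS = {"affiliate", "sessionid"}

def canonicalise(url: str) -> str:
    """Strip ignored query parameters so duplicate URLs collapse to one form."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://www.domain.com/training.php?course=SEO-Training",
    "http://www.domain.com/training.php?course=SEO-Training&affiliate=123",
    "http://www.domain.com/training.php?course=SEO-Training&sessionid=abc123&affiliate=123",
]
# All three variants reduce to the same canonical URL.
print({canonicalise(u) for u in urls})
```

Run against the three example URLs above, the set contains a single entry – which is exactly the deduplication you are asking Google to perform on your behalf.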
There’s still a recommendation to use rel=canonical on specific pages, but if the problematic parameters appear site-wide, it makes sense to inform Google via the parameter handling tool instead.
Whilst I can see the benefits of this tool – especially for webmasters who don’t have access to URL rewriting – be warned that Google will stop crawling certain URLs, which in turn can prevent link equity from being passed through them.