Parameter Handling Tool Assists With Duplicate Content Issues

19 October 2009, Jonathan Saipe

For a number of years, duplicate content has been a hot topic for many SEO professionals.
Google has regularly posted guidelines for webmasters, ranging from content advice and the use of 301 redirects to domain canonicalisation and, more recently, the implementation of rel=canonical (now supported by the three major search engines) when dealing with specific duplicate pages.
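For readers unfamiliar with it, rel=canonical is simply a &lt;link&gt; element placed in a duplicate page's &lt;head&gt;, pointing search engines at the preferred URL. A minimal sketch of building that element (the course URL is borrowed from the example further down; the values are illustrative only):

```python
# rel=canonical: an HTML <link> element in the <head> of a duplicate page
# that tells search engines which URL is the preferred (canonical) version.
canonical_url = "http://www.domain.com/training.php?course=SEO-Training"
tag = '<link rel="canonical" href="{}" />'.format(canonical_url)
print(tag)
# → <link rel="canonical" href="http://www.domain.com/training.php?course=SEO-Training" />
```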

This month Google added the parameter handling tool to its Webmaster Tools (WMT): a feature that enables webmasters or developers to flag up to 15 URL parameters for Googlebot to ignore during its crawl.

For example, your preferred URL for a training course might look something like this: http://www.domain.com/training.php?course=SEO-Training

However, you might also be appending additional parameters such as affiliate tracking codes or session IDs, which means your URL could actually look like this: http://www.domain.com/training.php?course=SEO-Training&affiliate=123&sessionid=abc123

Using Google's parameter handling tool, you can request that the crawler ignore the parameters "affiliate" and "sessionid". Not only could this improve crawl efficiency (as Googlebot tends to skip subsequent instances of the stated parameters), but it should also remove the duplicate content issues caused by those URLs.
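The normalisation the tool asks Googlebot to perform can be sketched in a few lines. This is not how Google configures it (that happens inside Webmaster Tools); it simply shows the equivalent URL clean-up, using the "affiliate" and "sessionid" parameters from the example above:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters the crawler is asked to ignore (per the article's example).
IGNORED = {"affiliate", "sessionid"}

def canonicalise(url):
    """Drop ignored query parameters, keeping the remaining ones in order."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = "http://www.domain.com/training.php?course=SEO-Training&affiliate=123&sessionid=abc123"
print(canonicalise(url))
# → http://www.domain.com/training.php?course=SEO-Training
```

All the session- and affiliate-tagged variants collapse to one preferred URL, which is exactly the duplicate-content consolidation the tool aims for.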

Equally, you can tell Google which parameters not to ignore, if those should indeed be included in the crawl.
The recommendation to use rel=canonical on specific pages still stands, but if the parameters appear site-wide, it makes sense to inform Google via the parameter handling tool.

Whilst I can see the benefits of this tool, especially for webmasters who don't have access to URL rewriting, be warned that Google will stop crawling certain URLs, which in turn will prevent those URLs from passing link equity.