SEO, Web Development

Should search pages be blocked in robots?

I recently ran into a problem on a site I co-manage: one day, after some extensive crawling, Google Webmaster Tools reported that the site had far too many URLs.

In the list I saw many search result pages, and after some investigation various links directed me to this page, where, at the bottom, it says:

Typically, you should consider blocking dynamic URLs, such as URLs that generate search results,

and then goes on to suggest solutions.

So it seems that this normally grey area is quite black and white nowadays: you should block your search result pages from robots, otherwise Google may penalise you by not crawling your site correctly.
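
As a rough sketch, blocking search results in robots.txt might look like the following. The /search/ path and the ?q= query parameter here are purely illustrative; the actual URLs depend on how your site serves its search results:

User-agent: *
# Hypothetical search results path; adjust to your site's actual URL structure
Disallow: /search/
# Hypothetical search query parameter; Googlebot supports the * wildcard in Disallow rules
Disallow: /*?q=

Note that Disallow stops crawling, which is what the "too many URLs" warning is about; it does not by itself guarantee the pages are removed from the index.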

