SEO, Web Development

Should search result pages be blocked in robots.txt?

I recently ran into a problem on a site I co-manage: one day, after some extensive crawling, Google Webmaster Tools suddenly reported that the site had far too many URLs.

The list contained many search result pages, and after some investigation various links led me to this page, which says, at the bottom:

Typically, you should consider blocking dynamic URLs, such as URLs that generate search results,

and then lists some solutions.

So it seems that this normally grey area is now quite black and white: you should block your search pages in robots.txt, or Google could penalise you by not crawling your site correctly.
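
Concretely, I assume the blocking would look something like the robots.txt sketch below; the /search path and the q query parameter are made up for illustration and would need to match the site's actual URL structure:

    # Keep crawlers out of internal search result pages
    # (assumes results live under /search and use a ?q= parameter)
    User-agent: *
    Disallow: /search
    # The * wildcard is understood by Google and Bing, though it is
    # not part of the original robots.txt standard
    Disallow: /*?q=

As I understand it, Disallow only stops crawling; search URLs that are already indexed can still appear in results until they drop out.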