SEO, Web Development

Should search pages be blocked in robots?

I recently ran into a problem on a site I co-manage: Google Webmaster Tools suddenly stated one day, after some extensive crawling, that the site had far too many URLs.

In the list I saw many search result pages, and after some investigation various links directed me to this page where, at the bottom, it says:

Typically, you should consider blocking dynamic URLs, such as URLs that generate search results,

So it seems that this normally grey area is quite black and white nowadays: you should be blocking your search pages from robots, or Google may penalise you by not crawling your site correctly.
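As a rough illustration of what that blocking might look like, here is a minimal sketch of a robots.txt rule checked with Python's urllib.robotparser. The /search path is purely an assumption about where a site's search results might live; swap in whatever URL pattern your own search actually uses.

```python
from urllib import robotparser

# Hypothetical robots.txt blocking internal search result pages.
# The /search path is an assumption, not taken from any real site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Search result pages are now disallowed for every crawler...
print(parser.can_fetch("*", "https://example.com/search?q=widgets"))  # False
# ...while normal content pages remain crawlable.
print(parser.can_fetch("*", "https://example.com/blog/some-post/"))   # True
```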

SEO, Web Development

SEO Keyword Density

Having seen many “myths” in SEO, I was sceptical about exactly why keyword density should be so important compared with the placement of the keywords.

What do I mean by placement over density? Placement could be classed as putting the keywords in the right places (the meta tags, page title, header tags and the content itself) in a natural manner, whereas chasing density is, taken to its extreme, basically keyword stuffing.

So I decided to research this a little. I soon came across http://www.highervisibility.com/blog/what-is-the-proper-keyword-density-for-seo/, which not only provides a round-table discussion between numerous SEO experts but also a short yet descriptive excerpt from Matt Cutts:

“the first time you mention a word, you know, ‘Hey, that’s pretty interesting. It’s about that word.’ The next time you mention that word, ‘oh, OK. It’s still about that word.’ And once you start to mention it a whole lot, it really doesn’t help that much more. There’s diminishing returns. It’s just an incremental benefit, but it’s really not that large.”

So Matt already states that once you start going crazy with those words and trying to cram them in, the extra mentions become almost useless.

What is even more interesting is that one of the consulted SEO experts actually did some research and found that the optimum keyword density varies between Google and the other search engines:

He used screenshots of charts from gorank.com to determine that Yahoo recommends a keyword density of about 3% while Google seems to like sites that have a 1-2% keyword density (his post includes an example of the chart he used to form this opinion).
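For anyone wondering how these percentages are actually arrived at, keyword density is usually just occurrences of the keyword divided by total words, times 100. Real tools differ in the details (stop words, phrases, stemming), so the following is only a minimal sketch of that common definition, with invented marketing copy as the input:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` as a percentage of all words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Deliberately stuffed example copy (invented for illustration).
copy = ("Our widgets are hand-made widgets. Buy widgets today, because "
        "widgets from us are the best widgets around.")
print(f"{keyword_density(copy, 'widgets'):.1f}%")  # 27.8% -- far beyond the 1-3% discussed above
```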

With this in mind, assuming one single density suits every search engine could be dangerous: what if a search engine thinks you are keyword stuffing? This used to be (and can still be) a common problem, whereby scammers and hackers would stuff keywords to push fake sites to the top. So you can imagine that if a search engine thinks you are keyword stuffing, it may well hand you a penalty.

In fact all of the consulted SEO experts seem to agree that keyword density:

  • Is not a fixed calculable number
  • Is not a big problem to most sites
  • Could be premature optimisation for a page (if you are a programmer you will understand this one)

So with all this in mind I decided to recommend, with evidence, that maybe we should rethink our SEO tactics and not fall into this common myth trap.

SEO, Web Development

SEO: 100 Links per Page?

I recently had an SEO consultant state that, for SEO or the “linkvalue” of a page, Google recommends 100 links per page.

I had never seen the word “linkvalue” before, so I decided to Google it (ironically) and found nothing. After that I Googled the 100-links-per-page claim and soon came across a post by Matt Cutts (http://www.mattcutts.com/blog/how-many-links-per-page/) in which he states:

The original reason we provided that recommendation is that Google used to index only about 100 kilobytes of a page. When we thought about how many links a page might reasonably have and still be under 100K, it seemed about right to recommend 100 links or so.

So the problem was that Googlebot would only index around 100KB of data, potentially truncating anything beyond that point. Of course, even back then this was just advisory (the post dates from 2009), and the figure was essentially Google's attempt to predict a sensible maximum page size.

Matt goes on to explain how Googlebot behaves these days:

Does Google automatically consider a page spam if your page has over 100 links? No, not at all. The “100 links” recommendation is in the “Design and content” guidelines section

So you see, the whole 100-links-per-page thing is really only about design. In reality, as can be seen from this infographic (http://www.nickbilton.com/98/), many of the top 98 sites on the web back then had far more than 100 links per page, and still do.

That being said, Matt does go on to explain:

At any rate, you’re dividing the PageRank of that page between hundreds of links

So, if you are really keen on passing as much of your PageRank as possible to pages on your site that might not be able to hold their own (in which case you should probably question why you want them listed at all), and you have 600-odd links on one page, then you probably want to cut the links down a little. Otherwise, how many links a page carries is purely a design decision for the site.
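To make the dilution point concrete, here is a toy sketch of the simplified model Matt is alluding to: a page's outgoing link equity split evenly between its links. The PageRank “budget” of 1.0 is invented for illustration, and real PageRank involves damping factors, nofollow and plenty more, so treat this as arithmetic only:

```python
def equity_per_link(page_rank: float, outbound_links: int) -> float:
    """Simplified model: a page's passable PageRank split evenly
    between its outbound links (ignoring damping, nofollow, etc.)."""
    return page_rank / outbound_links

# Hypothetical page with a passable PageRank "budget" of 1.0.
for links in (10, 100, 600):
    print(f"{links:>3} links -> {equity_per_link(1.0, links):.4f} passed per link")
# 10 links -> 0.1000, 100 links -> 0.0100, 600 links -> 0.0017 per link
```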