As many of you may have noticed, keyword density is becoming less and less significant when it comes to search engine rankings. The search engines are more intelligent than they once were and don’t need a keyword shoved down their gullet 27 times within 400 words of content to figure out what your page should rank for. If you have a focused, unique Title Tag combined with a focused, unique Meta Description and on-page content, they probably already have a clear idea of how to index your content. Matching (or even exceeding) the keyword count of the top-ranking sites doesn’t mean you’re going to rank, especially if content is all you’re relying on. It doesn’t hurt to look, but it shouldn’t completely control your content.
The entire goal of a search engine is to provide searchers with content that relates to their query AND offers significant topical depth. Repeating a keyword 27 times on a page isn’t going to convince any search engine that your content has that depth. What can convince them is having the right combination of words.
Keyword Diversity vs. Density
This isn’t a new concept, but definitely one that’s starting to get more exposure and discussion. The fundamental idea behind keyword diversity vs. density is that when you naturally write about a topic, you’re going to include a range of terms that all relate to the larger topic being discussed.
For example, if you’re writing a paper/article/report on 2009 Hybrid Cars, you’re naturally going to include terms like “Toyota Prius”, “Ford Fusion”, and “plug-in capabilities” in your content. On the other hand, you probably aren’t going to include the term “2009 Hybrid Cars” 20 times. If you were to read an article that did have the term “2009 Hybrid Cars” 20 times and didn’t include terms like “Toyota Prius”, you’d question the validity and expertise of the article.
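To make the contrast concrete, here’s a minimal sketch (not any search engine’s actual algorithm — the functions, article text, and related-term list are all hypothetical illustrations): one function measures raw keyword density, the other measures how many related terms from the broader topic actually appear in the content.

```python
def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` per 100 words of `text`."""
    words = text.lower().split()
    hits = text.lower().count(phrase.lower())
    return 100.0 * hits / len(words) if words else 0.0


def related_term_coverage(text: str, related_terms: list[str]) -> float:
    """Fraction of a related-term list that actually appears in the text."""
    lowered = text.lower()
    found = [term for term in related_terms if term.lower() in lowered]
    return len(found) / len(related_terms) if related_terms else 0.0


# A toy article that mentions the target phrase once but covers related terms.
article = (
    "The 2009 hybrid cars lineup includes the Toyota Prius and Ford Fusion, "
    "with some models offering plug-in capabilities and improved fuel economy."
)

print(keyword_density(article, "2009 hybrid cars"))
print(related_term_coverage(
    article,
    ["Toyota Prius", "Ford Fusion", "plug-in capabilities", "regenerative braking"],
))
```

The toy article scores low on density for the exact phrase but high on coverage of related terms — which, per the argument above, is closer to how naturally written, authoritative content looks.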
The search engines work the same way, or at least they’re moving in that direction. Their algorithms are getting smarter by leaps and bounds year after year, so it’s no surprise that they can evaluate content on relevance and topical depth rather than on a “whoever uses the keyword most wins” basis. And it’s very refreshing to see. Of course it makes content optimization significantly more difficult, but it helps ensure that the optimization work that is done goes into creating better, more relevant, and more comprehensive content rather than into shoving in a keyword some assumed “correct” number of times.