When Google rolled out the first Penguin update in April of 2012, they also rolled out some on-page classifiers which penalized some pages that had excessive word repetition.
Today the primary use of these types of analysis tools is not to keep dialing up the keyword density, but rather to lower the focus on the core terms while including alternate word forms, acronyms, synonyms & other supporting vocabulary.
High Density: The upside of aggressive repetition (in terms of helping boost rank for the core term) is fairly minimal & high keyword density increases the likelihood that the page may get filtered.
Low Density (with variation): The upside of greater word variation (in terms of helping boost rank for a wide variety of related words) is significant & lower density on the core terms decreases the risk of the page getting filtered.
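As a rough illustration of what these tools measure (a naive sketch, not how any search engine actually scores pages), keyword density is simply occurrences of a phrase divided by the total word count:

```python
import re

def keyword_density(text, phrase):
    """Naive keyword density: fraction of the page's words that belong
    to occurrences of the target phrase. Illustrative only -- engines
    weight many signals beyond raw density."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

page_copy = ("SEO tools help with SEO research. "
             "Good SEO tools also flag keyword stuffing.")
print(round(keyword_density(page_copy, "seo tools"), 3))  # → 0.308
```

With 2 of 13 two-word windows matching, 4 of the 13 words belong to the phrase. Chasing a higher number here is exactly the trap described above: it usually means deleting the varied supporting words.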
Overdoing Keyword Density
Please note that due to web spam, density by itself is a fairly poor measure of relevancy (see slides 17 through 20 in this 2004 PDF from Google's Amit Singhal).
Early / primitive search technology was not very sophisticated due to hardware & software limitations. Those limitations forced early search engines like Infoseek to rely heavily on on-page document scoring for relevancy. Over the past 15 years search engines have grown far more powerful due to Moore's law, which has allowed them to incorporate additional data into their relevancy scoring algorithms. Google's big advantage over earlier competitors was analyzing link data.
Search engines may place significant weight on domain age, site authority, link anchor text and usage data.
Each search engine has its own weighting algorithms, which differ from one major search engine to the next.
Each search engine has its own vocabulary system which helps it understand related words.
Some might place more weight on the above domain-wide & offsite factors, while others might put a bit more weight on on-page content.
The page title is typically weighted more heavily than almost any other text on the page.
The meta keywords tag, comment tags, and other somewhat hidden inputs may be given less weight than page copy. For instance, most large-scale hypertext search engines put zero weight on the meta keywords tag.
Page copy which is bolded, linked, or in a heading tag is likely given greater weighting than normal text.
Weights are relative.
If your whole page is in an H1 tag, that looks a bit off, and it does not place more weight on any of the text, since all the page copy is in it.
You probably want to avoid doing things like bolding H1 text as it is doubtful it will make a page seem any more relevant.
Excessive focus on density falls short on a number of fronts.
When people focus too much on density they often write content which people would not be interested in reading or linking to.
Many queries are a bit random in nature; roughly 20% to 25% of search queries are unique. When webmasters tweak page copy for an arbitrarily higher density, they typically end up removing some of the modifier terms that were helping the page appear relevant for many 3, 4, 5 & 6 word search queries.
Semantic relevancy algorithms may look at supporting vocabulary when determining the relevancy of a page. If you pulled the keyword phrase you were targeting out of your page copy, would it still be easy for a search engine to mathematically model what that phrase was and what your page is about, given the supporting text? If so, your rankings will be far more stable AND you will likely rank for a far wider basket of related keywords.
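One crude way to sanity-check this yourself (a sketch, not any engine's actual model; the stop list here is an assumption) is to strip the target phrase from the copy and see which supporting terms remain:

```python
import re
from collections import Counter

# Assumed minimal stop list for illustration.
STOP = {"the", "a", "an", "and", "of", "to", "in", "for", "is", "on", "with", "all"}

def supporting_terms(text, phrase, top=5):
    """Remove the target phrase, then count the most frequent remaining
    terms. If those terms still describe the topic, the page carries
    real supporting vocabulary rather than bare repetition."""
    cleaned = re.sub(re.escape(phrase), " ", text, flags=re.IGNORECASE)
    words = [w for w in re.findall(r"[a-z]+", cleaned.lower())
             if w not in STOP and len(w) > 2]
    return Counter(words).most_common(top)

page_copy = ("Digital cameras capture photos with an image sensor. "
             "Sensor size, lenses & megapixels all affect photo quality.")
print(supporting_terms(page_copy, "digital cameras"))
```

Here, even with "digital cameras" removed, words like sensor, lenses, and megapixels make the topic easy to infer, which is the property the paragraph above describes.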
Should I Even Use Density Analyzer Tools?
These types of tools are still quite valuable when used with the right strategies. The above points were made to address old issues stemming from outdated tips, and to note that 'optimizing' for some arbitrary exact density often misses the mark.
Using analysis tools can still help you uncover a lot of opportunities, including:
looking at competing sites and discovering some good phrases (and modifiers) to use in your page content, which you may not have noticed at a cursory glance
helping you see if a page is way out of sync with top ranked pages
helping you determine if a particular writer is writing naturally or using excessive repetition
When I first got in the SEO game, I remember some tools trying to tell me to tweak pages toward stupid arbitrary exact percentages, & realizing (after the fact) how futile that was only fueled my rage toward such tools. So we created this tool to serve the legitimate functions of keyword density tools AND warn against some of the futile (& even counter-productive) uses as well. ;)
Other Usage Notes
Default settings: By default this tool...
includes the meta tags
does not show words that are part of the default stop list or terms with 2 or fewer characters in them.
You can click the check box to turn any of these features on and off.
if a word appears as part of a longer word, the stem may show up under the word count of the core word
this is particularly common with the plural version of a word being counted under the singular version of that word
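The counting behavior described above can be approximated with a short sketch (the actual stop list and stemming rules the tool uses are assumptions here):

```python
import re
from collections import Counter

# Assumed subset of a typical stop list.
STOP_LIST = {"the", "and", "for", "with", "that", "this", "one"}

def count_terms(text, min_len=3):
    """Count words, skipping stop words and words shorter than min_len,
    and fold a simple plural ('cameras') into its singular ('camera')
    when the singular has already been seen."""
    counts = Counter()
    for word in re.findall(r"[a-z]+", text.lower()):
        if word in STOP_LIST or len(word) < min_len:
            continue
        if word.endswith("s") and word[:-1] in counts:
            word = word[:-1]  # naive plural folding, order dependent
        counts[word] += 1
    return counts

print(count_terms("The camera and the cameras share one camera sensor"))
```

This is why "cameras" can show up under the count for "camera": the plural is folded into the shorter word already in the table.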
to help visually highlight where terms appear on the page, you can use our SEO toolbar's highlighting function
Gain a Competitive Advantage Today
Want more great SEO insights? Read our SEO blog to keep up with the latest search engine news, and subscribe to our SEO training program to get cutting edge tips we do not share with the general public. Our training program also offers exclusive SEO videos.
Over 100 training modules, covering topics like: keyword research, link building, site architecture, website monetization, pay per click ads, tracking results, and more.
An exclusive interactive community forum
Members only videos and tools
Additional bonuses - like data spreadsheets, and money saving tips