Calculate Ideal Keyword Density Percentage for SEO
What is The Competition Doing?
Whenever you search for something in Google you only see what ranks. You don't see the hundreds or thousands of pages that have been filtered for pushing too hard.
If you are in the dark on what density levels are reasonable, consider patterning your approach after what is working right now.
- search for your target keyword in Google
- grab 5 of the top ranked pages from the search results
- analyze each of them in a separate tab using this tool
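The ratio these tools compute is simple: keyword density is the share of a page's words taken up by the target term. A minimal sketch of that calculation in Python (the tokenizer and phrase handling here are simplified assumptions, not this tool's exact logic):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the density of `phrase` in `text` as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    if total == 0:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count overlapping occurrences of the phrase in the word stream.
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase_words)
    # Density is typically reported against the page's total word count.
    return 100.0 * hits * n / total

copy = "Credit cards vary. Compare credit cards before choosing credit cards."
print(round(keyword_density(copy, "credit cards"), 1))  # → 60.0
```

Run against the copy of each top-ranked page, this gives you the same per-term percentages the tool reports, which you can then compare across competitors.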
Keep in mind that some highly trusted brands rank more on their brand strength than on their on-page content. If you are creating content for a newer, less-trusted website, you would likely be better off putting more weight on results from smaller, lesser-known websites which still managed to rank well in Google.
What Should My Keyword Density Be?
There is no single optimal or universal keyword density percentage. Each search query is unique & search engines compare (or normalize) documents against other top documents to determine some of their specific thresholds. Some keywords like "credit cards" naturally appear as a two word phrase, whereas other terms may be more spread out. Further, some highly trusted websites with great awareness, strong usage data & robust link profiles can likely get away with more repetition than smaller, less trusted sites can.
As a general rule-of-thumb, when it comes to keyword frequency...
- from a trusted corpus of internal content (like someone's internal site search, or a database of select known trusted content authors), higher is generally better
- from a broad corpus of external content (like general web search, where many people have an incentive to try to game the system), less is generally better
Google On-page Classifiers
When Google rolled out the first Penguin update in April of 2012, they also rolled out some on-page classifiers which penalized some pages that had excessive word repetition.
Lazy, uninformed, cheap outsourced writing tends to be fairly repetitive - in part because people paid by the word to churn out cheap content have an incentive to bloat the word count, no incentive to trim the fat, and no incentive to do deep research. Google's leaked remote rater guidelines tell raters to rate low-information repetitive content poorly.
In this day and age the primary use of these types of analysis tools is not to keep dialing up keyword density, but rather to lower the focus on the core terms while including alternate word forms, acronyms, synonyms & other supporting vocabulary.
- High Density: The upside of aggressive repetition (in terms of helping boost rank for the core term) is fairly minimal & high keyword density increases the likelihood that the page may get filtered.
- Low Density (with variation): The upside of greater word variation (in terms of helping boost rank for a wide variety of related words) is significant & lower density on the core terms decreases the risk of the page getting filtered.
The accompanying video discusses optimizing your on-page SEO strategy both for conversions & for including keyword variations in the content.
Good vs Optimal vs Overdoing Keyword Density
Due to web spam, density by itself is a fairly poor measure of relevancy (see slides 17 through 20 in this 2004 PDF from Google's Amit Singhal).
Early, primitive search technology was not very sophisticated due to hardware & software limitations. Those limitations forced early search engines like Infoseek to rely heavily on page titles and other on-page document scoring for relevancy. Over the past 15 years search engines have grown far more powerful thanks to Moore's law, which has allowed them to incorporate additional data into their relevancy scoring algorithms. Google's big advantage over earlier competitors was analyzing link data.
Dr. E. Garcia explained why density was a bad measure of relevancy in The Keyword Density of Non Sense.
- Other ranking factors
- Search engines may place significant weight on domain age, site authority, link anchor text, localization, and usage data.
- Each search engine has its own weighting algorithms, and these differ across the major search engines.
- Each search engine has its own vocabulary system which helps it understand related words.
- Some might place more weight on the above domain-wide & offsite factors, while others might put a bit more weight on on-page content.
- The page title is typically weighted more heavily than almost any other text on the page.
- The meta keywords tag, comment tags, and other somewhat hidden inputs may be given less weight than page copy. For instance, most large-scale hypertext search engines put zero weight on the meta keywords tag.
- Page copy which is bolded, linked, or in a heading tag is likely given greater weighting than normal text.
- Weights are relative.
- If your whole page is wrapped in an H1 tag, that looks a bit off, and it does not place extra weight on any of the text, since all of the page copy is in it.
- You probably want to avoid doing things like bolding H1 text as it is doubtful it will make a page seem any more relevant.
- Excessive focus on density falls short on a number of fronts.
- When people focus too much on density they often write content which people would not be interested in reading or linking at.
- Many queries are a bit random in nature. Roughly 20% to 25% of search queries are unique. When webmasters tweak page copy for an arbitrarily higher density, they typically end up removing some of the modifier terms that were helping the page appear relevant for many 3, 4, 5 & 6 word search queries.
- Semantic related algorithms may look at supporting vocabulary when determining the relevancy of a page. If you pulled the keyword phrase you were targeting out of your page copy would it still be easy for a search engine to mathematically model what that phrase was and what your page is about given the supporting text? If so, then your rankings will be far more stable AND you will likely rank for a far wider basket of related keywords.
Should I Even Use Density Analysis Software?
These types of tools are still quite valuable when used with the right strategies. The points above address old issues with outdated tips & highlight how 'optimizing' for some arbitrary exact density often misses the mark.
Using analysis tools can still help you uncover a lot of opportunities, including:
- looking at competing sites and discovering some good phrases (and modifiers) to use in your page content, which you may not have noticed at a cursory glance
- helping you see if a page is way out of sync with top ranked pages
- helping you determine if a particular writer is writing naturally or using excessive repetition
- comparing pages side by side, a feature we also built into this tool.
When I first got in the SEO game, I remember some tools telling me to tweak toward stupid arbitrary exact percentages, and realizing (after the fact) how futile that was only fueled my rage toward such software. So we created this free tool to serve the legitimate functions of keyword density tools AND warn against some of the futile (& even counter-productive) uses as well. ;)
Usage Notes on How to Calculate Keyword Density
- Default settings: By default this tool...
- includes the meta tags
- does not show words that are on the default stop list or terms with 2 or fewer characters.
- You can click the check box to turn any of these features on and off.
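The default filtering described above can be sketched as follows (the stop list here is a tiny illustrative sample, not the tool's actual list):

```python
import re
from collections import Counter

STOP_WORDS = {"the", "and", "for", "you", "with"}  # tiny sample; real stop lists are much longer

def word_counts(text: str, min_length: int = 3, use_stop_list: bool = True) -> Counter:
    """Count words, skipping stop words and terms of 2 or fewer characters."""
    words = re.findall(r"[a-z']+", text.lower())
    kept = (w for w in words
            if len(w) >= min_length and not (use_stop_list and w in STOP_WORDS))
    return Counter(kept)

counts = word_counts("The tool counts the words for you and me")
print(counts)  # 'the', 'for', 'you', 'and' and the 2-letter 'me' are filtered out
```

Passing `use_stop_list=False` (or lowering `min_length`) mirrors ticking the check boxes off in the tool.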
- Stemming:
- if a word appears as part of a longer word, the stem may show up under the word count of the core word
- this is particularly common for plural versions of a word being counted under the singular version of that word
- to help visually highlight where terms appear on the page, you can use our SEO toolbar's highlighting function
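The plural-folding behaviour described above can be approximated with a naive suffix rule. This is a deliberately crude sketch of the idea; real stemmers (such as the Porter stemmer) are considerably more involved, and this is not the tool's exact logic:

```python
from collections import Counter

def naive_stem(word: str) -> str:
    """Fold simple plurals onto the singular form (a deliberately crude rule)."""
    if word.endswith("ies") and len(word) > 4:
        return word[:-3] + "y"
    if word.endswith("s") and not word.endswith("ss") and len(word) > 3:
        return word[:-1]
    return word

def stemmed_counts(words):
    """Group counts under the stem, as the tool groups plurals under singulars."""
    return Counter(naive_stem(w) for w in words)

print(stemmed_counts(["keyword", "keywords", "density", "strategies"]))
# "keywords" is counted under "keyword"; "strategies" under "strategy"
```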
Gain a Competitive Advantage Today
Your top competitors have been investing into their marketing strategy for years.
Now you can know exactly where they rank, pick off their best keywords, and track new opportunities as they emerge.
Explore the ranking profile of your competitors in Google and Bing today using SEMrush.
Enter a competing URL below to quickly gain access to their organic & paid search performance history - for free.
See where they rank & beat them!
- Comprehensive competitive data: research performance across organic search, AdWords, Bing ads, video, display ads, and more.
- Compare Across Channels: use someone's AdWords strategy to drive your SEO growth, or use their SEO strategy to invest in paid search.
- Global footprint: tracks Google results for 120+ million keywords in many languages across 28 markets.
- Historical performance data: going all the way back to last decade, before Panda and Penguin existed, so you can look for historical penalties and other potential ranking issues.
- Risk-free: Free trial & low monthly price.
Your competitors are researching your site.
Find New Opportunities Today