How much keyword density for Google?
-
I have several pages on one site which have gone down during the past few months. The keyword density on those pages, which is not unnatural, pleased Google for many years; it still pleases Bing. But Google now seems very picky.
Based upon your experience, what is the ideal keyword density (%) for two- and three-word phrases, and should they be left out of alt tags even when it is proper to put them there?
While Google dominates, we do not wish to alienate Bing/Yahoo.
It is a huge mystery, and experimentation with more non-keyword-related text has so far not borne any fruit.
Thank you,
GH
-
I realize this is an old thread, but I came across it when looking for an answer to the question, "What is the ideal keyword density for SEO?" After reading several high-ranking pages on the subject (most of which did not or could not provide an answer), I came up with what I believe to be an answer: The ideal keyword density for a given web page is either: (1) one keyword less than what would cause a visitor of the page to form an opinion that the page is not a credible source of information for that keyword, or (2) one keyword less than what would cause Google to form an opinion that the page is not a credible source of information for that keyword.
Now, I'll leave it to someone better at math to calculate what exactly that number is.
-
It's amazing that everyone here has answers, but no data. If you're going to give an answer, back it up. User-readable? Yes. Documented by Google? No. No copy, links only? That works for some sites like CNN, ToysRUs, and Walmart, which get picked up just because they're huge (my observation). But for the majority of the little guys, content plays a role, and it would be great to know whether the data supports keyword density as still being applicable to Google. Tools still measure it (SEOquake). In natural language it seems to make sense that a certain percentage of words, on average, are repeated. Google has made it clear that they are trying to master how language is actually used in the real world and to provide results based on how humans communicate, not how computers do. Thus: more focus on people, less on computers. YET we all know that computers still play a huge role in how SERPs choose winners. We just have to find the balance, right?
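For anyone curious what tools like SEOquake are actually measuring, the raw calculation is trivial. Here is a minimal sketch in Python; the convention that an n-word phrase counts as n words per occurrence is an assumption on my part — tools vary, so treat the exact figure as illustrative rather than any vendor's formula:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the words in `text` accounted for by occurrences of `phrase`.

    Assumes each occurrence of an n-word phrase counts as n words;
    different tools use different conventions.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = re.findall(r"[a-z0-9']+", phrase.lower())
    n = len(target)
    if not words or n == 0:
        return 0.0
    # Count phrase occurrences with a sliding window over the word list.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)

body_text = "Blue widgets for sale. Our blue widgets ship fast, and widgets are us."
print(round(keyword_density(body_text, "widgets"), 1))       # → 23.1
print(round(keyword_density(body_text, "blue widgets"), 1))  # → 30.8 (two-word phrase)
```

Deliberately stuffed copy like the example scores absurdly high, which is exactly the point: numbers like these are a sanity check, not a target.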
-
Thank you for the link, which is useful, but I was surprised to find many very code-heavy sites (14%) ranking at the top as well, even in the era of the "thin page" penalty. The factors and changes in algorithms used are simply overwhelming, so I guess my answer simply lies in making the best site possible and giving up on SEO considerations almost entirely.
-
I still consider keyword density as a litmus test for how I expect spiders to consider my pages. Even more important, but touching on the same concepts as keyword density, is the text-to-code ratio.
http://www.seochat.com/seo-tools/code-to-text-ratio/
And this is something I do spend time optimizing for. With all of the analytics scripts, forms, nested navigation bars, etc., on a standard site, it's easy to become code-heavy and be penalized for it.
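For what it's worth, the ratio itself is easy to approximate locally without the online tool. A rough sketch in Python — a real HTML parser would be more robust, and different tools count whitespace and markup differently, so this is illustrative only:

```python
import re

def text_to_code_ratio(html: str) -> float:
    """Visible text as a percentage of the total page source.

    A rough approximation: drops <script>/<style> blocks, then all
    remaining tags, and compares what's left to the full source.
    """
    total = len(html)
    if total == 0:
        return 0.0
    # Remove script/style blocks including their contents.
    text = re.sub(r"(?is)<(script|style)\b.*?</\1\s*>", "", html)
    # Remove all remaining tags, then collapse whitespace.
    text = re.sub(r"(?s)<[^>]+>", "", text)
    text = re.sub(r"\s+", " ", text).strip()
    return 100.0 * len(text) / total

page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_to_code_ratio(page), 1))  # → 13.4
```

A page that is nearly all markup and scripts, like the toy example above, scores in the low teens; adding real copy pushes the number up.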
-
I agree with Tom. When it comes to keyword density, ask yourself whether it reads naturally, then ask a friend to read the copy. Ask him or her: does it read naturally, and would they accept it as copy on a website?
-
Thank you.
You are likely right that there are other off-page issues Google may be taking into account to penalize our white-hat site, though they are a mystery to me, as our link profile is very strong according to SEOmoz, especially compared to much larger competitors. We even have pages which once ranked in the top 5, and which SEOmoz claims have very high authority, yet which have, for all intents and purposes, disappeared completely from the Google index (except for a precise search of the title).
I suppose the next step may be limiting on-page links that use the keywords, and largely ignoring the words I am trying to convey - unlearning everything that worked for 10 years in SEO and still works with Bing (which, by the way, is giving me personally better answers to general questions).
-
Thank you. I agree, but I have certainly seen sites (other than my own) which go right to the top of the SERPs due to keyword density, as they have little content and no backlinks, so it does still seem to me a matter of some concern. If you don't mention the keywords, how is an algorithm supposed to know what the page is about, or what it is emphasizing, on a site with thousands of pages?
Thank you again for your response.
-
I don't think you can put a general % on keyword density. So long as the copy reads well and doesn't appear to be stuffed, it should be fine. Mention the keyword as many times as you can without it appearing forced. There's no doubt that having a keyword appear more times on the page will help Google deduce what the page is about, but similarly anything that would compromise the user experience or attempt to over-optimise for the algorithm can easily be penalised. What that number is, though, is highly dependent on context, so you can't put a broad figure on an "optimal level".
If you haven't changed the density on the page, I don't believe that your density level would have caused a fall in your rankings (unless it was overdone, as said before). The strength this signal has on your rankings would be small at best, so there's very likely another reason for the fall. I'd start looking at other on-page factors and especially what sort of links you might have earned recently (or indeed lost).
-
There is no longer any such thing as "keyword density". It should not be part of any SEO strategy.
Calculating it is a waste of time.
There are pages that rank without having the keyword on the page at all - SEOmoz has a good blog post on the subject, by Rand I think.
It does help to have the keyword in the URL, in the title tag, in the H1, and at least once in the actual content, but there is no magic formula.
I hate the statement "what is good for the user", as it is overused by Google, but in this case it does make sense - the keyword can be used once, or 10 times, or 100 times on the page, as long as it makes sense for the user and the text reads naturally, with no forced sentences or words. Synonyms of the word, or alternatives to the phrase, are also a very good choice, and Google can associate those very well.
Personally I never take this into consideration on any of my projects - I used to (back in 2004-2005) when it was important, but now, based on industry opinions, Google's statements, and personal tests, there is no magic formula and no benefit in working on keyword density.
My 2 cents. Hope it helps.