How much keyword density for Google?
-
I have several pages on one site which have gone down during the past few months. The keyword density on those pages, which is not unnatural, pleased Google for many years; it still pleases Bing. But Google now seems very picky.
Based on your experience, what is the ideal keyword density (%) for two- and three-word phrases, and should such phrases be left out of alt tags even when it is appropriate to put them there?
While Google dominates, we do not wish to alienate Bing/Yahoo.
It is a huge mystery, and experimentation with more non-keyword-related text has so far not borne any fruit.
Thank you,
GH
-
I realize this is an old thread, but I came across it while looking for an answer to the question, "What is the ideal keyword density for SEO?" After reading several high-ranking pages on the subject (most of which did not or could not provide an answer), I came up with what I believe to be an answer: the ideal keyword density for a given web page is either (1) one keyword less than what would cause a visitor to the page to form the opinion that the page is not a credible source of information for that keyword, or (2) one keyword less than what would cause Google to form that same opinion.
Now, I'll leave it to someone better at math to calculate what exactly that number is.
-
It's amazing that everyone here has answers, but no data. If you're going to give an answer, back it up. User-readable? Yes. Documented by Google? No. No copy, links only? That works for some sites like CNN, ToysRUs, and Walmart, which get picked up just because they're huge (observation). But for the majority of the little guys, content plays a role, and it would be great to know whether the data supports keyword density as still being applicable to Google. Tools still measure it (SEOQuake). In natural language, it seems to make sense that a certain percentage of words, on average, are repeated. Google has made it clear that they are trying to master how language is actually used in the real world and to provide results based on how humans communicate, not computers. Thus: more people-focus, less computer-focus. Yet we all know that computers still play a huge role in how SERPs choose winners. We just have to find the balance, right?
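For what it's worth, the calculation those tools make is simple. Here is a minimal Python sketch of the common occurrences-over-total-words definition, which also handles the two- and three-word phrases asked about above (my own illustration; SEOQuake's exact method may differ):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    # Occurrences of the phrase divided by total words, as a percentage.
    # A sketch of the common definition; real tools differ in tokenizing,
    # stop-word handling, and how multi-word phrases are counted.
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits / len(words)

# Example: a 120-word page using "blue widgets" 4 times
# scores 100 * 4 / 120 ≈ 3.3%.
```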
-
Thank you for the link, which is useful, but I was surprised to find many very code-heavy sites (text-to-code ratios around 14%) ranking at the top as well, even in the era of the "thin page" penalty. The factors and changes in the algorithms used are simply overwhelming, so I guess my answer lies in making the best site possible and giving up on SEO considerations almost entirely.
-
I still consider keyword density a litmus test for how I expect spiders to interpret my pages. Even more important, but touching on the same concepts as keyword density, is the text-to-code ratio.
http://www.seochat.com/seo-tools/code-to-text-ratio/
And this is something I do spend time optimizing for. With all of the analytics scripts, forms, nested navigation bars, etc., on a standard site, it's easy to become code-heavy and be penalized for it.
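As a rough illustration, here is a minimal Python sketch of how a text-to-code ratio can be computed, assuming BeautifulSoup is installed; the SEO Chat tool linked above may calculate it differently:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def text_to_code_ratio(html: str) -> float:
    # Visible text length as a percentage of the total page source.
    # A rough approximation; tools vary in exactly what they strip.
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):  # drop analytics scripts, CSS, etc.
        tag.decompose()
    text = soup.get_text(separator=" ", strip=True)
    return 100.0 * len(text) / max(len(html), 1)

# e.g. a 60 KB page with only 8 KB of visible text scores ~13%,
# which most rules of thumb would call code-heavy.
```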
-
I agree with Tom. When it comes to keyword density, ask yourself whether the copy reads naturally, then ask a friend to read it. Ask him or her: does it read naturally, and would they accept it as copy on a website?
-
Thank you.
You are likely right that there are other off-page issues Google may be taking into account to penalize our white-hat site, though they are a mystery to me, as our link profile is very strong according to SEOmoz, especially compared to much larger competitors. We even have pages which once ranked in the top 5, and which SEOmoz claims have very high authority, that have disappeared completely from the Google index (for all intents and purposes, except for a precise search of the title).
I suppose the next step may be to limit links to other content on the page that use the keywords, and largely to ignore the words I am trying to convey. That means unlearning everything that worked for 10 years in SEO and still works with Bing (which, by the way, is giving me personally better answers to general questions).
-
Thank you. I agree, but I have certainly seen sites (other than my own) which go right to the top of the SERPs due to keyword density, even though they have little content and no backlinks, so it does still seem to me to be a matter of some concern. If you don't mention keywords, how is an algorithm supposed to know what a page is about, or what it is emphasizing, on a site with thousands of pages?
Thank you again for your response.
-
I don't think you can put a general % on keyword density. So long as the copy reads well and doesn't appear to be stuffed, it should be fine. Mention the keyword as many times as you can without it seeming forced. There's no doubt that having a keyword appear more times on the page will help Google deduce what the page is about, but equally, anything that compromises the user experience or attempts to over-optimise for the algorithm can easily be penalised. What that number is depends heavily on context, so you can't put a broad figure on an "optimal level".
If you haven't changed the density on the page, I don't believe your density level would have caused a fall in your rankings (unless it was overdone, as said before). The effect this signal has on your rankings would be small at best, so there's very likely another reason for the fall. I'd start by looking at other on-page factors, and especially at what sort of links you might have earned recently (or indeed lost).
-
There is no longer any such thing as "keyword density". It should not be part of any SEO strategy.
Calculating it is a waste of time.
There are pages that rank without having the keyword on the page at all; SEOmoz has a good blog post on the subject, by Rand I think.
It does help to have the keyword in the URL, in the title tag, in the H1, and at least once in the actual content, but there is no magic formula; a quick sketch of that basic check follows below.
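To make that checklist concrete, here is a minimal Python sketch of an on-page placement check, assuming the requests and BeautifulSoup libraries; the URL and keyword in the example are hypothetical placeholders:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def on_page_check(url: str, keyword: str) -> dict:
    # Check the basic placements: URL, <title>, first <h1>, and body text.
    # A sketch only; it matches the exact phrase and ignores synonyms.
    kw = keyword.lower()
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):  # keep only visible text
        tag.decompose()
    title = soup.title.get_text() if soup.title else ""
    h1 = soup.h1.get_text() if soup.h1 else ""
    body = soup.get_text(separator=" ", strip=True)
    return {
        "in_url": kw.replace(" ", "-") in url.lower(),  # assumes hyphenated slugs
        "in_title": kw in title.lower(),
        "in_h1": kw in h1.lower(),
        "in_body": kw in body.lower(),
    }

# e.g. on_page_check("https://example.com/blue-widgets", "blue widgets")
```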
I hate the statement "what is good for the user", as it is overused by Google, but in this case it does make sense: the keyword can be used once, or 10 times, or 100 times on the page, as long as it makes sense for the user and the text reads naturally, with no forced sentences or words. Synonyms of the word, or alternatives to the phrase, are also a very good choice, and Google associates those very well.
Personally, I never take this into consideration on any of my projects. I used to (back in 2004-2005) when it was important, but now, based on industry opinions, Google's statements, and personal tests, there is no magic formula and no benefit in working on keyword density.
My 2 cents. Hope it helps.