How much keyword density for Google?
-
I have several pages on one site which have gone down during the past few months. The keyword density on those pages, which is not unnatural, pleased Google for many years. It still pleases Bing. But Google now seems very picky.
Based upon your experience, what is the ideal % keyword density for 2- and 3-word phrases, and should they be left out of alt tags even when it is appropriate to put them there?
While Google dominates, we do not wish to alienate Bing/Yahoo.
It is a huge mystery, and experimentation with more non-keyword-related text has so far not borne any fruit.
Thank you,
GH
-
I realize this is an old thread, but I came across it when looking for an answer to the question, "What is the ideal keyword density for SEO?" After reading several high-ranking pages on the subject (most of which did not or could not provide an answer), I came up with what I believe to be an answer: The ideal keyword density for a given web page is either: (1) one keyword less than what would cause a visitor of the page to form an opinion that the page is not a credible source of information for that keyword, or (2) one keyword less than what would cause Google to form an opinion that the page is not a credible source of information for that keyword.
Now, I'll leave it to someone better at math to calculate what exactly that number is.
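Tongue-in-cheek definitions aside, the mechanical part — what tools like SEOquake actually measure — is easy to pin down. Below is a minimal sketch of phrase density for the 2- and 3-word phrases the original poster asked about. The function name and the convention of counting matches against overlapping n-gram slots are my own assumptions, not any particular tool's documented method; real tools may normalize differently.

```python
import re
from collections import Counter

def phrase_density(text: str, phrase: str) -> float:
    """Percentage of overlapping n-gram slots in `text` occupied by `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = tuple(re.findall(r"[a-z0-9']+", phrase.lower()))
    n = len(target)
    if n == 0 or len(words) < n:
        return 0.0
    # Slide an n-word window over the text and count exact matches.
    ngrams = [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
    return 100.0 * Counter(ngrams)[target] / len(ngrams)

text = "Fresh turkeys for sale. Order fresh turkeys online today."
print(round(phrase_density(text, "fresh turkeys"), 1))  # → 25.0
```

Whatever the "ideal" number is, a helper like this at least makes the measurement reproducible, so you can compare a page before and after an edit.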
-
It's amazing that everyone here has answers, but no data. If you're going to give an answer, back it up. User-readable? Yes, that much is documented by Google. No copy, links only? That works for some sites like CNN, ToysRUs, and Walmart that get picked up just because they're huge (observation). But for the majority of the little guys, content plays a role, and it would be great to know if the data supports keyword density as still being applicable to Google. Tools still measure it (SEOquake). In natural language, it seems to make sense that a certain percentage of words, on average, are repeated. Google has made it clear that it is trying to master how language is actually used in the real world and to provide results based on how humans communicate, not computers. Thus, more people focus, less computer focus. Yet we all know that computers still play a huge role in how SERPs choose winners. We just have to find the balance, right?
-
Thank you for the link, which is useful, but I was surprised to find many very code-heavy sites (14%) ranking at the top as well, even in the era of the "thin page" penalty. The factors and changes in algorithms used are simply overwhelming, so I guess my answer simply lies in making the best site possible and giving up on SEO considerations almost entirely.
-
I still consider keyword density as a litmus test for how I expect spiders to consider my pages. Even more important, but touching on the same concepts as keyword density, is the text-to-code ratio.
http://www.seochat.com/seo-tools/code-to-text-ratio/
And this is something I do spend time optimizing for. With all of the analytical scripts, forms, nested navigation bars, etc., on a standard site, it's easy to become code-heavy and be penalized for it.
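A ratio like the one the linked tool reports can be approximated with Python's standard-library HTML parser. This is only a rough sketch: what counts as "code" versus "text" (here, everything except visible text outside script/style tags, measured in characters) is my assumption, and real tools may measure differently.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def text_to_code_ratio(html: str) -> float:
    """Visible text length as a percentage of total page source length."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return 100.0 * len(text) / len(html) if html else 0.0

page = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(round(text_to_code_ratio(page), 1))  # → 13.4
```

On a real page, running this before and after trimming inline scripts makes it easy to see how much of the source a crawler actually has to wade through to reach the copy.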
-
I agree with Tom. When it comes to keyword density, ask yourself if the copy reads naturally, then ask a friend to read it. Ask him or her: does it come off as natural, and would they accept it as copy on a website?
-
Thank you.
You are likely right that there are other off-page issues Google may be taking into account to penalize our white hat site, though they are a mystery to me, as our link profile is very strong according to SEOmoz, especially compared to much larger competitors. We even have pages which once ranked in the top 5, and which SEOmoz claims have very high authority, that have disappeared completely from the Google index for all intents and purposes (they appear only for a precise search of the title).
I suppose that limiting links to other content on the page which use the keywords may be the next step, largely ignoring the words I am trying to convey. That means unlearning everything that worked for 10 years in SEO and still works with Bing (which, by the way, is personally providing me with better answers to general questions).
-
Thank you. I agree, but have certainly seen sites (other than my own) which go right to the top of the SERPs due to keyword density, as they have little content and no backlinks, so it does still seem to me to be a matter of some concern. If you don't mention keywords, how is an algorithm supposed to know what the page is about or is emphasizing on a site with thousands of pages?
Thank you again for your response.
-
I don't think you can put a general % on keyword density. So long as it reads well and doesn't appear to be stuffed, it should be fine. Mention it as many times as you can without it appearing forced. There's no doubt that having a keyword appear more times on the page will help Google deduce what the page is about, but similarly anything that would compromise the user experience or attempt to over-optimise for the algorithm can easily be penalised. Saying what number this is, though, is highly dependent on context, so you can't put a broad figure on an "optimal level".
If you haven't changed the density on the page, I don't believe that your density level would have caused a fall in your rankings (unless it was overdone, as said before). The strength this signal has on your rankings would be small at best, so there's very likely another reason for the fall. I'd start looking at other on-page factors and especially what sort of links you might have earned recently (or indeed lost).
-
There is no longer any such thing as "keyword density". It should not be part of any SEO strategy.
Calculating it is a waste of time.
There are pages that rank without having the keyword on the page at all - SEOmoz has a good blog post on the subject, by Rand I think.
It does help to have the keyword in the URL, in the title tag, in the H1, and at least once in the actual content, but there is no magic formula.
I hate the statement "what is good for the user", as it is overused by Google, but in this case it does make sense - the keyword can be used once, or 10 times, or 100 times on the page, as long as it makes sense for the user and the text reads naturally, with no forced sentences or words. Synonyms of the word or alternatives of the phrase are also a very good choice, and Google can associate those very well.
Personally, I never take this into consideration on any of my projects. I used to (back in 2004-2005) when it was important, but now, based on industry opinions, Google's statements, and personal tests, there is no magic formula and no benefit in working on keyword density.
My 2 cents. Hope it helps.