Community Discussion - Do you think increasing word count helps content rank better?
-
In the online marketing community, there is a widespread belief that long-form content ranks better.
In today's YouMoz post, Ryan Purthill shares how his research indicated 1,125 to be a magic number of sorts: The closer a post got to this word count, the better it ranked. Diminishing returns, however, were seen once a post exceeded 1,125 words.
- Does this jibe with your own data and experiences?
- Do you think increasing word count helps content rank better in general?
- What about for specific industries and types of content?
Let's discuss!
-
I have back-correlated performance data, looking in particular at content length, keyword and phrase density, and prominence, both overall and within different page elements, against SERP rank and page performance (engagement or conversion, based on whatever the particular critical measure might be). There does appear to be a minimum length of non-boilerplate text necessary to achieve both, although the optimal content length for inspection and semantic determination does not appear to be the same as the optimal length for page outcomes, which should not be surprising.
What I have also found is that while it's possible for content to be too short, it is equally possible to be too verbose, particularly if the content begins to sprawl across a wide variety of topics and subtopics. My guess is that search engines have a harder time deciding what the message of a page is when it turns into an encyclopedia.
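For anyone who wants to try the same kind of back-correlation on their own data, here is a rough sketch in Python. The file name and column names ("pages.csv", "word_count", "serp_rank") are purely illustrative assumptions, not the output of any particular tool.

```python
# Rough, hypothetical sketch: correlating non-boilerplate word count with SERP rank.
# "pages.csv", "word_count", and "serp_rank" are illustrative assumptions.
import pandas as pd
from scipy.stats import spearmanr

pages = pd.read_csv("pages.csv")

# Spearman works on ranks, so it tolerates the non-linear, diminishing-returns
# pattern described above better than a straight Pearson correlation would.
rho, p_value = spearmanr(pages["word_count"], pages["serp_rank"])
print(f"Spearman rho: {rho:.2f} (p = {p_value:.3f})")

# Bucket pages by length to see roughly where returns start to diminish.
bins = [0, 500, 800, 1100, 1500, 2500, 10000]
pages["length_bucket"] = pd.cut(pages["word_count"], bins=bins)
print(pages.groupby("length_bucket", observed=True)["serp_rank"].mean())
```

The same bucketing works against an engagement or conversion column instead of rank if that is the critical measure for the page.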
-
Numbers, numbers, numbers.
Simply put, no. You can rank an article on the first page for a highly sought-after term if it perfectly answers a question. It isn't the length of the text, but the content therein.
One example always given is "I F***ing Love Science". They don't need to write 2,000-word articles in order to rank well. Strength is partly in numbers here: they can rank short articles that contain a video with seemingly little work, because Google knows just how accurately it will answer a question.
As EGOL also mentioned, there are also lots of studies into the use of the correct keywords and supporting content. Then look at E-A-T (Expertise, Authoritativeness, Trustworthiness) and YMYL (Your Money or Your Life): simply put, are you trustworthy enough to be believed, and are you enough of an expert to be making statements about the subject?
I am loving content marketing at the moment as there is a lot going on, and seeing some fantastic wins!
-Andy
-
I don't believe it's the length or the number of words so much as how much more information those extra words bring to the table. More words isn't better, but more information is.
-
We should point out that long content is, most of the time, really well written. The creator is looking to engage with visitors and puts a lot of effort into that.
From my experience, this is really the correct answer.
We have a target minimum of 1,500 words per landing page for our content team, but of course, if they get to 1,100 words and are genuinely stuck for quality content from there, 1,100 is perfectly OK.
In the early days we started out with a minimum of 500 words, and after noticing positive results within days of content going up, we started increasing that and measuring the response in terms of rankings and user interaction. Each increment (800, 1,000, 1,500) saw consistent improvement over the previous one, but 1,500 words did seem to be the tipping point; beyond that there were significantly diminishing returns.
As you mentioned, longer content typically has far more effort put into it, so really, what the Moz study has measured is a correlation between quality plus word count and improved rankings.
-
I don't think there is a magic number at all when it comes to content length. Writing an extra 500 words just to fluff up an article or SEO page isn't going to help anything or anyone. The ultimate goal of search engines is to provide the best results for a query, so the ultimate goal of content writing should be to solve a problem, provide an answer, and so on. If you can do that in 200 words, great; if your product or service is complex, requires much more education, and takes 2,000 words, also great.
We should write with the user in mind, get into the mindset of someone searching for our offerings and think about what we'd want to read, no matter how long. I don't care how great the content is, if I'm searching for a new pair of running shoes, I'm not reading 1,125 words, and if that's all I see when I land there, I'm bouncing.
-
Thanks for the info. If I look to the southeast from my home or my office the first major ridge of the Appalachians rises out of the Earth and occupies a spectacular 180 degrees of my view. If I cross a few of those ridges to the south the way people talk changes and words seldom heard elsewhere are common in the spoken language. I worked in that area for about twenty years and loved the words, the cadence and the tone that most people used.
-
I can't claim I know the origins of the word. I use it here only as a synonym of "cling on to". My name is a bit more mundane, in that it was a street I used to work on when I created my SEO accounts.
-
Glom? A word I used to hear in a previous life.
Now, maybe I understand the name "Highland"?
From what I know, glom is a word from the Scots dialect, used here in the States by people in parts of New England and the Appalachians.
-
I think that this is going to fall into the same category as some other ideas about "optimal content". Back in the day there was "keyword density". Then came "latent semantic indexing", where your words had to relate to other words on your page. And now we have a "magic" word count (don't get me wrong, it's an interesting stat).
I had to spend a LONG time deprogramming people from these ideas, because people glom onto them as the limit lines of SEO. They're dangerous in the sense that if someone thinks the line is "10% keyword density" or "1,125 words", people will start measuring and making sure that their page on "blue widgets" has exactly 10% KD and 1,125 words so Google will love it (who cares if it's crap nobody will read?).
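For what it's worth, the "measuring stick" itself is nothing sophisticated. Below is a rough sketch of how a keyword-density figure is typically computed; the function is a hypothetical illustration, not any standard or tool-defined formula (there isn't one, which is part of the problem).

```python
# Hypothetical sketch of the "keyword density" metric people fixate on:
# phrase occurrences (weighted by phrase length) divided by total words.
# Trivial to compute, and just as trivial to game.
import re

def keyword_density(text: str, phrase: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return (hits * n) / len(words) if words else 0.0

# "Blue widgets are the best blue widgets." -> roughly 57% density.
print(f"{keyword_density('Blue widgets are the best blue widgets.', 'blue widgets'):.0%}")
```

A writer chasing a 10% target can hit it mechanically without improving the page for a single reader, which is exactly why the number tells you so little.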
My advice on content is that it should read naturally. Don't pull out any measuring sticks. Stop with the SEO hyper-focus. If it doesn't read like something you would tell a personal friend it's probably not worth writing. Or ranking...
-
For a while I was seeing Google respond to certain search queries with in-depth article options. They experimented with a small section that was similar to "pages by author" results. It doesn't seem to turn up as much anymore, but I did find this:
https://support.google.com/webmasters/answer/3280182?hl=en
Is this still happening?
-
I'm with EGOL on this. "Don't underestimate the value of great media, probably more valuable than the text but without the text it's impotent." I'd add promotion efforts to that statement. Even great, long content needs a bit of promotion to get the attention it deserves.
-
I just read Ryan's article a second time and reflected on my beliefs as described above.
They looked at "related search" to see if there were topics that would beef up their articles. It is possible that adding information about those topics made their article more relevant to Google because it "covered topics that people are asking about". I wonder if the "hernia" and "gall stones" articles had that type of improvement. That could explain the jump in rankings: a sudden increase in the relevance of the article to the query.
I've always believed that "a diversity of important query words" is key to rankings. Ryan's study points to where Google itself suggests the important query words. I really like how he did this and plan to look at it when I revise old articles or write new ones.
I have always believed that "media beyond text" is important. My thinking was that photos, video, and tabular data were where to get this. However, his "Q & A" and callouts with "prevalence information" might have the same effect, because they give the reader "something special" to consider while reading the article. It is possible that the article already has such information embedded within it, but calling it out in a distinct format could be a "refreshing change" or "more interesting" for the reader.
I think that his article was one of the most important articles that has been on the Moz Blog. Reading it a second time has probably been one of the best investments of my time in the past year. Thank you Ryan.
-
I feel it does, if only to get away from pure link stuffing. Having quality content surrounding your anchor text in an informative and relevant way always seems to perform better, in my experience. I agree with the comment above: 1,000+ words always seems to perform well.
I try to structure things around one link, or piece of anchor text, per decent paragraph of quality information.
-
Hi,
First of all, we should set a limit: how long is too long? Personally, I'd put the limit at over 2,000 words.
It's a known fact that Google loves long content. But we should also point out that long content is, most of the time, really well written. The creator is looking to engage with visitors and puts a lot of effort into that. That's why long content also ranks high.
In my experience, content at or just above 1,000 words has always performed well, even better than longer articles.
Also, I recommend that my writers and colleagues split very long pieces into several articles. That helps increase interaction with visitors and keeps them moving on to other pages.
GR
-
I don't think we can look at word count in a vacuum; not only because there are so many contributing factors, but because there are likely variables that affected this "magic number" (a concept I feel is bunk) that weren't measured, considered, or weighed in any way.
Most importantly, I don't think such a figure has any use to a specific person, business, site, etc. It's interesting data, but it says nothing about what any individual should do or expect. In my experience, my readers want anywhere between 300 and 2,000 words; but again, this means practically nothing. There are different types of posts, subjects, content uses, audiences based on these, and many other variables.
I think that, if one's data shows that their posts aren't doing well, word count is one area in which it may be worth exploring different solutions. But there are dozens of more vital and useful data points out there and readily available.
-
I don't believe in "magic numbers" and I don't believe that "walls of text" have any magic either.
I do believe that Google enjoys substantive content that is clearly written, addresses a diversity of important query words for its topic, engages visitors, includes media beyond text, and sits on a website in good technical health. The most important part of that is "engaging visitors", and that is a broad term that can include many on-site and off-site actions. Don't underestimate the value of great media; it's probably more valuable than the text, but without the text it's impotent.