Community Discussion - Do you think increasing word count helps content rank better?
-
In the online marketing community, there is a widespread belief that long-form content ranks better.
In today's YouMoz post, Ryan Purthill shares how his research indicated 1,125 to be a magic number of sorts: The closer a post got to this word count, the better it ranked. Diminishing returns, however, were seen once a post exceeded 1,125 words.
- Does this jibe with your own data and experiences?
- Do you think increasing word count helps content rank better in general?
- What about for specific industries and types of content?
Let's discuss!
-
I have back-correlated data against performance, looking in particular at content length, keyword and phrase density, and prominence (both overall and within different page elements) against SERP rank and page performance (engagement or conversion, depending on whatever the particular critical measure might be). There does appear to be a minimum length of non-boilerplate text necessary to achieve both, although the optimal length of content for inspection and semantic determination does not appear to be the same as for page outcomes, which should not be surprising.
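As a rough illustration of that kind of back-correlation, here is a minimal sketch in Python. The word-count/rank pairs are entirely made up for the example, not the commenter's data, and a real analysis would control for many more variables:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical sample: (word count, SERP rank) pairs for a handful of pages.
pages = [(450, 9), (700, 7), (1100, 3), (1500, 2), (2400, 4)]
words = [w for w, _ in pages]
ranks = [r for _, r in pages]

# A negative coefficient means longer pages tend to sit higher
# (a lower rank number is a better position).
print(round(pearson(words, ranks), 3))
```

In practice you would use something like Spearman rank correlation (less sensitive to outliers) and segment by page type before trusting any single number.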
What I have also found is that while it's possible for content to be too short, it is equally possible for it to be too verbose, particularly if the content begins to extend into a wide variety of topics and subtopics. My guess is that search engines have a harder time deciding what the message of a page is when it turns into an encyclopedia.
-
Numbers, numbers, numbers.
Simply put, no. You can rank an article on the 1st page for a highly sought-after term if it says something that perfectly answers a question. It isn't the length of the text, but the content therein.
One example always given is "I F***ing Love Science". They don't need to write 2,000-word articles in order to rank well. Strength is partly in numbers here. They can rank short articles that contain a video with seemingly little work, because Google knows just how accurately they will answer a question.
As EGOL also mentioned, there are also lots of studies into the use of the correct keywords and supporting content. Then look at E-A-T (Expertise, Authoritativeness & Trustworthiness) and YMYL (Your Money or Your Life): simply put, are you trustworthy enough to believe what is said, and are you enough of an expert to be making statements about the subject?
I am loving content marketing at the moment as there is a lot going on, and seeing some fantastic wins!
-Andy
-
I don't believe it's the length or the number of words so much as how much more information those extra words bring to the table. More words isn't better, but more information is.
-
"Long content, most of the time, is really well written. The creator is looking to engage visitors and puts a lot of effort into that."
From my experience, this is really the correct answer.
We have a target minimum of 1500 words per landing page for our content team but of course, if they get to 1100 words and are genuinely stuck for quality content from there, 1100 is perfectly ok.
In the early days we started out with a minimum of 500 words and after noticing positive results within days of content going up we started increasing that and measuring the response in terms of rankings and user interaction. Each increment (800, 1000, 1500) saw consistent improvement over the previous one but 1500 words did seem to be the tipping point; beyond that there were significantly diminishing returns.
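An incremental test like the one described, bucketing pages by word-count tier and comparing average rank per tier, can be sketched roughly as follows. The page data below is invented for illustration and just happens to echo the "tipping point around 1,500 words" pattern the commenter reports:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export from a rank tracker: (word_count, average_rank) per page.
pages = [(520, 11), (640, 9), (830, 8), (980, 6),
         (1200, 5), (1480, 3), (1700, 3), (2300, 4)]

# Word-count tiers roughly matching the increments mentioned above.
tiers = [(0, 800), (800, 1200), (1200, 1600), (1600, 10_000)]

by_tier = defaultdict(list)
for wc, rank in pages:
    for lo, hi in tiers:
        if lo <= wc < hi:
            by_tier[(lo, hi)].append(rank)

for (lo, hi), ranks in sorted(by_tier.items()):
    print(f"{lo:>5}-{hi:<5} words: mean rank {mean(ranks):.1f} (n={len(ranks)})")
```

With real data you would also want far larger samples per tier and a control for topic and link profile; otherwise the tier averages mostly measure which topics you happened to write long pieces about.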
As you mentioned, longer content has typically had far more effort put into it, so really, what the Moz study has measured is a correlation between quality-plus-word-count and improved rankings.
-
I don't think there is a magic number at all when it comes to content length. Writing an extra 500 words just to fluff up an article or SEO page isn't going to help anything or anyone. The ultimate goal of search engines is to provide the best results for a query, therefore the ultimate goal of content writing should be to solve a problem, provide an answer, and so on. If you can do that in 200 words, great; if your product/service is complex, requires much more education, and takes 2,000 words, also great.
We should write with the user in mind, get into the mindset of someone searching for our offerings and think about what we'd want to read, no matter how long. I don't care how great the content is, if I'm searching for a new pair of running shoes, I'm not reading 1,125 words, and if that's all I see when I land there, I'm bouncing.
-
Thanks for the info. If I look to the southeast from my home or my office the first major ridge of the Appalachians rises out of the Earth and occupies a spectacular 180 degrees of my view. If I cross a few of those ridges to the south the way people talk changes and words seldom heard elsewhere are common in the spoken language. I worked in that area for about twenty years and loved the words, the cadence and the tone that most people used.
-
I can't claim I know the origins of the word. I use it here only as a synonym of "cling on to". My name is a bit more mundane, in that it was a street I used to work on when I created my SEO accounts.
-
Glom?? A word I used to hear in a previous life.
Now, maybe I understand the name "Highland" ?
From what I know, glom is a word from the Scots dialect, used here in the States by people in parts of New England and the Appalachians.
-
I think that this is going to fall into the same category as some other ideas about "optimal content". Back in the day there was "keyword density". Then came "latent semantic indexing", where your words had to relate to other words on your page. And now we have a "magic" word count (don't get me wrong, it's an interesting stat).
I had to spend a LONG time deprogramming people from these ideas because people glom onto them as the limit lines of SEO. They're dangerous in the sense that if someone thinks the line is "10% keyword density" or "1,125 words", people will start measuring and making sure that their page on "blue widgets" has exactly 10% KD and 1,125 words so Google will love it (who cares if it's crap nobody will read?).
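For what it's worth, the "measuring stick" being warned against here is trivial to build, which is part of why people fixate on it. A minimal keyword-density calculator might look like the sketch below; the tokenizing is deliberately naive, and the sample text is a made-up example of exactly the kind of over-optimized copy the commenter describes:

```python
import re

def keyword_density(text, phrase):
    """Share of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count non-overlapping-agnostic window matches of the phrase.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return hits * n / len(words) if words else 0.0

sample = "Blue widgets are great. Our blue widgets beat other blue widgets."
print(round(keyword_density(sample, "blue widgets"), 2))  # → 0.55
```

That 55% figure is absurd on its face, yet chasing any fixed target with a tool like this produces the same unreadable result, just more slowly.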
My advice on content is that it should read naturally. Don't pull out any measuring sticks. Stop with the SEO hyper-focus. If it doesn't read like something you would tell a personal friend it's probably not worth writing. Or ranking...
-
For a while I was seeing Google respond to certain search queries with "in-depth article" options. They experimented with a small section that was similar to by-page/by-author results. It doesn't seem to turn up as much anymore, but I did find this:
https://support.google.com/webmasters/answer/3280182?hl=en
Is this still happening?
-
I'm with EGOL on this: "Don't underestimate the value of great media, probably more valuable than the text but without the text it's impotent." I'd add promotion efforts to that statement. Even great, long content needs a bit of promotion to get the attention it deserves.
-
I just read Ryan's article a second time and reflected on my beliefs as described above.
They looked at "related searches" to see if there were topics that would beef up their articles. It is possible that adding information about those topics made their articles more relevant to Google because they "covered topics that people are asking about". I wonder if the "hernia" and "gallstones" articles had that type of improvement. That could explain the jump in rankings: a sudden increase in the relevance of the article to the query.
I've always believed that "a diversity of important query words" is key to rankings. Ryan's study points to where Google itself recommends those important query words. I really like how he did this and plan to use it when I revise old articles or write new ones.
I have always believed that "media beyond text" is important. My thinking was that photos, video, and tabular data were where to get this. However, his "Q & A" and callouts with "prevalence information" might have the same effect because they give the reader "something special" to consider while reading the article. It is possible that the article already has such information embedded within it, but calling it out in a distinct format could be a "refreshing change" or "more interesting" for the reader.
I think that his article was one of the most important articles that has been on the Moz Blog. Reading it a second time has probably been one of the best investments of my time in the past year. Thank you Ryan.
-
I feel it does, if only to get away from just link stuffing. Having quality content surrounding your anchor text in an informative and relevant way always performs better, I feel. I agree with the comment above: 1,000+ words does always seem to perform well.
I try to structure things at around one link, or anchor text, per decent paragraph of quality information.
-
Hi,
First of all, we should set a limit: how long is too long? Personally, I'd put that limit above 2,000 words.
It's a known fact that Google loves long content. But we should also point out that long content, most of the time, is really well written. The creator is looking to engage visitors and puts a lot of effort into that. That's why long content also ranks high.
In my experience, articles near or just above 1,000 words have always performed well, even better than longer articles.
Also, I recommend to my writers and colleagues that they split a piece into several articles when it runs very long. That helps increase interaction with visitors and keeps them moving through other pages.
GR
-
I don't think we can look at a word count in a vacuum, not only because there are so many contributing factors, but because there are likely variables that affected this "magic number" (a concept that I feel is bunk) that weren't measured, considered, or weighed in any way.
Most importantly, I don't think such a figure has any use to a specific person, business, site, etc. It's interesting data, but it says nothing about what any individual should do or expect. In my experience, my readers want anywhere between 300 - 2000 words; but again, this means practically nothing. There are different types of posts, subjects, content-uses, audiences based on these, and many other variables.
I think that, if one's data shows that their posts aren't doing well, word count is one area in which it may be worth exploring different solutions. But there are dozens of more vital and useful data points out there and readily available.
-
I don't believe in "magic numbers" and I don't believe that "walls of text" have any magic either.
I do believe that Google enjoys substantive content that is understandably written, addresses a diversity of important query words for its topic, engages visitors, includes media beyond text, and is on a website that is in good technical health. The most important part of that is "engaging visitors", and that is a broad term that can include many on-site and off-site actions. Don't underestimate the value of great media, probably more valuable than the text but without the text it's impotent.