How long should my website content be (max and min number of words)?
-
I saw a website that has been number 1 on Google for a long time, and its home page has 5,700 words, yet the results show it is not spam. So what would be the recommended word count for a home page?
-
There is no set word count that will help you rank; it is the quality and depth of the content you are providing that matters. Keep the search intent in mind when developing content. Be engaging and educational, answering the searcher's query. I have pages on my website that rank with as few as 300 words and as many as 1,000. So as you can see, it's not the word count; it's the quality.
Thanks,
Don Silvernail
-
There really isn't a right or wrong answer to this, but I will say that the content has to be long enough to answer the question that the page is addressing.
In some circumstances, this might mean a short page with just 200 words; other times it might be more. 5,000+ sounds a bit insane, as if they are trying to over-optimise, which would make me wonder about the phrase itself: is it low search volume?
There are so many variables that come into play with ranking pages that it would be impossible to say with any degree of certainty what is going on without actually seeing it.
From a usability view, this also sounds very excessive.
Make your pages clear and straight to the point, without any waffle. With 5,000+ words, I would be surprised if all of that was really needed.
-Andy
Related Questions
-
There is a copy of our website that is ranking. How can I let Google know our website is the authentic site?
I just found another copy of my old website and have no way to take it down. Unfortunately, it's ranking, and the developer didn't mark it as nofollow. (My boss hired someone to redevelop our website before I came on board, and the project was never finished.) Could this be hurting us? I looked to see if we were being penalized and couldn't find that we were. Also, ever since we migrated to a new domain name, our rankings have been tumbling. I've set up the redirects properly and tested to make sure they resolve correctly, and they do. I have no idea what is going on; we've virtually lost all rankings. Any help would be much appreciated.
On-Page Optimization | | npuffer790 -
Website server errors
I launched a new website at www.cheaptubes.com and had recovered my search engine rankings after the Penguin and Panda devastation. I was continuing to improve the site on September 26th by adding image caching and W3 Total Cache, but Moz Analytics is now saying I went from 288 medium issues to over 600, and I see the warning "45% of site pages served 302 redirects during the last crawl". I'm not sure how to fix this. I'm on WordPress using Yoast SEO, so all the redirects I set up are 301s, not 302s. I do have SSL; could it be HTTP vs. HTTPS? I've asked this question before, and two very nice people replied with suggestions which I tried to implement but couldn't; I got the WordPress white screen of death several times. They suggested the code below. Does anyone know how to implement this code, or some other way to reduce the errors I'm getting? I've asked this on Stack Overflow with no responses.
"You have a lot of HTTP and HTTPS issues, so you should fix these with a bit of .htaccess code, as below:
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
You also have some non-www to www issues. You can fix these in .htaccess at the same time:
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
You should find this fixes a lot of your issues. Also check in your WordPress general settings that the site is set to www.cheaptubes.com for both instances."
When I tried to do as they suggested, it gave me an internal server error. Please see the code below from my .htaccess, followed by the server error. I have taken it out for now.
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [R,L]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
</IfModule>
# END WordPress
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, webmaster@cheaptubes.com, and inform them of the time the error occurred and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Additionally, a 500 Internal Server Error was encountered while trying to use an ErrorDocument to handle the request.
On-Page Optimization | | cheaptubes0 -
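A minimal sketch of how those pieces might fit together, assuming Apache with mod_rewrite: the canonical-host redirects go first, both issued as explicit 301s and both pointing at HTTPS (the quoted advice redirects non-www to http://, which would bounce requests between the two rules), followed by the stock WordPress rules. The domain and the exact ordering here are assumptions, not a tested fix for this site:

```apache
# Hypothetical .htaccess sketch, not the asker's verified configuration.
<IfModule mod_rewrite.c>
RewriteEngine On

# Force HTTPS. [R] alone defaults to a 302; R=301 makes it permanent,
# which is what the Moz "served 302 redirects" warning is about.
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Force www, also to HTTPS so the two rules don't fight each other.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Standard WordPress front-controller rules.
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
```

Note that the mangled `<ifmodule mod_rewrite.c="">` opening tag in the pasted block is itself a plausible cause of the 500 error, since Apache would not parse that attribute syntax.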
When writing content for a website what is the optimal copy length?
My site is currently in the midst of a redesign, and I'd like us to compile some recommendations on the length of copy a page needs to rank well, but I can't seem to find any up-to-date articles on this. Does anyone have any suggestions, comments, or feedback? Thank you.
On-Page Optimization | | PorshaAndrea0 -
Duplicate content issues?
Our company consists of several smaller companies, some of which deal with very similar things. For instance, two of our companies resell accounts software, but only one provides after-sales support. Because of the number of different companies and websites we have, sometimes it would be easier to simply copy content from one site to another, optimised in the same manner, since in some instances we would want different websites to rank for the same keywords. I have been asked my opinion on the potential impact of this practice, and my initial response was that we should avoid it due to potential penalties. However, I thought I'd garner opinion from a wider audience before making any recommendations either way. What do people think? Thanks.
On-Page Optimization | | HBPGroup0 -
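Where the duplicated pages do not all need to rank independently, one commonly suggested mitigation (offered here as an illustration, not a verdict on this particular setup) is a cross-domain canonical tag on the copied page, pointing at the version that should rank. The URLs below are placeholders, not the asker's real domains:

```html
<!-- Placed in the <head> of the copied page on the second company's site.
     Both URLs are hypothetical examples. -->
<link rel="canonical" href="https://www.company-a.example/accounts-software/" />
```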
How to solve duplicate content issue???
I have 5 websites with different domain names; every website has the same content, the same pages, and the same website design. Kindly let me know how to solve this issue.
On-Page Optimization | | ross254sidney0 -
In my report of my website, it was indicated that I had 19 links/locations blocked by meta-robots. What does this mean, and how do I fix it? My website is a WordPress website.
On-Page Optimization | | cyaindc0 -
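The warning above generally means those pages carry a robots meta directive (such as noindex or nofollow), often toggled by an SEO plugin's per-page settings. As a rough stdlib-only sketch (the class and function names are invented for this example, not part of Moz or WordPress), here is one way to check a page's HTML for such a tag:

```python
# Hypothetical helper: find robots meta directives in a page's HTML.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots" ...> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())


def robots_directives(html):
    """Return the lowercased content of every robots meta tag in `html`."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives


page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(page))  # ['noindex, nofollow']
```

Running this against the 19 flagged URLs would show which directive (noindex, nofollow, or both) is actually blocking them.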
SEO for One Page Websites
Hi. Are there any SEO guidelines for "one-page websites"? I'm looking into the benefit they might have in combination with exact-match URLs. Many thanks in advance.
On-Page Optimization | | Partouter2 -
Max # of recommended links per page?
I've heard it said that Google may choose to stop following links after the first 100 on a page. The landing/category pages for my site's product catalog have earned quite a respectable PR and positioning in search results, and I'm currently paginating their product listings (about 200 products in a category) so that only a couple dozen products are shown on the first page, with links to "next page" and "previous page" being accomplished via query string (e.g. "?page=3"). An alternative option I have is to link to 100% of the contained products from the category's landing page (which would increase my on-page link count to ~300) and use CSS/JavaScript to let the user simulate browsing between pages on the client side. My goal is to see as many of my product pages indexed as possible. Is this done better using my current scheme (where Googlebot would have to navigate Landing Page -> Page 6 -> Deeply Buried Product Page) or with the alternative method above, where all the links are on a single page? Since my landing pages are currently treated pretty well by search engines, would that "trust" cause more links to be followed than might normally be? Thank you!
On-Page Optimization | | cadenzajon0
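As a rough companion to the question above, a stdlib-only sketch (names invented for this example) that counts the `<a href>` links in a page's HTML, making it easy to compare the paginated category page against the ~300-link single-page alternative:

```python
# Hypothetical helper: count the hyperlinks on a page.
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Collects the href of every <a> tag encountered."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:  # ignore bare anchors like <a name="top">
                self.links.append(href)


def count_links(html):
    """Return the number of <a> tags with an href attribute in `html`."""
    counter = LinkCounter()
    counter.feed(html)
    return len(counter.links)


page = '<body><a href="/p1">1</a><a href="/p2">2</a><a name="top">anchor</a></body>'
print(count_links(page))  # 2
```

Feeding each variant of the category page through `count_links` gives a concrete number to weigh against the oft-cited 100-link guideline mentioned in the question.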