Do many unique pages mean a better SERP position?
-
My site has about 50 pages, all of them unique 500-700 word articles. Almost every page ranks in positions 4-8 for its keyword in Google, Yahoo, and Bing.
I could add a lot of related, unique pages to the site, with about 100-200 words of content per page. They would all be unique, with unique descriptions and titles, and I could make 1,000+ pages this way.
Would you suggest I do this? Would it boost my SERP positions?
Do more pages mean a better SERP position?
-
More pages don't always mean a better SERP position; there is no black-and-white answer to this. Here is a helpful excerpt from a recent blog post by Dr. Pete:
A common example is when you take a page of content and spin it off across 100s of cities or topics, changing up the header and a few strategic keywords. In the old days, the worst that could happen is that these pages would be ignored. Post-Panda, you risk much more severe consequences, especially if those pages make up a large percentage of your overall content.
Another common scenario is deep product pages that only vary by a small piece of information, such as the color of the product or the size. Take a T-shirt site, for example – any given style could come in dozens of combinations of gender, color, and size. These pages are completely legitimate, from a user perspective, but once they multiply into the 1000s, they may look like low-value content to Google.
The Solution
Unfortunately, this is a case where you might have to bite the bullet and block these pages (such as with META NOINDEX). For the second scenario, I think that can be a decent bet. You might be better off focusing your ranking power on one product page for the T-shirt instead of every single variation. In the geo-keyword example, it’s a bit tougher, since you built those pages specifically to rank. If you’re facing large-scale filtering or devaluation, though, blocking those pages is better than the alternative. You may want to focus on just the most valuable pages and prune those near duplicates down to a few dozen instead of a few thousand. Alternatively, you’ve got to find a way to add content value, beyond just a few swapped-out keywords.
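To make that concrete, the META NOINDEX option Dr. Pete mentions is just a meta robots tag placed in the head of each thin page. A minimal sketch:

```html
<!-- In the <head> of each thin variation page you want kept out of the index -->
<meta name="robots" content="noindex, follow">
```

The noindex part keeps the page out of the index, while follow lets crawlers continue to pass link equity through the page's links.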
In your case, if you are able to make pages that are truly unique, valuable, and genuinely helpful to users, your overall traffic definitely stands to benefit. However, if I were in your position, I would focus on making more like 50-100 pages with better, higher-quality content rather than 1,000s of pages with just a little content each.
-
You could take that approach, but I wouldn't necessarily recommend it.
In most cases, 100-200 words is not enough to represent great content. It is possible, but I don't think it is realistic across 1,000 pages. Even dictionary pages offer more content.
That said, if your pages attract attention (i.e. you get users to those pages and they link to them or Like/Tweet/+1 them), those pages can help improve the SERP positions of your other pages through good anchor text.
I would urge a lot of caution here, because you risk lowering the overall quality of your site. I would prefer a 50-page site with good content over a 1,050-page site with 50 good pages and 1,000 low-quality ones. No one wants to wade through the bad pages to find the good ones.
Related Questions
-
Why does Google display the home page rather than a page which is better optimised to answer the query?
I have a page which (I believe) is well optimised for a specific keyword (URL, title tag, meta description, H1, etc.), yet Google chooses to display the home page instead of the page better suited to the search query. Why is Google doing this, and what can I do to stop it?
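For illustration, the on-page elements this question lists would look something like the sketch below, using a hypothetical keyword and page:

```html
<!-- Hypothetical example of the on-page elements named above -->
<!-- URL: https://www.example.com/blue-widgets/ -->
<head>
  <title>Blue Widgets | Example Store</title>
  <meta name="description" content="Compare our full range of blue widgets, with sizing and pricing details.">
</head>
<body>
  <h1>Blue Widgets</h1>
</body>
```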
Intermediate & Advanced SEO | muzzmoz -
How many times will Google read a page?
Hello! Do you know if Google reads a page more than once? We want to include a very robust menu that has a lot of links, so we were thinking about coding a very simple page that loads first, then immediately loading the other code that has all the links, on the theory that Google will only read the first version and won't read it the second time with all the links. Do you know if we would get penalized? I'm not sure if I got the idea across; let me know if I need to expand. Thanks,
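A minimal sketch of the setup this question describes: a slim initial page whose full menu is injected client-side after load. The endpoint name is hypothetical, and note that Google does render JavaScript, so there is no guarantee the injected links stay unseen:

```html
<!-- Slim page served initially; the heavy menu markup is added after load -->
<nav id="menu"><a href="/">Home</a></nav>
<script>
  // Hypothetical endpoint returning the full menu's HTML fragment
  window.addEventListener("load", function () {
    fetch("/menu-full.html")
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById("menu").innerHTML = html;
      });
  });
</script>
```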
Intermediate & Advanced SEO | alinaalvarez -
Date of page first indexed or age of a page?
Hi, does anyone know any ways or tools to find out when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date. Maybe this has changed since; I don't remember the plugin. Any recommendations for finding the age of a page (not the domain) for a website? This is for competitor research, not my own website. Cheers, Paul
Intermediate & Advanced SEO | MBASydney -
When you add 10,000 pages that have no real intention of ranking in the SERPs, should you use "follow,noindex" or disallow the whole directory through robots.txt? What is your opinion?
I just want a second opinion 🙂 The customer doesn't want to lose any internal link value by evaporating it through a large number of internal links. What would you do?
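For concreteness, a sketch of the two options being weighed here (the directory name is a placeholder):

```html
<!-- Option 1: meta robots on each of the 10,000 pages.
     Crawlers still fetch the pages and follow their links,
     but the pages stay out of the index. -->
<meta name="robots" content="noindex, follow">
```

```
# Option 2: robots.txt - the directory is never fetched at all,
# so any link value flowing into these pages goes nowhere.
User-agent: *
Disallow: /thin-pages/
```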
Intermediate & Advanced SEO | Zanox -
Is it better "nofollow" or "follow" links to external social pages?
Hello, I have four outbound links from my site home page taking users to join us on our social Network pages (Twitter, FB, YT and Google+). if you look at my site home page, you can find those 4 links as 4 large buttons on the right column of the page: http://www.virtualsheetmusic.com/ Here is my question: do you think it is better for me to add the rel="nofollow" directive to those 4 links or allow Google to follow? From a PR prospective, I am sure that would be better to apply the nofollow tag, but I would like Google to understand that we have a presence on those 4 social channels and to make clearly a correlation between our official website and our official social channels (and then to let Google understand that our social channels are legitimate and related to us), but I am afraid the nofollow directive could prevent that. What's the best move in this case? What do you suggest to do? Maybe the nofollow is irrelevant to allow Google to correlate our website to our legitimate social channels, but I am not sure about that. Any suggestions are very welcome. Thank you in advance!
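For reference, the two variants being compared, with one of the four buttons as a hypothetical example:

```html
<!-- Followed: passes PageRank through to the social profile -->
<a href="https://twitter.com/examplehandle">Follow us on Twitter</a>

<!-- Nofollowed: asks search engines not to pass PageRank through the link -->
<a href="https://twitter.com/examplehandle" rel="nofollow">Follow us on Twitter</a>
```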
Intermediate & Advanced SEO | fablau -
How Many Characters in an H1?
Hi, How long can the text within an H1 tag be? Should it ideally be 1-2 words, or can it be a full sentence? Or more?
Intermediate & Advanced SEO | mindflash -
Dynamically creating unique page titles on an enterprise site
Hi, I want to dynamically create unique page titles (and possibly meta descriptions too) on a 10k-page site. Many of the page titles are either duplicates or are missing. I have heard about the option of grabbing the page titles from a database, or possibly using the H1 as the page title. The site is solmelia.com (the website consists of mostly static pages). Any suggestions would be much appreciated. Best Regards,
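One way the database option could look, as a sketch in Python; the field names, template, and record are illustrative assumptions, not anything taken from solmelia.com:

```python
# Hypothetical sketch: compose unique titles and meta descriptions
# from structured database fields instead of hand-writing 10k of them.

def build_title(hotel: dict) -> str:
    """Compose a unique page title from record data."""
    return f"{hotel['name']} - Hotel in {hotel['city']}, {hotel['country']}"

def build_meta_description(hotel: dict) -> str:
    """Compose a unique meta description, trimmed to a safe length."""
    text = f"Book {hotel['name']} in {hotel['city']}. {hotel['summary']}"
    return text[:155]

record = {
    "name": "Hotel Example",
    "city": "Madrid",
    "country": "Spain",
    "summary": "City-centre rooms a short walk from the main sights.",
}
print(build_title(record))
print(build_meta_description(record))
```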
Intermediate & Advanced SEO | Melia -
Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos, they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist visibility, but as for the actual act of blocking the pages while getting the images picked up... is this possible? Thanks! Leona
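That separation is exactly what Google's documented crawler user agents allow; a sketch of a robots.txt along the lines Leona describes, using the path from the question:

```
# Block Google's web crawler from the thin photo pages...
User-agent: Googlebot
Disallow: /community/photos/

# ...while leaving the image crawler free to fetch the photos
User-agent: Googlebot-Image
Allow: /community/photos/
```

Google's crawlers follow the most specific user-agent group that matches them, so Googlebot-Image would use its own group here rather than the Googlebot one.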
Intermediate & Advanced SEO | HD_Leona