Can too many pages hurt crawling and ranking?
-
Hi,
I work for a local yellow pages site in Belgium. Over the last few months we introduced a successful technique to boost SEO traffic: we created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that lets Google find these pages through crawling, XML sitemaps, and so on. All signals (traffic, indexation of the XML sitemaps, rankings, etc.) are positive. So far so good.
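For context, a URL set this size has to be split across multiple sitemap files in any case, since the sitemaps.org protocol allows at most 50,000 URLs per file, with a sitemap index tying the files together. A simplified sketch of that split in Python (the domain and filenames are placeholders, not our real setup):

```python
# Simplified sketch: split a large URL set into 50k-URL sitemap files plus an
# index. Domain and filenames are placeholders, not the real setup.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS_PER_FILE = 50_000  # cap from the sitemaps.org protocol

def write_sitemaps(urls, host="https://www.example.be/"):
    """Write sitemap-N.xml files and a sitemap-index.xml that references them."""
    filenames = []
    for i in range(0, len(urls), MAX_URLS_PER_FILE):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for url in urls[i:i + MAX_URLS_PER_FILE]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
        name = f"sitemap-{i // MAX_URLS_PER_FILE}.xml"
        ET.ElementTree(urlset).write(name, encoding="utf-8", xml_declaration=True)
        filenames.append(name)

    index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for name in filenames:
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = host + name
    ET.ElementTree(index).write("sitemap-index.xml", encoding="utf-8",
                                xml_declaration=True)

# e.g. 150k keyword pages -> 3 sitemap files + 1 index
write_sitemaps([f"https://www.example.be/listing/{n}" for n in range(150_000)])
```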
We are able to quickly build many more unique pages, and I wonder how Google will react to this kind of large-scale operation: can it hurt crawling and ranking if Google notices big volumes of new (unique) content?
Please advise.
-
Hi,
I don't believe having too many pages will hurt crawling and ranking. If anything, having a lot of pages gives crawl bots more to crawl, and when someone searches for keywords related to your pages, your pages might show up.
The only two problems I see with having too many pages are:
-
Are all of these pages actually unique? With that many pages it will be hard to manage them and keep track of whether each one is unique. If the pages aren't unique and you end up with a lot of duplicates, that will hurt your rankings.
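One way to keep track of uniqueness at that scale is to fingerprint each page's main text and flag collisions. A rough sketch (the extraction step and example URLs are hypothetical); this catches exact duplicates, while near-duplicates would need shingling or MinHash on top:

```python
# Sketch: flag exact-duplicate page bodies by hashing normalized text.
# The input dict stands in for whatever pulls the unique content block
# out of your page templates.
import hashlib
from collections import defaultdict

def fingerprint(text: str) -> str:
    """Normalize whitespace and case, then hash, so trivial diffs don't mask dupes."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> list:
    """pages maps URL -> extracted main text; returns groups of duplicate URLs."""
    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

dupes = find_duplicates({
    "/plumbers/antwerp": "Plumbers in Antwerp ...",
    "/plumbers/ghent": "Plumbers in Ghent ...",
    "/plumbers/antwerpen": "Plumbers in Antwerp ...",  # same body, different URL
})
print(dupes)  # [['/plumbers/antwerp', '/plumbers/antwerpen']]
```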
-
The second problem is interlinking: are you linking all of your pages together, and can bots actually crawl them all? You will need a good internal linking system that directs bots to the different pages so they get crawled. As mentioned above, a large number of pages is difficult to manage, so can you really interlink all of them? One partial solution is submitting a sitemap, but I am not sure everything will be indexed; I had a problem where Google only indexed 4% of my sitemap and I still can't find a solution.
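One way to sanity-check the internal linking is to breadth-first crawl from the homepage and compare what you reach against the sitemap: URLs that are in the sitemap but never reached by following links are the ones bots are likely to miss. A rough sketch (domain and input file are placeholders; add rate limiting before pointing this at a live site):

```python
# Sketch: breadth-first crawl of internal links, then report sitemap URLs
# that were never reached. Domain and input file are placeholders.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://www.example.be/"

def internal_links(url: str) -> set:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        return set()
    soup = BeautifulSoup(html, "html.parser")
    links = (urljoin(url, a["href"]) for a in soup.find_all("a", href=True))
    return {l.split("#")[0] for l in links
            if urlparse(l).netloc == urlparse(START).netloc}

def crawl(start: str, limit: int = 5000) -> set:
    seen, queue = {start}, [start]
    while queue and len(seen) < limit:
        for link in internal_links(queue.pop(0)):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

reachable = crawl(START)
sitemap_urls = set(open("sitemap_urls.txt").read().split())  # exported beforehand
print("in sitemap but never linked to:", sitemap_urls - reachable)
```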
Hope this helps!
-
This is really just speculation...
It sounds like you're solid on the on-page and site-architecture side. I would expect crawling and indexation to slow down, though, if your offsite signals don't keep up. By this I mean that Google might see you're doing everything right on your end, but notice over time that you're not creating content many people care to link to or share, and eventually stop spending crawl resources on you.
Related Questions
-
How effective are 301 redirects in passing PageRank?
I have a blog which is ranking well for certain terms, and I would like to repurpose it to better explain the terms it is ranking for, including updating the URL to match the new term the post will be about. The plan is to 301 redirect the old URL to the new one. In the past, I've done this with other pages and actually lost much of the ranking the original URL had earned. What is your take on this? Maybe repurpose the post but keep the original URL, just to be on the safe side? Thanks
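A quick way to verify the mechanics before repurposing is to confirm the old URL answers with a single clean 301 straight to the new one, since multi-hop redirect chains are a common suspect when rankings don't carry over. A rough sketch (the URLs are placeholders, not the actual blog):

```python
# Sketch: follow redirects hop by hop and confirm one clean 301 to the new URL.
# URLs are placeholders, not from the original question.
from urllib.parse import urljoin
import requests

def redirect_chain(url: str, max_hops: int = 10) -> list:
    """Return [(status, url), ...] until a non-redirect response is reached."""
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        hops.append((resp.status_code, url))
        if resp.status_code not in (301, 302, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])
    return hops

chain = redirect_chain("https://example.com/blog/old-term")
print(chain)  # ideal: [(301, '.../old-term'), (200, '.../new-term')]
```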
Technical SEO | CitimarineMoz
-
Can spiders crawl jQuery Fancybox scripts?
Hi everyone - I'm not a technical person at all. I have some content that is hidden until a user clicks "learn more", at which point it is displayed via a jQuery Fancybox script. The content behind the "learn more" JavaScript is important and I need it to be crawled by search engine spiders. Does anyone know if this script will cause a problem?
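A quick test is to fetch the raw HTML the way a crawler first sees it and check whether the hidden content is actually in the source. A Fancybox that merely shows and hides markup already in the page is generally fine; content fetched by JavaScript only after the click is the risky case. A minimal sketch (URL and phrase are placeholders):

```python
# Sketch: check whether the hidden content exists in the raw HTML, or whether
# it only appears after JavaScript runs. URL and phrase are placeholders.
import requests

page_url = "https://example.com/page-with-fancybox"
needle = "a distinctive phrase from the hidden content"

html = requests.get(page_url, timeout=10).text
print("present in raw HTML:", needle.lower() in html.lower())
```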
Technical SEO | Santaur
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed. If, later down the road, I want to use one of those URLs again, can I just remove the 410 status code, put new content on the page, and have it indexed again?
Technical SEO | WebServiceConsulting.com
-
Product Pages Outranking Category Pages
Hi, we are noticing an issue where some product pages are outranking the relevant category pages for certain keywords. As a made-up example, a "heavy duty widgets" product page might rank for the keyword phrase "heavy duty widgets" instead of our Heavy Duty Widgets category page appearing in the SERPs. We've noticed this happening primarily where the name of the product page is an at least partial match for the keyword phrase we want the category page to rank for. However, we've also found isolated cases where the keyword points to a completely irrelevant page instead of the relevant category page. Has anyone encountered a similar issue, or have any ideas as to what may cause this? Let me know if more clarification of the question is needed. Thanks!
Technical SEO | ShawnHerrick
-
404 error - but I can't find any broken links on the referrer pages
Hi, my crawl has flagged eight 404 errors on a client's site. In my CSV download of the crawl, I checked the source code of the "referrer" pages, but I can't find the links to the 404 pages anywhere. Could there be another reason for getting 404 errors? Thanks for your help. Katharine.
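One possibility is that the link is built by a template or script rather than appearing literally in the source, so scanning the referrer's parsed anchors can settle it; 404 reports can also come from stale sitemaps, old external backlinks, or JavaScript-built links. A rough sketch (URLs are placeholders, not the client's site):

```python
# Sketch: scan a referrer page's anchors for any href resolving to the 404 URL.
# URLs are placeholders, not from the original question.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

referrer = "https://client-site.example/some-page"
missing = "https://client-site.example/broken-page"

soup = BeautifulSoup(requests.get(referrer, timeout=10).text, "html.parser")
matches = [a["href"] for a in soup.find_all("a", href=True)
           if urljoin(referrer, a["href"]).rstrip("/") == missing.rstrip("/")]
print(matches or "no anchor found; check sitemaps, JS-built links, old backlinks")
```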
Technical SEO | PooleyK
-
Can dynamically translated pages hurt a site?
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing, about 50 of which are translated static pages with unique URLs. I have had no problems with duplicate content or translation quality here, since all of those pages were translated manually. We have been looking at software that can dynamically translate the complete site into a handful of languages, say about five. My concern is that these pages would be produced dynamically, and Google may take issue with that as well as with the sudden influx of new URLs: we could be looking at an increase of 5,000 new URLs, which usually triggers an alarm. My feeling is that this could risk the stability of a site we have worked so hard on, and that we should maybe stick with the already-translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, or at least triggering a review period. These days it is hard to know what could get you in trouble, and my gut says keep it simple, leave it as is, and don't shake things up. Am I being overly concerned? I would love to hear from others who have tried similar changes, and also from those who have held off due to similar fears. Thanks
Technical SEO | nomad-202323
-
How to remove 4xx client errors, "too many links on a single page" warnings, and canonical notices
Firstly, I am getting around 12 errors in the 4xx client error category. The description says this is either a bad or a broken link. How can I repair this? Secondly, I am getting lots of warnings about too many links on a single page; I want to know how to tackle this. Finally, I don't understand the basics of canonical notices; I have around 12 notices of this kind which I want to remove too. Please help me out in this regard. Thank you beforehand. Amit Ganguly http://aamthoughts.blogspot.com - Sustainable Sphere
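As a starting point for the canonical notices and link warnings, it can help to see what a page actually declares. A rough sketch that reports a page's canonical tag and anchor count, two of the items Moz's crawl flags (using the blog URL from the question; this is an illustration, not a fix):

```python
# Sketch: report a page's declared canonical URL and its on-page anchor count.
import requests
from bs4 import BeautifulSoup

url = "http://aamthoughts.blogspot.com/"  # blog URL from the question
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

canonical = next((link for link in soup.find_all("link")
                  if "canonical" in (link.get("rel") or [])), None)
print("canonical:", canonical["href"] if canonical else "none declared")
print("anchors on page:", len(soup.find_all("a", href=True)))
```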
Technical SEO | amit.ganguly
-
Does Google pass the link juice a page receives if a URL parameter specifies content and the crawl setting in Webmaster Tools is set to "No"?
The page in question receives a lot of quality traffic but is only relevant to a small percentage of my users. I want to keep the link juice the page receives, but I do not want it to appear in the SERPs.
Technical SEO | surveygizmo