Can adding thousands of new indexable URLs to my site at once be a problem?
-
Hi everyone,
I am currently working on a project that will quickly add thousands of new indexable URLs to my site. For context, the site currently has over a million indexable pages. Is there any danger of adding a few thousand URLs at once to the site? Could it potentially affect crawlability/SEO/other pages?
Thank you!
-
I agree - thanks for the response!
-
With that many pages, I assume you're already carefully managing structure and crawlability. I wouldn't expect a few thousand new pages on a site of over a million to make much difference; they would amount to well under 1% of your total site. As long as they're linked properly and included correctly in your sitemap(s), you shouldn't have much of an issue. I certainly wouldn't suggest drip-feeding them in.
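For a launch like this, a cheap pre-flight check is to confirm that every new URL actually appears in the sitemap files you're about to submit. A minimal sketch (the sitemap contents and URLs here are hypothetical; in practice you'd read your real sitemap files):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(xml_text):
    """Return the set of <loc> values found in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

# A tiny stand-in sitemap; real code would load each sitemap file you plan to submit.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/new-page-1</loc></url>
  <url><loc>https://example.com/new-page-2</loc></url>
</urlset>"""

new_urls = {"https://example.com/new-page-1", "https://example.com/new-page-2"}
missing = new_urls - urls_in_sitemap(sitemap_xml)
print(sorted(missing))  # an empty list means every new URL is covered
```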
Related Questions
-
Can you advise how to stop one URL from referring to another URL on my website?
How do I stop a redirect from one URL to another that is caused by a 404 error? Referred URL: https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/%5Bnull%20id=43484%5D Referring URL: https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/
Technical SEO | | Nichole.wynter20200 -
How to handle new pages/posts with a sitemap
Hi, I've created and submitted to Google (through Webmaster Tools) a sitemap generated with a WordPress XML sitemap plugin. Now I've created new pages and posts. My question is: do I have to recreate and resubmit another sitemap to Google, or can I just submit the new pages and posts with the 'Fetch as Google' option? Thanks so much in advance.
Technical SEO | | tourtravel0 -
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high-quality content that we plan to index and use to drive our traffic, but we also have many pages that are just images, what is the best way to get those images indexed? We want to noindex all the image-only pages because they are thin content. Can you noindex,follow a page but still index the images on that page? Please explain how to go about this.
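One commonly suggested setup, sketched here with placeholder URLs (Google's handling of images on noindexed pages can vary, so treat this as an approach to test rather than a guarantee): noindex,follow the thin HTML pages, leave the image files themselves unblocked, and list the images in an image sitemap so they can be discovered independently of the pages:

```html
<!-- On each thin image-only page: keep the HTML out of the index, still pass link equity -->
<meta name="robots" content="noindex, follow">
```

```xml
<!-- Image sitemap entry (placeholder URLs) -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/gallery/photo-page-1</loc>
    <image:image>
      <image:loc>https://example.com/images/photo-1.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```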
Technical SEO | | WebServiceConsulting.com0 -
How to find original URLs after hosting company added canonical URLs, URL rewrites, and duplicate content
We recently changed hosting companies for our ecommerce website. The new hosting company added functionality that caused duplicate content and/or mirrored pages to appear in the search engines. To fix this, the hosting company created both canonical URLs and URL rewrites. Now we have page A (the original page, with all the link juice) and page B (the new page, with no link juice or SEO value). Both pages have the same content, with different URLs. I understand that a canonical URL is the way to tell the search engines which page is preferred in cases of duplicate or mirrored content: it tells the search engine that page B is a copy of page A, and that page A is the one to index. The problem we now face is that the hosting company made page A a copy of page B, rather than the other way around, even though page A is the original page with the SEO value and link juice while page B is the new page with no value. As a result, the search engines are now prioritizing the newly created page over the original one. I believe the solution is to reverse this and make page B (the new page) a copy of page A (the original page), then set the original URL as the canonical URL for the duplicate pages. The problem is that, with all the rewrites and changes in functionality, I no longer know which URLs carry the backlinks that created this SEO value. If I can find the backlinks to the original page, I can recover the original address of the original pages. My question is: how can I search for backlinks on the web in such a way that I can figure out the URL they all point to, so I can make that URL the canonical URL for all the new, duplicate pages?
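Once the original URL (page A) is identified, for example via a backlink index or the links report in Google Webmaster Tools, the fix described above comes down to a single tag on each duplicate page. A minimal sketch with placeholder URLs:

```html
<!-- On page B (the duplicate); href points at page A, the original (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/original-page-a">
```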
Technical SEO | | CABLES0 -
Google indexing fewer URLs than contained in my sitemap.xml
My sitemap.xml contains 3,821 URLs but Google (Webmaster Tools) indexes only 1,544 of them. What may be the cause? There is no technical problem that I can find. Why does Google index fewer URLs than my sitemap.xml contains?
Technical SEO | | Juist0 -
Google is indexing proxy (mirror) site.
We moved the site to a new host. Previously the site used GoDaddy Windows hosting with white domain masking. After moving the site we simply mirrored it. We have to use the mirrored domain for PPC campaigns because the mirrored site contains the true BRAND name, conversion is better with that domain, and all trademarked keywords are approved for the mirrored domain.
Robots.txt:
User-agent: *
Host: www.hermitagejewelers.com
Disallow: /Bin
Disallow: /css
www.hermitagejewelers.com is the main domain. The mirror site is www.ermitagejewelers.com (without the "H" at the beginning). Most of the keywords are now picked up by the mirror site. I have not noticed any major changes in ranking except that it ranks for the mirror site. We updated the sitemap. The website is designed very poorly (not by us). Also, we submitted a change-of-address request from ermitagejewelers to hermitagejewelers in Webmaster Tools. Please let me know any advice to fix this problem. Thank you.
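Two notes on that robots.txt, offered as a sketch rather than a definitive fix: the Host: directive is recognized by Yandex only, so Google ignores it; and a Disallow only blocks crawling, it does not reliably remove URLs that are already indexed. A stronger deindexing signal is to serve a noindex header from the mirror host only (the pages must remain crawlable for the header to be seen), for example in Apache:

```apache
# In the config / .htaccess of www.ermitagejewelers.com (the mirror) only:
# ask engines to drop these copies while leaving the main domain untouched
Header set X-Robots-Tag "noindex"
```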
Technical SEO | | MaxRuso1 -
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https:// versions without losing the link juice that is going to the https:// pages?
I can't 301 https:// to http:// across the board, since some form pages need to stay on https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.
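One common pattern for this situation (an Apache mod_rewrite sketch; the /forms/ path is a placeholder for wherever the secure pages live) is a conditional 301 from https to http for everything except the pages that must stay secure, so the redirect consolidates link equity without touching each page individually:

```apache
RewriteEngine On
# Send HTTPS traffic back to HTTP with a 301 so link equity consolidates,
# but leave the secure form section alone ("/forms/" is a placeholder path)
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/forms/
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```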
Technical SEO | | fthead90