Submitting sitemaps every 7 days
-
Question: if you had a site with more than 10 million pages (that you wanted indexed), and you considered each page to be equal in value, how would you submit sitemaps to Google?
Would you submit them all at once: 200 sitemaps of 50K URLs each, listed in a sitemap index?
Or
Would you submit them slowly? For example, would it be a good idea to submit 300,000 at a time (in 6 sitemaps of 50K URLs each), leave those 6 sitemaps available for Google to crawl for 7 days, then delete them and add 6 more with 300,000 new links, repeating the process until Google has crawled all the links? With this approach, you would never have more than 300,000 links available for Google to crawl in sitemaps at any one time.
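To make the idea concrete, here is a rough sketch of the rotation I have in mind (the urls.txt input file and the sitemap file names are just placeholders):

```python
# Rough sketch of the weekly rotation described above; urls.txt
# (one URL per line) and the sitemap file names are placeholders.
from xml.sax.saxutils import escape

BATCH_SIZE = 300_000  # links exposed to Google at any one time
PER_FILE = 50_000     # sitemap protocol limit of 50K URLs per file

def write_sitemap(path, urls):
    """Write a single sitemap file containing the given URLs."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

def publish_week(all_urls, week):
    """Replace last week's 6 sitemaps with the next batch of 300K links."""
    batch = all_urls[week * BATCH_SIZE:(week + 1) * BATCH_SIZE]
    for i in range(0, len(batch), PER_FILE):
        write_sitemap(f"sitemap-{i // PER_FILE + 1}.xml", batch[i:i + PER_FILE])

with open("urls.txt") as f:
    all_urls = [line.strip() for line in f if line.strip()]

publish_week(all_urls, week=0)  # increment the week counter every 7 days
```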
I read somewhere that eBay does something like this, though it could be bogus info.
Thanks
David
-
Thanks, Maurizio.
What I am really most concerned about is submitting hundreds of sitemaps to Google and giving them the impression that we might be spamming them.
This is why I am considering the second approach, where we would submit 6 sitemaps at a time, totalling no more than 300,000 links, rather than giving them 200-plus sitemaps with 10 million links.
I should have been clearer about my reason for asking. The main goal here is to not have Google freak out because we just gave them 10,000,000 links at one time.
-
Hi,
It's better to divide the sitemap into many files of 50K URLs max and list them in a sitemap index file, as you can read on this page:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35738
"To fix this issue, break your Sitemap into several smaller Sitemaps, and list these in a Sitemap index file. (More information about Sitemap index files.) Upload your Sitemaps and Sitemap index files to your site, then submit these files individually."
Ciao
Maurizio
Related Questions
-
Submitting URLs After New Search Console
Hi everyone, I wanted to see how people submit their URLs to Google and ensure they are all being indexed. I currently have an ecommerce site with 18,000 products. I have sitemaps set up, but noticed that the various product pages haven't started ranking yet. If I submit an individual URL through the new Google Search Console, I see the page ranking in a matter of minutes. Before the new Google Search Console, you could just ask Google to fetch/render an XML sitemap and ask it to crawl all the links. I don't see the same functionality in Google Search Console today and was wondering if there are any new techniques people could share. Thanks,
Intermediate & Advanced SEO | abiondo
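One programmatic option along these lines (a sketch, not from this thread): Google has historically accepted a simple HTTP ping when a sitemap is updated, though it is worth confirming the endpoint is still supported before relying on it. The sitemap URL is a placeholder.

```python
# Sketch: notify Google that a sitemap has been updated via the
# historical ping endpoint; the sitemap URL is a placeholder.
import urllib.parse
import urllib.request

sitemap_url = "https://www.example.com/sitemap.xml"
ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
with urllib.request.urlopen(ping_url) as resp:
    print(resp.status)  # 200 indicates the ping was received
```
-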
Should I submit an additional sitemap to speed up indexing
Hi all, Wondered if there was any wisdom on this that anyone could impart my way? I'm moving a set of pages from one area of the site to another - to bring them up the folder structure, and so they generally make more sense. Our URLs are very long in some cases, so this ought to help with some rationalisation there too. We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible. In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker (or to reaffirm that indexing from the initial parse)? The site is thousands of pages. Any benefits / disadvantages anyone could think of? Any thoughts very gratefully received.
Intermediate & Advanced SEO | ceecee
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi there, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page - Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/) - to focus mainly on the keyword 'Anaesthetist Jobs'. But I have noticed that there are existing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ https://wave.com.au/asa/ We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This then leads me to deal with the duplicate pages with either a canonical link (content manageable) or maybe, alternatively, adding a no-follow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use the sitemap to tell Google that these pages are of less value. What is the best approach? Should I add a canonical link to the landing pages pointing to the category page? Or should I use the sitemap approach, or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog Spider: 1,352 URLs
Cheers, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
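A quick way to narrow down a discrepancy like this (a sketch; both exported URL lists are hypothetical file names) is to diff the sitemap URLs against the crawl:

```python
# Sketch: diff sitemap URLs against crawled URLs to see where the
# counts diverge; both input files (one URL per line) are placeholders.
with open("sitemap_urls.txt") as f:
    sitemap_urls = {line.strip() for line in f if line.strip()}
with open("crawled_urls.txt") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

print("Crawled but not in sitemap:", len(crawled_urls - sitemap_urls))
print("In sitemap but not crawled:", len(sitemap_urls - crawled_urls))
```
-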
Sitemaps: HTML and/or XML?
Can someone explain sitemaps, and whether you need HTML and/or XML? I have a site with a few HTML sitemaps, one for products, one for categories. I have another site with just one XML sitemap for the entire site (which has a massive number of pages, 600K+). Should I be dividing the site with the massive page count into HTML sitemaps like my other site?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Does a sitemap override Google parameter handling?
This question might seem silly, but I'll ask anyway. We have an eCommerce site with a ton of duplicate content, mostly caused by faceted navigation. In researching ways to reduce the clutter, I've decided to use Google parameter handling to stop Googlebot from crawling pages with certain parameters, like: sort order, page #, etc... Now my question: If I set all of these parameters so that Googlebot doesn't crawl the grids, how will they ever find the individual product pages? We do upload a sitemap with all of the product pages. Does this solve my issue? Or, should I handle the duplicate content with noindex, follow tag? Or, is there an even better way? Thanks
Intermediate & Advanced SEO | rhoadesjohn
-
Broken sitemaps vs no sitemaps at all?
The site I am working on is enormous. We have 71 sitemap files, all linked to from a sitemap index file. The sitemaps are not up to par with "best practices" yet, and realistically it may be another month or so until we get them cleaned up. I'm wondering if, for the time being, we should just remove the sitemaps from Webmaster Tools altogether. They are currently "broken", and I know that sitemaps are not mandatory. Perhaps they're doing more harm than good at this point? According to Webmaster Tools, there are 8,398,082 "warnings" associated with the sitemap, many of which seem to be related to URLs being linked to that are blocked by robots.txt. I was thinking that I could remove them and then keep a close eye on the crawl errors/index status to see if anything changes. Is there any reason why I shouldn't remove these from Webmaster Tools until we get the sitemaps up to par with best practices?
Intermediate & Advanced SEO | edmundsseo
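Since many of those warnings point at URLs blocked by robots.txt, one way to verify that diagnosis before deciding (a sketch; the robots.txt location and URL list file are placeholders) is to test each sitemap URL against robots.txt:

```python
# Sketch: list sitemap URLs that robots.txt blocks for Googlebot;
# the robots.txt location and input file are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()

with open("sitemap_urls.txt") as f:
    for line in f:
        url = line.strip()
        if url and not rp.can_fetch("Googlebot", url):
            print("Blocked:", url)
```
-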
Getting Google to index MORE per day than it does, not necessarily with greater frequency
Hi, the Googlebot seems to come around healthily; every day we see new pages that we wrote the week before get ranked. However, if we are adding 12-15 new products/blog entries/content bits each day, only about 2-3 ever get indexed per day, so after a few weeks this builds up to quite a time lag. Is there any way to help step up the number of new pages that get indexed every day? It really will only take 2 or 3 each day, no more, which seems strange. We're fairly new, around 6 months creating content, but the domain name is 18 months old. Will this simply improve over time, or can something be done to help Google index those pages? We don't mind if the 15 we do on Monday all get indexed the following Monday, for example.
Intermediate & Advanced SEO | xoffie