Automated XML Sitemap for a BIG site
-
Hi,
I would like to set up an automated sitemap for my site, but it has more than a million pages. It would need to be a sitemap index split across the different parts of the site (i.e. news, video), and I'll want a news sitemap and a video sitemap as well (of course). Does anyone have a recommended way of building this, and how often would you recommend updating it? For news and video, I would like updates to be pretty much immediate if possible, but the static pages don't need to be updated as often.
Thanks!
-
Another good reference:
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
that points to how to ping:
http://www.sitemaps.org/protocol.html#submit_ping
specific search engine examples:
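The ping mechanism in the sitemaps.org protocol linked above is just an HTTP GET to the engine's ping endpoint with your sitemap URL passed as a query parameter. A minimal Python sketch (the Google host below follows the protocol's examples and is an assumption to verify against each engine's own docs):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def build_ping_url(engine_base: str, sitemap_url: str) -> str:
    """Build a sitemaps.org-style ping URL: <engine>/ping?sitemap=<encoded sitemap URL>."""
    return engine_base + "/ping?" + urlencode({"sitemap": sitemap_url})

def ping(engine_base: str, sitemap_url: str) -> int:
    """Send the ping and return the HTTP status code (200 means it was received)."""
    with urlopen(build_ping_url(engine_base, sitemap_url)) as resp:
        return resp.status

# Building the URL for Google's ping endpoint (hypothetical example domain):
url = build_ping_url("http://www.google.com",
                     "http://www.example.com/sitemap-index.xml")
```

You would call `ping(...)` from whatever job regenerates a sitemap, immediately after writing the new file.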
-
Excellent. Thank you! How would you ping Google when a sitemap is updated?
-
Yes, split them out. You will need an index sitemap: a sitemap that links to other sitemaps.
https://support.google.com/webmasters/answer/75712?vid=1-635768989722115177-4024498483&rd=1
Any given sitemap can list up to 50,000 URLs and can be no larger than 50MB uncompressed.
https://support.google.com/webmasters/answer/35738?hl=en&vid=1-635768989722115177-4024498483
Therefore, you could have an index sitemap linking to up to 50,000 other sitemaps, and each of those sitemaps could in turn link to up to 50,000 URLs on your site.
If my math is right, that is a max of 2,500,000,000 URLs (50,000 sitemaps x 50,000 URLs each).
(Interesting side note: Google allows up to 500 index sitemaps, so 2,500,000,000 URLs x 500 = 1,250,000,000,000 URLs that you could submit to Google via sitemaps.)
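An index sitemap is itself a small XML file in the `sitemapindex` format from the protocol linked above. A minimal generation sketch (file names and the example domain are placeholders):

```python
from xml.sax.saxutils import escape

def build_sitemap_index(sitemap_urls, lastmods=None):
    """Render a sitemap index listing each child sitemap, with optional lastmod dates."""
    lastmods = lastmods or {}
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in sitemap_urls:
        lines.append('  <sitemap>')
        lines.append('    <loc>%s</loc>' % escape(url))  # loc is the only required child
        if url in lastmods:
            lines.append('    <lastmod>%s</lastmod>' % lastmods[url])
        lines.append('  </sitemap>')
    lines.append('</sitemapindex>')
    return "\n".join(lines)

xml = build_sitemap_index(
    ["http://www.example.com/newssitemap1.xml",
     "http://www.example.com/videositemap1.xml"],
    {"http://www.example.com/newssitemap1.xml": "2015-11-01"})
```

Each child sitemap then lists the actual page URLs in the ordinary `urlset` format.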
How you divide your content into sitemaps should mirror how you organize the pages on your site, so you are on the right track in breaking out the sitemaps by type of content. Depending on how big any one section of the site is, you may need several sitemaps for that type, e.g. articlesitemap1.xml, articlesitemap2.xml, etc. You get the idea.
It is recommended that you ping Google every time a page in a sitemap is updated so Google will come back and recrawl that sitemap. I don't run any sites with 1M URLs, but I do run several in the tens of thousands; we break them up by type and ping whenever we update a page in that group. You also need to consider your crawl budget with Google: it may not crawl all 1M pages in your sitemaps very often. So for a group of pages like articlesitemap1.xml, articlesitemap2.xml, articlesitemap3.xml, consider always adding your newest URLs to the most recently created sitemap (i.e. articlesitemap3.xml). That way you are generally pinging Google about an update to a single sitemap out of the group rather than all three.
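The "always add new URLs to the newest sitemap in the group" idea above can be sketched as a small allocation function; the `articlesitemap` prefix and in-memory count dict are illustrative assumptions, since a real site would track counts in its database:

```python
MAX_URLS_PER_SITEMAP = 50000  # per-file limit from the sitemaps protocol

def assign_to_sitemap(counts, prefix="articlesitemap"):
    """Return the sitemap file a new URL should go into, rolling over to a new
    file once the newest one hits the 50,000-URL cap. `counts` maps file name
    to its current URL count and is updated in place."""
    n = len(counts) or 1  # newest sitemap in the group, or the first one
    name = "%s%d.xml" % (prefix, n)
    if counts.get(name, 0) >= MAX_URLS_PER_SITEMAP:
        n += 1  # newest file is full: start the next one
        name = "%s%d.xml" % (prefix, n)
    counts[name] = counts.get(name, 0) + 1
    return name
```

Only the file returned here changed, so only that one sitemap needs a ping.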
My other thought is that, in addition to pinging Google only for the sitemaps you have updated, you should return a 304 server response for any sitemap that has not been updated. 304 means "not modified" since the last visit. One of your challenges will be your crawl budget with Google, so why make Googlebot recrawl a sitemap it has already crawled? You may want to consider returning a 304 for any URL on your site that has not changed since Google's last visit.
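The 304 decision boils down to comparing the client's If-Modified-Since request header against the resource's last-modified time. A framework-agnostic sketch (the function shape is hypothetical, not tied to any particular web server):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def conditional_status(if_modified_since: str, last_modified: datetime) -> int:
    """Return 304 if the resource is unchanged since the client's cached copy,
    else 200 (meaning the full sitemap body should be sent)."""
    if if_modified_since:
        try:
            cached = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200  # unparseable header: just serve the body
        if last_modified <= cached:
            return 304  # not modified since the client's last visit
    return 200
```

A 200 response would also carry a fresh Last-Modified header so the crawler can send If-Modified-Since on its next visit.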
All of that said, as I mentioned above, I have not worked at the scale of 1M+ pages and would defer to others on the best approach. The general thought process would be the same, though: figure out the best way to use your sitemaps to manage your crawl budget with Google. Small side note: if any of those 1M+ pages come from things like sorting parameters, duplicate content, or printer-friendly pages, you may want to noindex them, leave them out of the sitemaps, and not let Google crawl them in the first place.