Automated XML Sitemap for a BIG site
-
Hi,
I would like to set up an automated sitemap for my site, but it has more than a million pages. It would need to be a sitemap index split across the different parts of the site (i.e. news, video), and I'll want a news sitemap and video sitemap as well (of course). Does anyone have a recommended way of building this, and how often would you recommend updating it? For news and video, I would like updates to be pretty immediate if possible, but the static pages don't need to be updated as often.
Thanks!
-
Another good reference:
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
which points to how to ping:
http://www.sitemaps.org/protocol.html#submit_ping
and includes specific search engine examples.
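For reference, the sitemaps.org ping format in the link above boils down to an HTTP GET with the URL-encoded sitemap location as a query parameter. A minimal sketch in Python (the example.com sitemap URL is a placeholder, and you should check each engine's current docs for its ping endpoint):

```python
from urllib.parse import quote
from urllib.request import urlopen

def build_ping_url(engine_ping_base, sitemap_url):
    """Build a sitemap ping URL per sitemaps.org (#submit_ping):
    <searchengine_ping_URL>?sitemap=<url-encoded sitemap location>."""
    return engine_ping_base + "?sitemap=" + quote(sitemap_url, safe="")

# Placeholder sitemap URL; Google's ping endpoint as documented at the time.
ping = build_ping_url("https://www.google.com/ping",
                      "https://example.com/sitemap-index.xml")

# To actually send the ping (a 200 response means it was received):
# urlopen(ping)
```

You'd call this once per updated sitemap file, right after regenerating it.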
-
Excellent. Thank you! How would you ping Google when a sitemap is updated?
-
Yes, split them out. You will need an index sitemap, i.e. a sitemap that links to other sitemaps:
https://support.google.com/webmasters/answer/75712?vid=1-635768989722115177-4024498483&rd=1
Any given sitemap can list up to 50,000 URLs and can be no larger than 50MB uncompressed.
https://support.google.com/webmasters/answer/35738?hl=en&vid=1-635768989722115177-4024498483
Therefore, you could have an index sitemap that links to up to 50,000 other sitemaps, each of which could contain links to 50,000 URLs on your site.
If my math is right, that's a max of 2,500,000,000 URLs (50,000 sitemaps x 50,000 URLs each).
(Interesting side note: Google allows up to 500 index sitemaps, so 2,500,000,000 pages x 500 = 1,250,000,000,000 URLs that you could submit to Google via sitemaps.)
How you divide up your content into sitemaps should relate to how you organize the pages on your site, so you are on the right track in breaking out the sitemaps by type of content. Depending on how big any one section of the site is, you may need several sitemaps within that type, i.e. articlesitemap1.xml, articlesitemap2.xml, etc. You get the idea.
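To make the 50,000-URL limit concrete, here's a rough sketch of chunking a URL list into compliant sitemap files plus the index that references them (the filenames and helper names are purely illustrative):

```python
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-sitemap limit from Google's docs above

def chunk(urls, size=MAX_URLS):
    """Yield successive slices of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def sitemap_xml(urls):
    """Render one <urlset> sitemap for up to 50,000 URLs."""
    body = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f'{body}</urlset>')

def index_xml(sitemap_urls, lastmod):
    """Render the index sitemap that links to the individual sitemaps."""
    body = "".join(
        f"<sitemap><loc>{escape(u)}</loc><lastmod>{lastmod}</lastmod></sitemap>"
        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f'{body}</sitemapindex>')
```

So 120,001 article URLs would become three files (articlesitemap1.xml through articlesitemap3.xml) plus one index entry per file.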
It is recommended that you ping Google every time a page in a sitemap is updated so Google will come back and recrawl that sitemap. I don't run any sites with 1M URLs, but I do run several in the tens of thousands. We break them up by type and ping whenever we update a page in that group. You also need to consider your crawl budget with Google: it may not crawl all 1M pages in your sitemaps very often. So for a group of pages, consider setting things up so that if you have articlesitemap1.xml, articlesitemap2.xml, and articlesitemap3.xml, you are always adding your newest URLs to the most recently created sitemap (i.e. articlesitemap3.xml). That way you are generally pinging Google about the update of a single sitemap out of the group rather than all three.
My other thought is that in addition to pinging Google only about the sitemaps you have updated, you could return a 304 server response for all sitemaps that have not been updated. 304 means "not modified" since the last visit. One of your challenges will be your crawl budget with Google, so why make them recrawl a sitemap they have already crawled? You may also want to consider a 304 on any URL on your site that has not changed since the last time Google visited.
All of that said, as I mentioned above, I have not worked at the scale of 1M+ pages and would defer to others on the best approach. The general thought process would be the same, though: figure out the best way to use your sitemaps to manage your crawl budget with Google. Small side note: if you have 1M+ pages and any of them come from things like sorting parameters, duplicate content, or printer-friendly pages, you may want to just noindex those, leave them out of the sitemap, and not allow Google to crawl them in the first place.