Submitting XML Sitemap for large website: how big?
-
Hi there,
I’m currently researching how to generate an XML sitemap for a large website we run. We think Google is having problems indexing our URLs, based on some of the messages we’ve been receiving in Webmaster Tools, which also shows a large drop in the total number of indexed pages.
Content on this site can be accessed in two ways. On the home page, the content appears as a list of posts, and users can browse back through previous posts all the way to the first posts that were submitted.
Posts are also categorised using tags, and these tags can also currently be crawled by search engines. Users can then click on tags to see articles covering similar subjects. A post could have multiple tags (e.g. SEO, inbound marketing, Technical SEO) and so can be reached in multiple ways by users, creating a large number of URLs to index.
Finally, my questions are:
- How big should a sitemap be? What proportion of the URLs of a website should it cover?
- What are the best tools for creating the sitemaps of large websites?
- How often should a sitemap be updated?
Thanks
-
Thanks Matt, that's really useful
-
Yeah, it's better to have one than not - but I have always aimed to make it as complete as I can. Why? I'm not sure - mostly because I figure Google is GREAT at crawling my main structure - it's those far-reaching pages that I'm hoping they find in the sitemap.
-
Thanks for both your replies - I will check out the tools and recommendations you suggested.
I'm sure I remember reading somewhere a recommendation that it was only necessary to submit the basic site structure in a sitemap. It sounds like this is not the case and that a sitemap should, if possible, be comprehensive.
Would it be better to have a basic sitemap giving the main navigational URLs than having nothing at all?
-
I've created sitemaps with the paid version of Screaming Frog that were almost 80,000 URLs. That's what I'd use. No point asking what percentage to cover unless you can't get it all. If you're crawling Microsoft, break it up. Otherwise, organise it if you can (a category sitemap, month by month, something) or just make one big finger to Google type sitemap. lol
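For what it's worth, the sitemaps.org protocol caps each file at 50,000 URLs / 50MB, so "breaking it up" in practice means several files tied together by a sitemap index that you submit once. A rough sketch of what that index looks like (the filenames and domain here are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one child sitemap per section or time period, each under 50,000 URLs -->
  <sitemap>
    <loc>https://www.example.com/sitemap-posts-2023.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts-2024.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit just the index file in Webmaster Tools and it picks up the children.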
-
Hi!
First off, since your content can be accessed in multiple ways, I'd make sure you're indicating duplicate pages as such to search engines. Easy access to great content is fantastic, but you can devalue your own pages a lot if you're not careful. If you're not using it yet, I recommend implementing the rel="canonical" tag on your website.
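As an illustration (the URLs are hypothetical), a post that is reachable under several tag paths would declare a single canonical URL in its head, so all the duplicate paths consolidate to one:

```html
<!-- Served on the duplicate path https://www.example.com/tag/seo/my-post -->
<!-- Points search engines at the one version you want indexed -->
<link rel="canonical" href="https://www.example.com/posts/my-post" />
```

The same tag goes on every variant of the page, including the canonical URL itself pointing at itself.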
To answer your questions:
- It should cover all URLs that you want indexed. Ideally, that would be every URL.
- I'm not sure what 'the best' tools would be, but I used http://www.xml-sitemaps.com a lot a few years back. Their sitemaps are free up to 500 URLs, with payment plans for bigger ones.
- I wouldn't update an XML sitemap for every single new page you publish; if you're only adding a page once a month or so, let the search engines find their own way. Should your entire site structure change, though, an XML sitemap can be a great way to help search engines understand your new site setup better.
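If you'd rather script it than use a tool, here's a minimal sketch of the chunk-and-index approach using only the Python standard library. The 50,000-per-file cap comes from the sitemaps.org protocol; the domain and file names are made-up placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_LIMIT = 50_000  # per-file URL limit in the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def chunk(urls, size=SITEMAP_LIMIT):
    """Yield lists of at most `size` URLs each."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def build_sitemap(urls):
    """Return a <urlset> document for one chunk of URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

def build_index(sitemap_urls):
    """Return a <sitemapindex> pointing at each child sitemap file."""
    index = ET.Element("sitemapindex", xmlns=NS)
    for s in sitemap_urls:
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = s
    return ET.tostring(index, encoding="unicode")

# Example: 120,000 hypothetical post URLs -> three child sitemaps + one index
urls = [f"https://www.example.com/post/{i}" for i in range(120_000)]
chunks = list(chunk(urls))
index_xml = build_index(
    f"https://www.example.com/sitemap-{n}.xml" for n in range(1, len(chunks) + 1)
)
```

Regenerating the files on a schedule (nightly, say) keeps them fresh without hand-editing anything.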
I hope this helps!