Submitting XML Sitemap for large website: how big?
-
Hi there,
I’m currently researching how to generate an XML sitemap for a large website we run. We think Google is having problems indexing our URLs, based on some of the messages we’ve been receiving in Webmaster Tools, which also shows a large drop in the total number of indexed pages.
Content on this site can be accessed in two ways. On the home page, the content appears as a list of posts, and users can browse back through previous posts all the way to the first ones that were submitted.
Posts are also categorised using tags, and these tags can also currently be crawled by search engines. Users can then click on tags to see articles covering similar subjects. A post could have multiple tags (e.g. SEO, inbound marketing, Technical SEO) and so can be reached in multiple ways by users, creating a large number of URLs to index.
Finally, my questions are:
- How big should a sitemap be? What proportion of the URLs of a website should it cover?
- What are the best tools for creating the sitemaps of large websites?
- How often should a sitemap be updated?
Thanks
-
Thanks Matt, that's really useful
-
Yeah, it's better to have one than not - but I have always aimed to make it as complete as I can. Why? I'm not sure - mostly because I figure Google is GREAT at crawling my main structure - it's those far-reaching pages that I'm hoping they find in the sitemap.
-
Thanks for both your replies - I will check out the tools and recommendations you suggested.
I'm sure I remember reading a recommendation somewhere that it was only necessary to submit the basic site structure in a sitemap. It sounds like that's not the case, and that a sitemap should, if possible, be comprehensive.
Would it be better to have a basic sitemap giving the main navigational URLs than having nothing at all?
-
I've created sitemaps with the paid version of Screaming Frog that were almost 80,000 pages. That's what I'd use. No point asking what percentage to cover unless you can't get it all. If you're crawling Microsoft, break it up. Otherwise, organize it if you can (a category sitemap, month by month, something) or just make one big finger-to-Google type sitemap. lol
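For anyone wondering why big sites end up split into several files: the sitemaps.org protocol caps each sitemap file at 50,000 URLs (and 50 MB uncompressed), so anything larger gets chunked, with a sitemap index file tying the chunks together. Tools like Screaming Frog do the splitting for you, but here's a rough sketch of the mechanics (the domain and filenames are made up for illustration):

```python
# Sketch only: split a big URL list into sitemap files of at most 50,000
# URLs each (the sitemaps.org per-file limit) and build a sitemap index
# pointing at the chunk files. Domain and filenames are hypothetical.

MAX_URLS_PER_SITEMAP = 50_000

def chunk(urls, size=MAX_URLS_PER_SITEMAP):
    """Split the full URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_sitemap(urls):
    """Render one <urlset> sitemap file for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f'{entries}\n'
            '</urlset>')

def build_index(sitemap_locs):
    """Render the sitemap index that ties the chunk files together."""
    entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>"
                        for u in sitemap_locs)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f'{entries}\n'
            '</sitemapindex>')

# An ~80,000-URL site splits into two sitemap files plus one index.
all_urls = [f"https://www.example.com/post/{i}" for i in range(80_000)]
chunks = chunk(all_urls)
index = build_index(f"https://www.example.com/sitemap-{n}.xml"
                    for n in range(1, len(chunks) + 1))
```

You then only need to submit the index URL in Webmaster Tools; the child sitemap files are discovered from it.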
-
Hi!
First off, since your content can be accessed in multiple ways, I'd make sure you're signalling duplicate pages to search engines. Easy access to great content is fantastic, but you can devalue your own pages a lot if you're not careful. If you're not using it yet, I recommend implementing the rel="canonical" tag on your website.
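For the tag-page scenario you describe, that just means every duplicate route to a post declares the post's preferred URL in its head. A minimal illustration (the URL here is hypothetical):

```html
<!-- On each alternate/duplicate route to a post, declare the preferred URL -->
<link rel="canonical" href="https://www.example.com/posts/technical-seo-guide/" />
```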
To answer your questions:
- It should cover all URLs that you want indexed. Ideally, that would be every URL.
- I'm not sure what 'the best' tools would be, but I used http://www.xml-sitemaps.com a lot a few years back. Their sitemaps are free up to 500 URLs, and there are paid plans for bigger sites.
- I wouldn't regenerate the XML sitemap for every single new page you publish; for routine additions, let the search engines find their own way. Should your entire site structure change, though, an updated XML sitemap can be a great way to help search engines understand the new setup.
I hope this helps!