Why do people put xml sitemaps in subfolders? Why not just the root? What's the best solution?
-
I just read this at http://www.sitemaps.org/protocol.html#location: "The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/."
Yet surely it's better to put the sitemaps at the root so you have:
(a) http://example.com/sitemap.xml
http://example.com/sitemap-chocolatecakes.xml
http://example.com/sitemap-spongecakes.xml
and so on... or this kind of approach:
(b) http://example.com/sitemap.xml
http://example.com/sitemap/chocolatecakes.xml and
http://example.com/sitemap/spongecakes.xml
I would tend towards (a) rather than (b) - which is the best option?
Also, can I keep the same flat structure for sitemaps that cover subcategories of other sitemaps? For example, for a subcategory of http://example.com/sitemap-chocolatecakes.xml I might create http://example.com/sitemap-chocolatecakes-cherryicing.xml - or should I add a subfolder and turn it into http://example.com/sitemap-chocolatecakes/cherryicing.xml?
Look forward to reading your comments - Luke
-
Thanks Angular Marketing, and Everett... very helpful feedback and much appreciated. Luke
-
Consider this, from http://www.sitemaps.org/protocol.html#location: "The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/."
(b) would not be an acceptable approach, as http://example.com/sitemap/chocolatecakes.xml could only contain URLs located under http://example.com/sitemap/. For the same reason, you couldn't put sitemaps for site-wide content in subfolder directories.
This is the best approach from those options you mentioned...
(a) http://example.com/sitemap.xml
http://example.com/sitemap-chocolatecakes.xml
http://example.com/sitemap-spongecakes.xml
http://example.com/sitemap-chocolatecakes-cherryicing.xml
It is worth noting that you can have a sitemap of sitemaps (a sitemap index). For example:
http://example.com/sitemap.xml could contain links to http://example.com/sitemap-cakes.xml, http://example.com/sitemap-articles.xml, etc.
http://example.com/sitemap-cakes.xml could contain links to http://example.com/sitemap-chocolatecakes.xml, http://example.com/sitemap-vanilla-cakes.xml, etc.
Try not to over-complicate things by creating sub-category sitemaps unless you have an enormous number of sub-category pages, or have directories/sections managed by different CMSs.
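For reference, the "sitemap of sitemaps" described above is the standard sitemap index format defined at sitemaps.org. A minimal sketch, using the root-level filenames from this thread as examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- http://example.com/sitemap.xml : a sitemap index at the root,
     so it may reference sitemaps covering any URL on the site -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-chocolatecakes.xml</loc>
    <lastmod>2015-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-spongecakes.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child file listed in the index is then an ordinary sitemap (a `<urlset>` of `<loc>` entries). Note the protocol caps a single sitemap at 50,000 URLs, which is the usual reason large sites split into an index plus child sitemaps.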
Large sites generally have separate sitemaps based on content type (company pages, category pages, product pages, blog pages).
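One more option worth knowing: the sitemaps.org protocol also lets you declare sitemap locations in robots.txt via the `Sitemap:` directive, which makes them discoverable without submitting them to each search engine individually. A sketch, assuming a root-level index file:

```text
# http://example.com/robots.txt
User-agent: *
Disallow:

# Full URL required; the directive is independent of the User-agent lines
Sitemap: http://example.com/sitemap.xml
```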