Query on sitemap.xml Root Path
-
Is it compulsory to have sitemap.xml at this path - abcd.com/sitemap.xml?
My site name is abcd.com. Is it compulsory to have sitemap.xml at this path - abcd.com/sitemap.xml - only?
a) If I use CDN services, the path could be something like xyz.com/sitemap.xml. If I then submit this sitemap in the robots.txt file, is that fine?
b) What will happen in that case in the webmaster tool? When we submit a sitemap there, it gives us the domain name (abcd.com) by default, and we just have to add /sitemap.xml.
-
Hi Ian,
Thanks for your support!
-
Hi,
What is the main reason you want to upload the sitemap to a new location?
I also found an article on sitemaps that might help. It says: "The XML sitemaps protocol defines that XML sitemap files cannot contain URLs from different domains. This includes subdomains and other kinds of variations. You have to keep all URLs to a single domain per XML sitemap."
Taken from this site https://www.microsystools.com/products/sitemap-generator/help/multiple-domains-xml-sitemaps/
This includes tutorials on different types of sitemaps.
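As a quick illustration of that single-domain rule (the URLs below are placeholders, not the asker's real pages), a valid sitemap keeps every <loc> entry on one host:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://abcd.com/</loc></url>
  <url><loc>https://abcd.com/products/widget-1</loc></url>
  <!-- URLs from data.abcd.com or any other host would need their own sitemap -->
</urlset>
```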
Hope this can help further.
Thanks,
Ian
-
Hi Ian,
I have a large ecommerce website whose XML sitemap is currently located at https://www.abcd.com/sitemap.xml, but I want to upload it to a new location, i.e. https://data.abcd.com/sitemap.xml, which is on a subdomain. If it is OK per Google's guidelines to host your sitemap on a subdomain rather than the main domain, then please let me know how I can submit this new sitemap in the webmaster console. When I try to submit a sitemap in the console, it mandatorily requires the sitemap to be available at the root of the website, so what can I do next? Thanks!
-
Hi,
I believe it's expected for a sitemap to be at abcd.com/sitemap.xml: per the sitemaps protocol, a sitemap can only contain URLs from its own directory and below, so the root is the safest location.
Here is a guide on sitemaps and their format for future references: https://www.sitemaps.org/protocol.html
You tend to have only one sitemap; if you have a large site, you will need to divide the URLs across multiple sitemap files (referenced from a sitemap index). A general rule of thumb is to keep each sitemap below 50,000 URLs. I'd say one sitemap at abcd.com/sitemap.xml should be enough for a standard website.
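As a rough sketch of that splitting rule (the example.com host, the sitemap file names, and this Python helper are illustrative assumptions, not anything from the thread), one could generate 50,000-URL sitemap files plus a sitemap index like this:

```python
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-file limit from the sitemaps protocol

def build_sitemaps(urls, base="https://www.example.com"):
    """Return (index_xml, [sitemap_xml, ...]) for the given URL list."""
    # Split the URL list into chunks of at most MAX_URLS each.
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = []
    for chunk in chunks:
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>"
        )
    # Build a sitemap index referencing sitemap-1.xml, sitemap-2.xml, ...
    # (hypothetical file names; use whatever naming scheme your site needs).
    refs = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{refs}\n"
        "</sitemapindex>"
    )
    return index, sitemaps
```

The index file would then be the single URL you submit, and each child file stays under the limit.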
I'm unsure about question a); could you repeat that in more detail, please?
Finally, there are two ways to submit a sitemap:
1. Directly within Google Search Console (previously Webmaster Tools), using the 'Test/Add Sitemap' feature: add /sitemap.xml and test it before submitting it.
2. Add a Sitemap line anywhere in your robots.txt file, specifying the full URL of your sitemap:
Sitemap: http://example.com/sitemap_location.xml
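For example (the hostnames are placeholders), a robots.txt that allows crawling and declares the sitemap could look like this - and, per the sitemaps protocol's cross-submission rules, the Sitemap directive may even point to a sitemap hosted elsewhere, such as on a CDN:

```
User-agent: *
Disallow:

Sitemap: https://cdn.example.com/sitemap.xml
```

Note that the sitemap file itself must still contain only URLs for the one site it describes.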
You can also find more information and guidelines on sitemaps here: https://support.google.com/webmasters/answer/156184?hl=en&ref_topic=4581190
Hope that helps.
Ian
-
Related Questions
-
How to stop robots.txt restricting access to sitemap?
I'm working on a site right now and having an issue with the robots.txt file restricting access to the sitemap - with no web dev to help, I'm wondering how I can fix the issue myself? The robots.txt file shows:
User-agent: *
Disallow: /
and then a Sitemap: line with the correct sitemap link.
Technical SEO | Ad-Rank
-
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains?
Hi, I am performing an analysis of the total inbound links to my homepage, and I would like to have the total amount of inbound links categorized by linking root domain. For example, Open Site Explorer offers a feature that shows you the linking root domains for your page. When you click on the first linking root domain, it also shows you the top linking pages (meaning all the pages that link to your page from that particular root domain). Now I would like to export this data to a CSV file, but Open Site Explorer only exports the total amount of top-level linking domains. Does anyone have a solution to this problem? Thank you very much for the help in advance!
Technical SEO | Feweb
-
HTTPS vs HTTP sitemap
I have a site that does a 301 redirect from HTTP to HTTPS. I currently have a sitemap auto-submitted to Google Webmaster Tools using the HTTP pages (because I didn't have HTTPS before). Should I disable that sitemap for HTTP and create one for the HTTPS pages only?
Technical SEO | puremobile
-
WordPress robots.txt sitemap submission?
Alright, my question comes directly from this article by SEOmoz: http://www.seomoz.org/learn-seo/r... Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps to robots.txt. Does that mean I erase everything in the robots.txt right now and replace it with this?
User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml
Because WordPress comes with some default disallows like wp-admin, trackback, plugins. I have also read this, but was wondering if it is the correct way to add a sitemap to a WordPress robots.txt: http://www.seomoz.org/q/removing-robots-txt-on-wordpress-site-problem I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended, like this?
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml
Technical SEO | joony2008
-
Query string in URL - duplicate content?
Hi everyone, I would appreciate some advice on the following. I have a page with some nice content on it, but it also has search functionality. When a search is run, a query string is appended, so I get something like mypage.php?id=20. With many different potential URLs, will each query string be seen as a different page? If so, I don't want duplicate content. Am I best putting canonical tags in the head of mypage.php to avoid Google seeing potential duplicate content? Many thanks for all your advice.
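The canonical approach the asker describes could be sketched like this (the domain is a placeholder; mypage.php is the asker's page): every query-string variant serves the same tag in its <head>, pointing back at the base URL:

```html
<link rel="canonical" href="https://www.example.com/mypage.php">
```

That way mypage.php?id=20, mypage.php?id=21, etc. should consolidate to one URL in Google's index.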
Technical SEO | pauledwards
-
Which is the best WordPress sitemap plugin?
Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites, or do you steer clear of plugins and use a sitemap generator, then upload the file to the root manually?
Technical SEO | simoncmason
-
301 redirects inside sitemaps
I am in the process of trying to get Google to follow a large number of old links on site A to site B. Currently I have 301 redirects as well as cross-domain canonical tags in place. My issue is that Google is not following the links from site A to site B, since the links no longer exist on site A. I went ahead and added the old links from site A into site A's sitemap. Unfortunately, Google is returning this message inside Webmaster Tools: "When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL." However, I do not understand how adding the redirected links to site A's sitemap will remove the old links. Obviously Google can see the 301 redirect and the canonical tag, but this relationship isn't defined in the sitemap as a direct correlation between sites A and B. Am I missing something here?
Technical SEO | jmsobe
-
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not, in fact, this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate-content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
Technical SEO | 5225Marketing