Can you have a /sitemap.xml and /sitemap.html on the same site?
-
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community!
My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain?
For example, we've already put the html sitemap in place here: https://www.pioneermilitaryloans.com/sitemap
Now we're considering adding an XML sitemap. I know standard practice is to place it at the root (www.example.com/sitemap.xml), but I'm wondering whether this will cause conflicts.
I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this.
What do you think?
-
As all 3 of us have said here, Pioneer, there is no issue with setting things up the way you are proposing. Can't make it any clearer than that.
To answer your specific point: /sitemap and /sitemap.xml are categorically NOT seen as the same URL by search engines. They are absolutely considered two different pages. Your phrase "...two items with the same URL, but different file extensions..." is a contradiction in terms. If the URLs have different file extensions, they are by definition NOT the same URL. The file extension (or lack thereof) is an integral part of the URL.
Since 3 different people have given you the same answer and you still don't believe us, why not simply test for yourself?
- Implement the two files as above, then submit your XML sitemap location in Google Webmaster Tools and confirm that Google is finding and recognizing it correctly.
- Then use your browser to go to the URL of the regular sitemap, and you'll see that it renders the HTML version of your sitemap just fine. (A quick programmatic version of this check is sketched below.)
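If you'd rather script that second check, here's a minimal sketch in Python (standard library only; the example.com URLs are placeholders for your own domain) that fetches both addresses and prints the status and content type each one returns:

    import urllib.request

    # The extension (or lack of one) is part of the URL, so the server
    # can serve completely different content at each address.
    urls = [
        "https://www.example.com/sitemap",      # HTML sitemap for visitors
        "https://www.example.com/sitemap.xml",  # XML sitemap for crawlers
    ]

    for url in urls:
        with urllib.request.urlopen(url) as resp:
            # Expect something like "text/html" for the page and
            # "application/xml" (or "text/xml") for the sitemap file.
            print(url, resp.status, resp.headers.get_content_type())

If both requests succeed and report different content types, the two URLs are being served as two separate resources, which is exactly the point.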
Paul
-
So if I'm understanding you correctly, there are no technical issues with having two items with the same URL, but different file extensions, coexisting? I was unable to find any examples of other sites doing this, which made me question it.
I mean, what we're proposing is two separate pieces of content that resolve at /sitemap and /sitemap.xml.
I want that to work, but it's just amazing to me that it doesn't cause any issues.
-
Just like Oleg and Paul, I agree 100%. Your site may have, and will probably benefit from having, both a sitemap in HTML format (which is a nice feature for visitors) and one in XML format, as they are not used for the same purpose, either by Google or by individuals. So you may safely create a regular webpage in HTML and call it whatever you like. If a file ends in .xml, it is not a forward-facing webpage; it has a separate use, which is to tell Google's crawler where you would like it to go. Keep in mind that Google does not always listen to what we want, but sitemaps can be helpful.
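One extra tip: you can also point crawlers at the XML file from your robots.txt using the standard Sitemap directive. A minimal example, assuming the file sits at the root as you propose:

    Sitemap: https://www.pioneermilitaryloans.com/sitemap.xml

The directive stands on its own, outside any User-agent group, so it can go anywhere in the file.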
I hope this was of help to you
sincerely,
Thomas
-
As Oleg says - not a problem at all. What you're proposing is a pretty standard implementation used by most websites out there.
XML sitemaps are a very specific configuration of data built to a standard that the search engines all agreed on - even the naming convention. Spiders are programmed to look for the whole filename (specifically including the .xml suffix), not just the first part of the filename. And yes, submitting them inside your Webmaster Tools accounts is an extra signal telling the search engines where to find them.
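For reference, here is what a minimal sitemap.xml built to that agreed standard (the sitemaps.org protocol) looks like - the example.com URLs are just placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
      </url>
    </urlset>

Each url entry can also carry optional lastmod, changefreq, and priority elements, but loc is the only required child.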
Paul
-
Nope, won't cause any problems. The XML sitemap is what you will submit to Google and the other search engines, while the HTML one is for your site visitors who want to see all your pages (although it will be crawled and indexed as well).