Sitemap error
-
Hey guys,
Every time I run the tester through Google Webmaster Tools, I keep getting an error that tells me:
"Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead."
Any idea how to go about fixing this without changing the site around?
https://www.zenory.co.nz/sitemap
I have seen competitors' sitemaps that look similar to mine.
Cheers
-
Awesome, thanks so much - great info!
-
What you've submitted is your sitemap for human visitors - not a sitemap for search engines.
The sitemap that you submit to Webmaster Tools for search engines will be a .xml file. E.g. instead of https://www.zenory.co.nz/sitemap it would be https://www.zenory.co.nz/sitemap.xml
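To make the difference concrete, here is a minimal sketch of generating that kind of .xml file with Python's standard library (the example.com URLs and the `build_sitemap` helper are placeholders for illustration, not part of any particular CMS):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The sitemap protocol requires this xmlns on the root <urlset> element.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # each URL lives in its own <url><loc> pair
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
```

Saving that output as sitemap.xml at the site root gives you the kind of file Webmaster Tools expects, rather than an HTML page.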
There are a few ways you can create a sitemap.xml for your site. You can use a program like Screaming Frog to crawl the site and generate a static sitemap. That is, a sitemap file that won't update automatically when you add new pages or posts - you will have to generate a new sitemap every time you add new content.
Depending on which CMS you are using, you should be able to generate your sitemap.xml dynamically. That is, a sitemap file that updates itself as you add new content, which makes things a lot easier - especially on large sites!
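The dynamic approach boils down to rebuilding the file on every request instead of storing it on disk. A toy sketch of the idea (the `fetch_urls` callback stands in for whatever database query your CMS would actually run):

```python
def serve_sitemap(fetch_urls):
    # Rebuild the sitemap from scratch on each request, so newly
    # published content shows up without any manual regeneration step.
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc in fetch_urls():
        lines.append(f"  <url><loc>{loc}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

pages = ["https://www.example.com/"]
first = serve_sitemap(lambda: pages)
pages.append("https://www.example.com/new-post")  # simulate publishing a new post
second = serve_sitemap(lambda: pages)
```

The second response picks up the new URL with no extra work, which is exactly why dynamic generation is the easier option on large or frequently updated sites.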
You can find out more about XML Sitemaps and see a sample here: http://www.web-site-map.com/
It is important that you only include URLs in your sitemap that you want search engines to crawl. Don't include any pages that are noindexed or blocked by robots.txt.
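One way to enforce that rule automatically is to check each candidate URL against robots.txt before it goes into the sitemap. A sketch using Python's standard-library parser (the rules and URLs here are made up for illustration; a real script would fetch the live robots.txt with `set_url()` and `read()`):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parse rules inline so the example is self-contained; in practice,
# point rp at https://www.example.com/robots.txt instead.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

candidates = [
    "https://www.example.com/",
    "https://www.example.com/private/drafts",
]
# Keep only URLs that crawlers are actually allowed to fetch.
allowed = [u for u in candidates if rp.can_fetch("*", u)]
```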
Related Questions
-
301s - Do we keep the old sitemap to assist Google with this?
Hello Mozzers, We have restructured our site and done many 301 redirects to our new URL structure. I have seen that one of my competitors has done something similar, but they have kept the old sitemap - I guess to assist Google with their 301s. At present we only have our new sitemap active, but am I missing a trick by not having the old one there as well to assist Google with the 301s? Thanks, Pete
Intermediate & Advanced SEO | | PeteC120 -
What is the best way to correct 403 access denied errors?
One of the domains I manage is seeing a growing number of 403 errors. For SEO purposes would it be ideal to just 301 redirect them? I am plenty familiar with 404 error issues, but not 403s.
Intermediate & Advanced SEO | | RosemaryB0 -
Images Returning 404 Error Codes. 301 Redirects?
We're working with a site that has gone through a lot of changes over the years - ownership, complete site redesigns, different platforms, etc. - and we are finding that there are both a lot of pages and individual images that are returning 404 error codes in the Moz crawls. We're doing 301 redirects for the pages, but what would the best course of action be for the images? The images obviously don't exist on the site anymore and are therefore returning the 404 error codes. Should we do a 301 redirect to another similar image that is on the site now or redirect the images to an actual page? Or is there another solution that I'm not considering (besides doing nothing)? We'll go through the site to make sure that there aren't any pages within the site that are still linking to those images, which is probably where the 404 errors are coming from. Based on feedback below it sounds like once we do that, leaving them alone is a good option.
Intermediate & Advanced SEO | | garrettkite0 -
Do XML sitemaps need to be manually resubmitted every time they are changed?
I have been noticing lately that quite a few of my clients' sites are showing sitemap errors/warnings in Google Webmaster Tools, despite the fact that the issue with the sitemap (e.g. a URL that we have blocked in robots.txt) was fixed several months earlier. Google talks about resubmitting sitemaps here, where it says you can resubmit your sitemap when you have made changes to it. I just find it somewhat strange that the sitemap is not automatically re-scanned when Google crawls a website. Does anyone know if the sitemap is automatically rescanned and only Webmaster Tools is not updated, or am I going to have to manually resubmit or ping Google with the sitemap each time a change is made? It would be interesting to know other people's experiences with this 🙂
Intermediate & Advanced SEO | | Jamie.Stevens0 -
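On the ping option mentioned in that question: Google has (at least historically) documented a simple HTTP ping endpoint that asks it to re-fetch a sitemap without logging in to Webmaster Tools. A sketch of building that request URL (the example.com sitemap address is a placeholder):

```python
from urllib.parse import urlencode

def google_ping_url(sitemap_url):
    # The sitemap URL must be percent-encoded as a query parameter.
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

ping = google_ping_url("https://www.example.com/sitemap.xml")
# Fetching this URL (e.g. with urllib.request) notifies Google of the update.
```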
Difference in Number of URLs in "Crawl, Sitemaps" & "Index Status" in Webmaster Tools - Normal?
Greetings Moz Community: Webmaster Tools under "Index Status" shows 850 URLs indexed for our website (www.nyc-officespace-leader.com). The number of URLs indexed jumped by around 175 around June 10th, shortly after we launched a new version of our website. No new URLs were added in the site upgrade. Under Webmaster Tools "Crawl, Sitemaps", it shows 637 pages submitted and 599 indexed. Prior to June 6th there was not a significant difference between the number of pages shown in "Index Status" and "Crawl, Sitemaps". Now there is a differential of 175. The 850 URLs in "Index Status" is equal to the number of URLs in the Moz domain crawl report I ran yesterday. Since this differential developed, ranking has declined sharply. Perhaps I have been hit by the new version of Panda, but Google indexing junk pages (if that is in fact happening) could have something to do with it. Is this differential between the number of URLs shown in "Index Status" and "Crawl, Sitemaps" normal? I am attaching images of the two screens from Webmaster Tools as well as the Moz crawl to illustrate what has occurred. My developer seems stumped by this. He has submitted a removal request for the 175 URLs to Google, but they remain in the index. Any suggestions? Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10 -
How important is it to fix Server Errors?
I know it is important to fix server errors. We are trying to figure out how important, because after our last build we have over 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API etc., which will take time. We are trying to decide: is it more important to fix all these errors right now, or to focus on other issues and fix these errors when we have time? They are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | | DoRM0 -
Is it bad to host an XML sitemap in a different subdomain?
Example: sitemap.example.com/sitemap.xml for pages on www.example.com.
Intermediate & Advanced SEO | | SEOTGT0 -
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | | Interesting.com0