Sitemap 404 error
-
I have generated an XML sitemap for the site www.ihc.co.uk. The generated sitemap seems fine, but when I submit it to Webmaster Tools, it returns a 404 error. Has anyone experienced this before? I have deleted and redone the process, tried different XML sitemap generators, and even cleared the cache along the way.
-
Hi,
Webmaster Tools will normally tell you what the problem is if you click through to the details. I've also tried to access the sitemap myself and it seems fine. I've tested it with a couple of different user agents on Web-Sniffer (http://web-sniffer.net/), and I've checked robots.txt for anything unusual. All looks good.
My advice would be to try Fetch as Googlebot in WMT; if that works, I wouldn't worry about it. You could also check the server logs to see what response code the server returns when Googlebot requests the sitemap.
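The log check above can be sketched in a few lines. This is a minimal sketch, assuming an Apache/Nginx combined-format access log; the sample line, the regex field layout, and the log path mentioned in the comment are assumptions you should adjust for your own server.

```python
import re

# Matches the request and status fields of a combined-log-format line,
# e.g. ... "GET /sitemap.xml HTTP/1.1" 404 ...
LOG_LINE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def sitemap_hits(lines):
    """Yield (status, path) for Googlebot requests that mention the sitemap."""
    for line in lines:
        if "Googlebot" in line and "sitemap" in line:
            m = LOG_LINE.search(line)
            if m:
                yield m.group("status"), m.group("path")

# In practice you would pass the real log file, e.g.:
#   with open("/var/log/apache2/access.log") as log:
#       for status, path in sitemap_hits(log):
#           print(status, path)
sample = ('66.249.66.1 - - [05/Mar/2012:10:00:00 +0000] '
          '"GET /sitemap.xml HTTP/1.1" 404 512 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
print(list(sitemap_hits([sample])))  # -> [('404', '/sitemap.xml')]
```

If the log shows a 200 for Googlebot but WMT still reports a 404, that points to a reporting glitch rather than a server problem.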
I hope this helps; it seems like a bug to me.
Craig
Related Questions
-
Google is reporting a server error, but there's no server error.
Google is erroneously reporting a server error and I just can't figure out the source of the issue. My links work, and GoDaddy assures me there is no server error. This issue arose when I moved from HTTP to HTTPS and cPanel hosting, but I've got no idea how to fix it. I thought maybe I have duplicate content, but it does not appear that way. Any suggestions? I'm at a loss. www.thedishmaster.com
Reporting & Analytics | TheDishmaster
-
I get a 'Temporarily unreachable' error message when I 'Fetch as Google'. Any ideas please?
I wanted to fetch this page and got this error from Google: 'Temporarily unreachable'. I've never had this issue before. I checked another page and it came back as 'Complete', so no problems there. Any ideas? Thank you in advance.
Reporting & Analytics | MissThumann
-
641 Crawl Errors In My Moz Report - 190 are high priority Duplicate Content
Hi everyone, there are high and medium level errors. I was surprised to see any, especially since Google Analytics shows no errors whatsoever. 190 errors are duplicate content. A lot of images are showing in the Moz Crawl Report as errors, and when I click on one of these links in the report, it directs to the image displayed on a blog post on the site, which is unusual since I haven't started blogging yet. So it looks like all those errors are there because the images are appearing on their own posts. For example, a picture of a mountain would be referred to with www.domain.com/mountains; the image would be included in the content on a page, but why give an image a page/post all of its own when that was not my intention? Is there a way I can change this?
These are the things I first see at the top of the Moz Report: there are 2 similar home URLs at the top of the report; the HTTP status code is 200 for both (1) and (2); the link count for (1) is 71 and for (2) is 60; and there are no client or server errors. Rel Canonical / Rel-Canonical Target:
Yes http://domain.co.uk/home
Yes http://domain.co.uk/home/
Does this mean that the home page is being seen as a duplicate by Google and the other search engines? The HTTP status code on every page is 200. Your help would be appreciated. Best regards
Reporting & Analytics | SEOguy
-
If you include a video in a video sitemap, should it also be in your global XML sitemap?
I was wondering, in the hope of not duplicating URLs: if you include a video in a video sitemap, should it also be in your global XML sitemap? Would it be better to put them in one or both?
Reporting & Analytics | mattdinbrooklyn
-
Webmaster Tools Indexed pages vs. Sitemap?
Looking at Google Webmaster Tools, I'm noticing a few things. On most sites I look at, the number of indexed pages in the Sitemaps report is usually less than 100% (e.g., 122 indexed out of 134 submitted), and the number of indexed pages in the Index Status report is usually higher. For example, one site shows over 1,000 pages indexed in the Index Status report, but the Sitemaps report says something like 122 indexed. My questions: Is the Sitemaps report always a subset of the URLs submitted in the sitemap? Will the number of pages indexed there always be lower than or equal to the number of URLs referenced in the sitemap? Also, if there is a big disparity between the sitemap-submitted URLs and the indexed URLs (like 10x), is that concerning to anyone else?
Reporting & Analytics | IrvCo_Interactive
-
Can 500 errors hurt rankings for an entire site or just the pages with the errors?
I'm working with a site that had over 700 HTTP 500 errors after a redesign in April. Most of them were fixed in June, but there are still about 200. Can 500 errors affect rankings sitewide, or just the pages with the errors? Thanks for reading!
Reporting & Analytics | DA2013
-
Find 404 Broken Links
We are looking for tools to help us repair broken backlinks, i.e. those returning a 404 error. We have used the Open Site Explorer tool to create a CSV file from our client's URL. We read "Fixing Crawl Diagnostic Issues", but don't see how to "find the 404ed URL", nor do we see a "referral column" to scroll to. What steps should we take to locate the broken 404 links? What steps can we take to streamline repairs?
Reporting & Analytics | jamie_netsitemarketing.com
-
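One way to locate the 404ed targets in an export like the one described above can be sketched as follows. This is a rough sketch, not the Moz workflow itself: the CSV column name "URL" and the use of HEAD requests are assumptions, so match them to your actual export before running it.

```python
import csv
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code a HEAD request to url produces."""
    try:
        # HEAD avoids downloading the page body; some servers answer
        # HEAD differently from GET, so fall back to GET if in doubt.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def broken_urls(csv_path, check=status_of):
    """Yield URLs from the CSV whose status check comes back 404."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if check(row["URL"]) == 404:
                yield row["URL"]
```

The `check` parameter is injectable so the scan can be tested without network access; in normal use you would just call `broken_urls("export.csv")` and print the results.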
Solving link and duplicate content errors created by WordPress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has 2 big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legitimate the way they are, but obviously I need to fix the problem, so... Duplicate content error: this error is a result of being able to browse the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped? Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/); these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized for these pages? Thanks!
Reporting & Analytics | RUNNERagency