Issues with Moz producing 404 Errors from sitemap.xml files recently.
-
My last campaign crawl produced over 4k 404 errors resulting from Moz not being able to read some of the URLs in our sitemap.xml file. This is the first time we've seen this error, and we've been running campaigns for almost 2 months now -- no changes were made to the sitemap.xml file. The file isn't UTF-8 encoded, but rather Content-Type: text/xml; charset=iso-8859-1 (which is what Movable Type uses). Just wondering if anyone has had a similar issue?
-
Hi Barb,
I am sure Joel will chime in also, but just to clarify: it is probably not the UTF-8 encoding (or the lack of it) that is causing the issue. With the sitemap URLs at least, it is simply the formatting of the XML that is being produced. As for whether the other errors you are seeing are caused by the same kind of thing: if you are seeing references to the same encoded characters (%0A, %09), then the answer is most likely yes.
So the issue is not UTF-8 related (there are plenty of non-UTF-8 encoded sites on the web still!), but rather how the Moz crawler is reading your links, and whether other tools/systems will have the same trouble. Have you looked in Google Webmaster Tools to see if it reports similar 404 errors from the sitemap or elsewhere? If you see similar errors in GWT, then the issue is likely not restricted to the Moz crawler.
Beyond that, the fix for the sitemap at least should be relatively simple, and quite possibly the other Moz errors can be fixed just as easily by making small adjustments to the templates and removing the extra line breaks/tabs that are creating the issue. It is worth doing so that these errors disappear and you can concentrate on the 'real' errors without all the noise.
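For what it's worth, here is a minimal sketch (placeholder file names, standard sitemap namespace assumed, nothing Moz-specific) of the kind of one-off cleanup you could run while the template itself gets fixed:

```python
# Rough sketch: strip stray line feeds/tabs out of the <loc> values in an
# existing sitemap. File names are placeholders; the real fix is removing
# the whitespace from the template that generates the sitemap.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

tree = ET.parse("sitemap.xml")
for loc in tree.getroot().iter(f"{{{NS}}}loc"):
    if loc.text:
        # "\n\t\t\thttp://..." becomes "http://..." once the whitespace goes.
        loc.text = "".join(loc.text.split())

tree.write("sitemap.clean.xml", encoding="UTF-8", xml_declaration=True)
```

The real fix is still in the Movable Type templates, but resubmitting a cleaned copy like this is a quick way to confirm the whitespace is the culprit.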
-
Joel,
The latest 404 errors have the same type of issue, and they are all over the place in terms of referrer (none are the sitemap.xml, as far as I can see).
My question is: can the fact that we don't use UTF-8 encoding on our site potentially cause issues with other reporting? This is not something we can change easily, and I don't want to waste a great deal of effort sorting through "red herring" issues caused by the encoding we use on the site.
Thoughts?
barb
-
Thanks Joel,
We're looking into this.
barb
-
Thanks Lynn,
We are looking at that. The 4k 404 errors are gone now, but it's possible they will return.
It's a major change for us to switch to UTF-8, so it's not something that will happen anytime soon. I'll just have to be aware that it might be causing issues.
barb
-
Hey Brice,
I just want to add to Lynn's great answer the reason you're seeing the URLs the way they are, and to reinforce her point.
You have it formatted as such:
<loc>
			http://www.cmswire.com/cms/web-cms/david-hillis-10-predictions-for-web-content-management-in-2011-009588.php</loc>
The crawler converts everything to URL encoding, so those line feeds and tabs get converted to percent-encoded characters. The reason your root domain is there is because %0A is not a valid start of a URL, so RogerBot assumes it's a relative link to the domain your sitemap is on.
The encoding thing is probably not affecting this.
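To picture the mechanics described above, here is a small sketch of the general behaviour (not RogerBot's actual code, just the standard percent-encoding and relative-URL resolution steps applied to one of your <loc> values):

```python
# Sketch of why a <loc> value that starts with a line feed and tabs turns
# into a 404 URL prefixed with the root domain. Not RogerBot's actual code,
# just generic URL handling.
from urllib.parse import quote, urljoin

raw = "\n\t\t\thttp://www.cmswire.com/news/topic/impresspages"

# Percent-encoding the whitespace gives the %0A%09%09%09 seen in the report.
encoded = quote(raw, safe=":/")
print(encoded)
# %0A%09%09%09http://www.cmswire.com/news/topic/impresspages

# Because the string no longer starts with a scheme, a crawler treating it as
# a relative link resolves it against the page it was found on.
print(urljoin("http://www.cmswire.com/sitemap.xml", encoded))
# http://www.cmswire.com/%0A%09%09%09http://www.cmswire.com/news/topic/impresspages
```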
Cheers,
Joel.
-
Hi,
It can be frustrating, I know, but if you are methodical you will get to the bottom of all the errors and then feel much better.
Not sure why the number of 404s would have gone down, but in regards to the sitemap itself the Moz team might be right that UTF-8 encoding could be part of the problem. I think it is more likely down to some non-visible formatting/characters being added to your sitemap during creation: %09 is a URL-encoded tab and %0A is a URL-encoded line feed, and it looks to me like these are getting into your sitemap even though they are not actually visible.
If you download your site map you will see that many (but not all) the urls look like this:
<loc>
			http://www.cmswire.com/cms/web-cms/david-hillis-10-predictions-for-web-content-management-in-2011-009588.php</loc>
Note the new lines and the indent. Some other URLs do not have this format, for example:
<loc>http://www.cmswire.com/news/topic/impresspages</loc>
It would be wise to ensure both the file creating the sitemap and the sitemap itself are in UTF-8, but the fix could be as simple as going into the file that creates the sitemap and removing those line breaks. Once that is done, wait for the next crawl and see if it brings the error numbers down (it should). As for the rest of the warnings, just be methodical: identify where they are occurring and why, and work through them. You will get down to few or zero warnings, and you will feel good about it!
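If it helps, a quick way to verify the fix after the template changes (a standard-library sketch; swap in your own sitemap URL) is to fetch the live sitemap and flag any <loc> entries that still carry leading whitespace:

```python
# Quick check: fetch the live sitemap and list any <loc> entries that still
# contain leading line feeds or tabs, so the template fix can be verified
# before the next Moz crawl.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.cmswire.com/sitemap.xml"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

with urllib.request.urlopen(SITEMAP_URL) as response:
    root = ET.fromstring(response.read())

bad = [loc.text for loc in root.iter(NS + "loc")
       if loc.text and loc.text != loc.text.strip()]

print(f"{len(bad)} <loc> entries still contain stray whitespace")
for url in bad[:10]:
    print(repr(url))
```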
-
Interesting that a new crawl just completed and now I only have 307 404 errors, plus a lot of other, different errors and warnings. It's frustrating to see such different things each week.
barb
-
Hi Lynn,
I did download the CSV and found that all the 404 errors were generated from our sitemap.xml file. Here's what the URLs look like:
referring URL is http://www.cmswire.com/sitemap.xml
You'll notice that there is odd formatting wrapping the URL (%0A%09%09%09), plus an extra http://www.cmswire prepended to the front of the URL, which does not exist in the actual sitemap.xml file when I view it separately.
Also: Moz support looked at our campaign and they thought the problem was that our sitemap wasn't UTF-8 encoded.
Any ideas?
-
Hi Brice,
What makes you think the issue is that Moz cannot read the URLs? In the first instance I would want to make sure that something else is not going wrong by checking the URLs Moz is flagging as 404s, ensuring they actually do or do not exist, and if they don't, finding out where the link is coming from (be it the sitemap or another page on the site). You may have already done this, but if not, you can get all this information by downloading the error report as a CSV and then filtering in Excel to get data for the 404 pages only.
If you have done this already, then give us a sample or two of the URLs Moz is flagging, along with the referring URL and your sitemap URL, and we might be able to diagnose the issue better. It would be unusual for the Moz crawler to start throwing errors all of a sudden if nothing else has changed. I'm not saying it is impossible for it to be an error with Moz, just that the chances are on the side of something else going on.
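If Excel is a pain, the same filtering step can be done in a few lines of Python. The column names below ("HTTP Status Code", "URL", "Referrer") are my assumptions about the export, so check the headers in your actual CSV and adjust:

```python
# Sketch: filter a Moz crawl error export for 404s only. The file name and
# column names are assumptions; match them to the real CSV headers.
import csv

with open("crawl_issues_export.csv", newline="", encoding="utf-8") as f:
    rows = [row for row in csv.DictReader(f)
            if row.get("HTTP Status Code") == "404"]

print(f"{len(rows)} URLs returned a 404")
for row in rows[:20]:
    print(row.get("URL"), "<-", row.get("Referrer"))
```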
Hope that helps!