How important are sitemap errors?
-
If there aren't any crawling or indexing issues with your site, how important do you think sitemap errors are? Do you work to always fix all errors?
I know here: http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds
Duane Forrester mentions that sites with many 302s and 301s will be punished. Does anyone know Google's take on this?
-
Very important, particularly if you have a large site. We operate a site with hundreds of thousands of pages and, as Dan said, it can be difficult to maintain. We use a tool called Unlimited XML Sitemap Generator, which builds XML sitemaps for us automatically. I'd highly recommend it, although it takes a bit of fiddling to get up and running, as it's software that sits on your own server. We couldn't manage without it, as we'd otherwise be forever working on sitemaps.
We found that getting sitemaps right on a large site made a huge difference to the crawl rate we saw in GWT, with a big jump in indexation to follow.
In particular, check for 302s. I made the mistake of leaving those in place for a while and am sure we suffered some loss of link equity along the way.
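If you want to spot-check a sitemap for 302s (or anything else that isn't a plain 200), a quick script does the job. This is just a rough sketch, assuming Python with the requests library; the sitemap URL is a placeholder you'd swap for your own.

```python
# Rough sketch: flag sitemap URLs that return redirects or errors.
# Assumes a standard XML sitemap at the URL below -- swap in your own.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return the <loc> URLs listed in a standard XML sitemap."""
    resp = requests.get(sitemap_url, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def check(urls):
    """Print any URL that does not answer with a plain 200."""
    for url in urls:
        # HEAD keeps it fast; don't follow redirects, we want to see the 301/302 itself.
        status = requests.head(url, allow_redirects=False, timeout=30).status_code
        if status != 200:
            print(f"{status}  {url}")

if __name__ == "__main__":
    check(sitemap_urls(SITEMAP_URL))
```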
Hope it helps
Dawn
-
Your sitemap should only list pages that actually exist.
If you delete some pages, then you need to rebuild the sitemap.
Ditto if you delete them and redirect.
Google is always lagging, so if you delete 10 pages and then update the sitemap, even if Google downloads the new sitemap immediately, it will still be running crawls based on the old one. That means it may crawl the now-missing pages without the failures showing up in your WMT yet.
If you update your sitemap quickly, it's possible Google will never crawl the missing pages and hit a 404 or 301.
(Of course, other sites could still be linking to the now-missing pages, and those 404s will show up elsewhere as crawl errors.)
I am always checking, adding, deleting, and redirecting pages, so I update the current sitemap every hour and rebuild all the others at midnight every night. I usually do deletions just before midnight if I can, to minimize the time the sitemap is out of sync.
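For what it's worth, the rebuild itself doesn't need to be fancy. Below is a rough sketch of the kind of script I mean, assuming Python; the URLs and file path are placeholders, and in practice the list of live pages would come from your CMS or database rather than the old sitemap.

```python
# Rough sketch: rebuild sitemap.xml from the pages that currently exist,
# so deleted or redirected URLs never linger in the file. Run it from cron
# (e.g. hourly) right after you add or delete pages.
import datetime
import xml.etree.ElementTree as ET

LIVE_URLS = [
    "https://www.example.com/",        # placeholders -- pull the real list
    "https://www.example.com/about/",  # from your CMS or database
]

def build_sitemap(urls, path="sitemap.xml"):
    """Write a minimal XML sitemap containing only the given URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = datetime.date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(LIVE_URLS)
```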
-
As far as I know, Google is more lenient with sitemap errors, but I would still recommend looking into them. The first step is to make sure your sitemap is up to date and contains all the URLs you want (and none you don't). The main thing is that none of them should 404; beyond that, yes, they should all return 200s.
Unless you're dealing with a gigantic site that's hard to maintain, in theory there shouldn't be any sitemap errors if you have the correct URLs in there.
Even better, if you're running WordPress, the Yoast SEO plugin will generate an XML sitemap for you and update it automatically.
Hope that helps!
-Dan
Related Questions
-
Hreflang in header...should I do a Sitemap?
A client implemented hreflang tags in the site header. MOZ says you aren't supposed to do an hreflang Sitemap as well. My question is how should I do a Sitemap now (or should I do one at all)?
-
Hreflang in image or video XML sitemaps
Does anyone know if it is possible/recommended/not recommended to use hreflang in image or video XML sitemaps? This had not crossed my mind until recently, but a client asked me this question and I couldn't find any information on this topic.
-
How To Implement Pagination Properly? Important and Urgent!
I have seen many instructions but I am still uncertain. Here is the situation: we will be implementing rel="prev"/rel="next" on our paginated pages. The questions are:
1. Do we implement a self-referencing canonical URL on the main page and on each paginated page?
2. Do we implement a noindex/follow meta robots tag on each paginated page?
3. Do we include the canonical URL for each paginated page in the sitemap if we do not add the meta robots tag?
4. We have a view-all page but will not be using it due to page load issues. What do we do with the view-all URL? Do we add meta robots to it?
5. For website search results pages containing pagination, should we just put a noindex/follow meta robots tag on them?
6. We have separate mobile URLs that also contain pagination. Do we need to treat these as a separate pagination project? We already canonical all the mobile URLs to the main page of the desktop URL.
Thanks!
-
Google is showing 404 error. What should I do?
Dear experts, though a few of my website pages are accessible, Google is showing a 404 error. What should I do? Even Moz reports show the same. Problems:
1. A few of my pages are not yet cached by Google. (Earlier all of them were cached by Google.)
2. I tried to fetch those pages, but Google says the page was not found.
3. I included them in the sitemap, but the result is the same. Please advise. Note: I have recently changed my hosting server.
-
Is there any importance in including http:// in the url?
I have seen some sites that always redirect to https and some sites that always redirect to http://, but lately I have seen sites that force the URL to just the site, as in [sitename].com, with no www and no http://. Does this affect SEO in any way? Is it good or bad for other things? I was surprised when I saw it and don't really know what effect it has.
-
Access denied errors in webmaster tools
I noticed today I have 2 access denied errors. I checked the help, which says: Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.) Therefore I think it may be because I have added a login page for users and Googlebot can't access it. I'm using WordPress and presume I need to amend the robots.txt to remove the requirement for Google to log in, but how do I do that? Unless I'm misunderstanding the problem altogether!
-
Google Webmaster Tools Sitemap errors for phantom urls?
Two weeks ago we changed our URLs so the correct addresses are all lowercase; everything else 301 redirects to those. We have submitted our updated sitemap and made sure Google has downloaded it several times since. Even so, Webmaster Tools is reporting 33,000+ errors in our sitemap for URLs that are no longer in our sitemap and haven't been for weeks. It claims to have found the errors within the last couple of days, but the sitemap has been updated for a couple of weeks and has been downloaded by Google at least three times since. Here is our sitemap: http://www.aquinasandmore.com/urllist.xml Here are a couple of URLs that Webmaster Tools says are in the sitemap:
http://www.aquinasandmore.com/catholic-gifts/Caroline-Gerhardinger-Large-Sterling-Silver-Medal/sku/78664
Redirect error unavailable
Oct 7, 2011
http://www.aquinasandmore.com/catholic-gifts/Catherine-of-Bologna-Small-Gold-Filled-Medal/sku/78706
Redirect error unavailable
Oct 7, 2011