Sitemap Warnings
-
Due to an issue with our CMS, a number of URL aliases were being indexed and causing duplicate content issues.
I disallowed crawling of the bad URLs in robots.txt (they all shared a similar URL structure, so that was easy) until I could clean them up.
I then received a bunch of sitemap warnings saying that the URLs I blocked with robots.txt were included in the sitemap.
Isn't this the point of robots.txt? Why am I getting warnings, and how can I get rid of them?
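For reference, a pattern-based block like the one described might look something like this in robots.txt (the /alias/ path is a hypothetical stand-in for whatever the duplicate URL structure actually was):

```text
# Hypothetical rule blocking the duplicate alias URLs
User-agent: *
Disallow: /alias/
```

Note that Disallow prevents crawling rather than indexing as such; URLs that were already indexed can linger in the index for a while even after being blocked.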
-
Irving -
Ok, so we took the restriction out of robots.txt while IT tries to fix the issue of URLs showing up in the sitemap that shouldn't.
The warnings haven't fallen off, and now our sitemap is a day behind, as it's been stuck in pending for almost a full day.
Any thoughts on what might be causing this? I'm assuming it's affecting what's indexed and hurting our site.
-
Irving,
Totally get that and we're working to ensure they are no longer included in the sitemap.
Thanks,
Lisa
-
The purpose of your sitemap is to tell Google which pages you want crawled and indexed. The purpose of robots.txt is to tell Google not to crawl a page. The warning is likely just a precaution to let you know that you may have accidentally blocked something in robots.txt that your sitemap also asks Google to fetch. If you remove the URLs from your submitted sitemap, the warnings should disappear. If you leave them, you will keep seeing warnings, but Google should not crawl the content since you blocked it in robots.txt.
-
You're not supposed to include blocked URLs in your sitemap.xml files; Google considers it a waste of crawl time. Are these automatically generated sitemap.xml files?
You're basically saying, "Come index these pages I've listed, but don't crawl them!"
Remove the blocked URLs (or regenerate the sitemaps), resubmit them, and the warnings will go away.
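Before resubmitting, a cross-check along these lines can be done with the Python standard library. This is a minimal sketch: the robots.txt rules, sitemap contents, and example.com URLs below are all hypothetical placeholders, not taken from the thread.

```python
# Sketch: find sitemap URLs that robots.txt disallows, using only the
# standard library. ROBOTS_TXT and SITEMAP_XML are hypothetical examples.
from urllib import robotparser
import xml.etree.ElementTree as ET

ROBOTS_TXT = """\
User-agent: *
Disallow: /alias/
"""

SITEMAP_XML = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/products/widget</loc></url>
  <url><loc>http://www.example.com/alias/widget</loc></url>
</urlset>
"""

def blocked_sitemap_urls(robots_txt, sitemap_xml, user_agent="*"):
    """Return the sitemap <loc> URLs that robots.txt disallows for user_agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    locs = [el.text for el in ET.fromstring(sitemap_xml).iter(ns + "loc")]
    return [url for url in locs if not rp.can_fetch(user_agent, url)]

print(blocked_sitemap_urls(ROBOTS_TXT, SITEMAP_XML))
# -> ['http://www.example.com/alias/widget']
```

Any URL this returns would trigger the "blocked by robots.txt" warning, so it should be dropped from the sitemap (or unblocked) before resubmission.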
Related Questions
-
Inbound Links Warning
I got the following error about our domain name in Link Explorer: "You entered the URL freexy.net which redirects to youlovelife.com/?domain=freexy.net. Click here to analyze youlovelife.com/?domain=freexy.net instead." Can you give me some advice about this problem?
Moz Pro | ligia.tatucu
-
Sitemap Best Practices
My question is regarding URL structure best practices for a sitemap. My website allows a product to be reached any number of ways, i.e. 1. http://www.website.com/category/subcategory/product 2. http://www.website.com/subcategory/product 3. http://www.website.com/product However, I am not sure which structure to use in the sitemap (which is being written manually). I know that for SEO purposes the 3rd option is best, as the link is more relevant to the individual product, but the Moz tool states that the home page should have fewer than 100 links (although Google doesn't penalise for having more), and writing my entire site the 3rd way would result in a lot more links adjoining the home page. It is either the 2nd or 3rd option, I think, as the 1st category is not keyword specific (rather a generic term, i.e. novelties). Does anyone have experience with this?
Moz Pro | moon-boots
-
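As a sketch of what the third option from that question would look like in a hand-written sitemap (website.com and the product path are the hypothetical examples from the question itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per canonical product URL (option 3 from the question) -->
  <url>
    <loc>http://www.website.com/product</loc>
  </url>
</urlset>
```

Worth noting: sitemap entries are not on-page links, so the "fewer than 100 links per page" guideline is about links in the home page's HTML, not about how many URLs the sitemap lists.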
Missing Title for Sitemap
Our site is built on WordPress, and we use a very popular SEO plugin called Yoast to generate our sitemap (as well as to handle multiple other SEO functions). When Moz's spider crawls our site, this sitemap triggers an error saying "Missing Title or Empty." My question is: how can I keep this error from hurting my rankings? It seems strange to me that such a ubiquitous plugin would generate something as important as a sitemap in an incorrect format.
Moz Pro | ShatterBuggy
-
Why might Google be crawling via old sitemap, when the new one has been submitted and verified?
We have recently relaunched Scoutzie.com and re-submitted our new sitemap to Google. When I look in Webmaster Tools, our new sitemap has been submitted just fine, but at the same time Google is finding a lot of 404s when crawling the site. My understanding is that it is still crawling the old links, which no longer exist. How can I tell Google to refresh its index and stop looking at all the old links?
Moz Pro | scoutzie
-
I have a Rel Canonical "notice" in my Crawl Diagnostics report. I'm presuming that means the spider has detected a rel canonical tag and it is working, as opposed to warning about an issue. Is this correct?
I know this seems like a really dumb question, but the site I'm working on is a BigCommerce one, and I've been concerned about canonicalisation issues prior to receiving this report (I'm an SEOmoz Pro newbie also!), so I just want to be clear I am reading this notice correctly. I presume it means the site crawl has detected the rel canonical tag on these pages and it is working correctly. Is this correct? Any input is much appreciated. Thanks
Moz Pro | seanpearse
-
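For context, the notice in that question refers to pages carrying a canonical link element in their head, something like this (the example.com URL is hypothetical):

```html
<!-- A rel canonical tag as typically emitted in a page's <head> -->
<link rel="canonical" href="http://www.example.com/product" />
```

Crawl tools generally report the presence of this tag as an informational notice rather than a problem, since it usually means canonicalisation is set up as intended.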
SEO Web Crawler - Referrer Lists XML Sitemap URL
Hello! I recently ran the crawl tool on a client site. Opening the file, I noticed that the referring URLs listed are my XML sitemaps and not (X)HTML pages. Any reason or thoughts on why this is happening? Thanks!
Moz Pro | MorpheusMedia
-
Wordpress-related warnings
As proposed by a number of people here, I have moved from WordPress.com to a self-hosted WordPress blog. I have also installed the All in One SEO plugin. This has been up and running for a month or so. My problem is that it is generating many (thousands of) warnings in my PRO Dashboard for Crawl Diagnostics. Specifically, I have a huge number of "Overly-dynamic URL" warnings. A typical URL is as follows: http://www.wednet.com/blog/2011/10/07/do-us-a-favor-dont/?utm_source=rss&utm_medium=rss&utm_campaign=do-us-a-favor-dont This has three querystring parameters, all generated automatically by WordPress. Here's another significant issue: with the All in One SEO plugin I can control the SEO-related parameters (title, meta description, etc.) for each post. However, WordPress generates a ton of virtual URLs which I can't (as far as I know) directly control. For example, the following page is a category page with all the posts for a single category. It is generating warnings because the meta description is missing, but I do not know how to set such parameters, since the page is automatically generated: http://www.wednet.com/blog/category/ceremony/ These types of warnings dominate the stats in my dashboard. How can I resolve them? Thanks. Mark
Moz Pro | MarkWill
-
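The "overly-dynamic URL" variants in that question differ only by tracking parameters, and the usual remedy is a canonical tag pointing at the parameter-free URL. As an illustration of the normalisation involved, here is a standard-library sketch; the URL is the one quoted in the question, and treating every `utm_`-prefixed parameter as strippable is an assumption:

```python
# Sketch: collapse tracking-parameter URL variants back to one clean URL.
# Stripping everything prefixed "utm_" is an assumption for illustration.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_tracking_params(url, prefixes=("utm_",)):
    """Drop query parameters whose names start with any of the given prefixes."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if not k.startswith(tuple(prefixes))]
    return urlunsplit(parts._replace(query=urlencode(kept)))

url = ("http://www.wednet.com/blog/2011/10/07/do-us-a-favor-dont/"
       "?utm_source=rss&utm_medium=rss&utm_campaign=do-us-a-favor-dont")
print(strip_tracking_params(url))
# -> http://www.wednet.com/blog/2011/10/07/do-us-a-favor-dont/
```

In practice an SEO plugin does this consolidation for you by emitting a canonical tag, so the parameterised variants all declare the clean URL as canonical.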
Title tag on sitemap.xml
SEOmoz is showing an error on one of the sites in my SEOmoz account campaign, under Crawl Diagnostics: "Title tag missing or empty." No problem there, except that the file associated with this issue is sitemap.xml, and that just doesn't look right, as far as I know XML files are title-tag free. I've searched around and have only been able to confirm my initial thought that sitemap.xml doesn't use a title tag, like any other XML file. Is this an issue (the error, that is), or should I let it slide? Can it be fixed? If yes, how? Thanks!
Moz Pro | eyepaq