Sitemap Warnings
-
Due to an issue with our CMS, I had a bunch of URL aliases that were being indexed and causing duplicate content issues.
I disallowed crawling of the bad URLs in robots.txt (they all had a similar URL structure, so that was easy) as a stopgap until I could clean them up.
I then received a bunch of sitemap warnings saying that URLs I had blocked with robots.txt were included in the sitemap.
Isn't this the point of robots.txt? Why am I getting warnings, and how can I get rid of them?
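For illustration, a pattern-based block like the one described can be checked with Python's standard-library `urllib.robotparser`. This is a minimal sketch; the `/alias/` prefix and `example.com` host are hypothetical stand-ins for the "similar URL structure" mentioned above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules blocking a duplicate-alias URL pattern
robots_lines = [
    "User-agent: *",
    "Disallow: /alias/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# A URL under the blocked prefix is disallowed for all crawlers
print(rp.can_fetch("*", "https://example.com/alias/page-1"))     # False: blocked
# A URL outside the prefix is still crawlable
print(rp.can_fetch("*", "https://example.com/products/page-1"))  # True: allowed
```

Note that `Disallow` only stops compliant crawlers from fetching the page; it does not by itself remove already-indexed URLs from search results.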
-
Irving -
OK, so we took the restriction out of robots.txt while IT tries to fix the issue of URLs that shouldn't be in the sitemap showing up there.
The warnings haven't fallen off, and now our sitemap is a day behind: it's been stuck in pending for almost a full day.
Any thoughts on what might be causing this? I'm assuming it's impacting what's indexed and hurting our site.
-
Irving,
Totally get that and we're working to ensure they are no longer included in the sitemap.
Thanks,
Lisa
-
The purpose of your sitemap is to tell Google which pages you want crawled and indexed. The purpose of robots.txt is to tell Google not to crawl the pages it lists. The warning is most likely a precaution to let you know that you may have accidentally blocked something in robots.txt that you are also asking Google to index. If you remove those URLs from your submitted sitemap, the warnings should disappear. If you leave them, the warnings will remain, but Google should not crawl the content since you blocked it in robots.txt.
-
You are not supposed to include blocked URLs in your sitemap.xml files; Google considers it a waste of its crawl time. Are these sitemap.xml files generated automatically?
You're basically saying "come index these pages I've listed, but don't index them!"
Remove the URLs that point to blocked content (or regenerate the sitemaps), resubmit them, and the warnings will go away.
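To act on this before resubmitting, you can cross-check a sitemap against robots.txt and list any URLs that appear in both. A minimal sketch using only the Python standard library; the sitemap contents, `example.com` host, and `/alias/` prefix are hypothetical:

```python
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

# Hypothetical sitemap containing one blocked alias URL and one clean URL
SITEMAP_XML = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/alias/widget</loc></url>
  <url><loc>https://example.com/widget</loc></url>
</urlset>"""

# Hypothetical robots.txt rules
ROBOTS_TXT = ["User-agent: *", "Disallow: /alias/"]

rp = RobotFileParser()
rp.parse(ROBOTS_TXT)

# Sitemap <loc> elements live in the sitemaps.org namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)

# Collect every sitemap URL that robots.txt disallows
blocked = [
    loc.text
    for loc in root.findall(".//sm:loc", ns)
    if not rp.can_fetch("*", loc.text)
]
print(blocked)  # these are the URLs to drop from the sitemap
```

Running a check like this on each generated sitemap before submission catches the "listed but blocked" conflict that triggers the warnings.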