In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
-
Google Search Console is telling us that there are 5,193 sitemap "issues": URLs that are present in the XML sitemap but blocked by robots.txt.
However, there are only 1,222 total URLs submitted in the XML sitemap, and I found only 83 URLs that actually fit their example description.
Why is the number of "issues" so high?
Does it compound over time as Google re-crawls the sitemap?
-
Hello, I just went through an issue like this. Are you using WordPress? Also, do you have any SEO plugins installed?
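Whatever the platform turns out to be, a quick way to check whether the sitemap and robots.txt genuinely conflict on that many URLs is to test every submitted URL against robots.txt with a parser, the same way a crawler would. A minimal sketch in Python, assuming a plain sitemap file rather than a sitemap index; the URLs are placeholders for your own:

```python
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

# Placeholder URLs -- swap in your own sitemap and robots.txt.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
ROBOTS_URL = "https://www.example.com/robots.txt"

# Parse robots.txt the same way a compliant crawler would.
rp = urllib.robotparser.RobotFileParser(ROBOTS_URL)
rp.read()

# Pull every <loc> entry out of the sitemap (assumes a plain
# urlset file, not a sitemap index).
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
tree = ET.parse(urllib.request.urlopen(SITEMAP_URL))
urls = [loc.text.strip() for loc in tree.iter(NS + "loc")]

# Count submitted URLs that Googlebot would be blocked from fetching.
blocked = [u for u in urls if not rp.can_fetch("Googlebot", u)]
print(f"{len(blocked)} of {len(urls)} submitted URLs are disallowed by robots.txt")
```

If this reports far fewer than 5,193 distinct blocked URLs, the inflated figure likely says more about how Search Console tallies reports across re-crawls than about the sitemap's current state.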
Related Questions
-
Which search engines should we submit our sitemap to?
Other than Google and Bing, which search engines should we submit our sitemap to?
Intermediate & Advanced SEO | NicheSocial
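On that note: beyond submitting manually, the sitemaps.org protocol lets any crawler discover a sitemap through a robots.txt directive, so engines you never submit to directly can still pick it up. A minimal sketch, with a placeholder URL:

```
# Any compliant crawler that reads robots.txt can discover this.
Sitemap: https://www.example.com/sitemap.xml
```
-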
Can a "site split" cause a drastic organic search decline?
Let's say you have a client with two big, main product offerings. In early April of this year, one of those product offerings moved over to a new domain. Let's also say you had maybe 12 million links in the inbound link portfolio for the original domain, and when the offering that split off opened its new domain, it 301 redirected half of those 12 million links (maybe even three quarters) over to the new domain. So you're left with "half" a website: you still have millions of links, but you lost millions as well. Would a ~25-50% drop in organic traffic be a reasonable effect? My money is on YES, because all links to a domain help raise the page authority "sea level" of all URLs on the domain, so cutting off 50-75% of those links would drop that sea level a somewhat corresponding amount. We did get some 301 redirects that we felt were "ours" in place in late July, but those accounted for only about 25% of the pages with inbound links they took originally, and they went in almost four months after the fact. Curious what other people may think.
Intermediate & Advanced SEO | ChristianMKG
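For illustration, a split like the one described usually comes down to path-scoped 301s pointing at the new domain. A hypothetical Apache (mod_alias) sketch, with placeholder paths and domain:

```
# Permanently redirect one product line's URLs to the new domain,
# carrying the rest of the path along; everything else stays put.
RedirectMatch 301 ^/product-b(/.*)?$ https://www.new-domain.com/product-b$1
```
-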
Only ranking well when "UK" is added to search term
Hi, what does it mean when a lot of our keyword phrases rank only when "UK" is typed in the search term? For example:
"boxes" (not in top 50)
"boxes UK" (38)
"big storage boxes" (45)
"big storage boxes UK" (33)
We haven't attempted to SEO the pages for search terms with "UK" appended to them. Our domain is a co.uk domain. So, what reasons could there be that we are ranking in such a way?
Intermediate & Advanced SEO | Solid_Web
-
XML Sitemap on another domain
Hi, we've rebuilt our website and created a better sitemap index structure. There's a good chance that we will not be able to add the XML files to the existing site for technical reasons (don't get me started). I'm reaching out because I'm wondering if we can place the XML files on another website or subdomain. I know this is not best practice and probably very grey, but I'm looking for alternatives. If the answer is DON'T DO IT, let me know too. Thx
Intermediate & Advanced SEO | WMCA
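Worth noting on this one: the sitemaps.org protocol does allow a sitemap to live on another host, provided the robots.txt of the site whose URLs it lists points at that location; the pointer is what establishes ownership. A sketch with placeholder hosts:

```
# robots.txt served at https://www.your-site.com/robots.txt
# The sitemap itself lives on a different host; this reference
# is what authorizes the cross-host location.
Sitemap: https://sitemap-host.example.com/your-site-sitemap.xml
```
-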
Taking up an "abondoned" domain?
Hi, As far as SEO goes, are there any direct contradictions to picking up an approximately 1 year old domain, where the only thing that has ever been on is a static "Hello world" page from a wordpress install done when the domain was created? I'm thinking about picking it up again, as if it was a totally fresh domain, add content, and do SEO on it. What are your thoughts friends? Thanks.
Intermediate & Advanced SEO | | kaince0 -
To index search results or to not index search results?
What are your feelings about indexing search results? I know big brands can get away with it (Yelp, eBay, etc.). Apart from UGC, it seems like one of the best ways to capture long-tail traffic at scale. If the search results offer valuable, engaging content, would you give it a go?
Intermediate & Advanced SEO | nicole.healthline
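If anyone does experiment with this, the conventional guardrail runs the other way: internal search result pages are usually kept out of the index with a robots meta tag, so thin or near-duplicate result sets don't pile into the index at scale. A sketch of the standard tag:

```html
<!-- Placed on internal search result pages (e.g. /search?q=...).
     noindex keeps the result page itself out of the index, while
     follow still lets crawlers follow the links to the listed items. -->
<meta name="robots" content="noindex, follow">
```
-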
How to avoid too many "On Page Links"?
Hi everyone, I don't seem to be able to keep big G off my back, even though I do not engage in any black hat or excessive optimization practices. Due to another unpleasant heavy SERP "fluctuation" I am in investigation mode yet again and want to take a closer look at one of the warnings within the SEOmoz dashboard: "Too many on-page links". Looking at my statistics this is clearly the case, and I wonder how you can even avoid it at times.

I have a lot of information on my homepage that links out to subpages, and I get the feeling that even the links within the roll-over (or dropdown) menus are counted. In that case, of course, you will end up with a crazy number of on-page links. What about blog-like news entries on your homepage that link to other pages, and the links that result from the tags underneath a post?

What am I trying to get at? Well, do you feel that a bad website template may cause this issue, i.e. are the links from roll-over menus counted as links on the homepage even though they are not directly visible? I am not sure how to cut down on the issue, as the sidebar modules are present on every page and thus raise the link count wherever you are on the site. On another note, I've seen plenty of homepages with excessive information and links going out; would they be suffering from the search engines' hammer too? How do you manage the too-many-on-page-links issue? Many thanks for your input!
Intermediate & Advanced SEO | Hermski
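On the rollover-menu question above: link counts are taken from the delivered HTML, so menu links typically count even when CSS hides them until hover. A quick way to see a page the way a crawler does is to count every <a href> in the raw source; a stdlib-only Python sketch, with a placeholder URL:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Counts every <a href> in the raw HTML, which is what a crawler
    sees: dropdown and rollover menu links included, visible or not."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Placeholder URL -- point this at your own homepage.
html = urlopen("https://www.example.com/").read().decode("utf-8", "replace")
counter = LinkCounter()
counter.feed(html)
print(f"Total <a href> links in the HTML: {counter.count}")
```
-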
Include Cross Domain Canonical URLs in Sitemap - Yes or No?
I have several sites with cross-domain canonical tags set up on similar pages, and I am unsure whether pages that are canonicalized to a different domain should be included in the sitemap. My first thought is no, because I should only include pages in the sitemap that I want indexed. On the other hand, if I include ALL pages on my site in the sitemap, once Google gets to a page with a cross-domain canonical tag, I'm assuming it will just note that and determine whether the canonicalized page is the better version. I have yet to see any errors in GWT about this, though I have seen errors where I included a 301 redirect in my sitemap file. I suspect it's OK, but it seems that Google would rather not find these URLs in a sitemap and have to crawl them time and time again to determine whether they are the best page, even though I'm indicating that each has a similar page I'd rather have indexed.
Intermediate & Advanced SEO | WEB-IRS
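For reference, the tag under discussion is the standard rel="canonical" element pointing at another host (hypothetical URLs below). Since Google reads sitemap entries as a hint about which URLs you consider canonical, listing a page that canonicalizes elsewhere sends a mixed signal, which is the argument for leaving such URLs out:

```html
<!-- On https://www.site-a.com/widget-guide, naming the preferred
     copy that lives on a different domain: -->
<link rel="canonical" href="https://www.site-b.com/widget-guide">
```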