Not All Submitted URLs in Sitemap Get Indexed
-
Hey Guys,
I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least according to Webmaster Tools: there is a difference of about 20% between the submitted and indexed URLs. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right?
Therefore I checked every single page in the sitemap manually by putting site:"URL" into Google, and every single page of the sitemap shows up. So in reality every page should be indexed, but why does Webmaster Tools show something different?
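In case anyone wants to repeat that check without typing every query by hand, here is a minimal sketch in Python that pulls the URLs out of a standard XML sitemap and prints one site: query per page (the sitemap URL is a placeholder):
```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder sitemap location -- substitute your own.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

# Namespace used by standard sitemaps.org XML sitemaps.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# One site: query per URL, ready to paste into Google.
for loc in tree.findall(".//sm:loc", NS):
    print("site:" + loc.text.strip())
```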
Thanks for your help on this
Cheers
-
Thanks Dan, but I have registered the right URL (http).
However, today the submitted URLs are showing as 100% indexed again (I changed nothing). Really crazy.
Cheers,
Heiko
-
This can happen if you don't have the correct version of your URL registered in Webmaster Tools, so that's something to check.
-
Hi There
One thing to check - do you have the exact version of your domain registered in Webmaster Tools? So www or non-www, and http or https? This has to be exact; Webmaster Tools considers them all different sites, and you can get limited data if the wrong one is registered (the sketch below enumerates the four variants).
That would be the biggest cause of a discrepancy. If this is not the case, Webmaster Tools data can often lag behind, or differ from, the live index. I would go with what you see in actual Google searches as the "final answer", though.
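For reference, a tiny sketch that lists all four property variants for any domain (the domain is a placeholder):
```python
# The four property variants Webmaster Tools treats as separate sites.
# The domain is a placeholder -- substitute your own.
domain = "example.com"

variants = [
    scheme + host + domain
    for scheme in ("http://", "https://")
    for host in ("", "www.")
]

for variant in variants:
    print(variant)  # check that the right one of these is registered
```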
-
I get the same thing. Nobody on here seems to know the answer (I asked a similar question in the last week or so). If the pages are there when you do a manual search, I wouldn't sweat it. I have taken the view that it's not worth worrying about!
Good luck Amelia
-
I didn't change the sitemap in the last 4 months. At the beginning the numbers matched exactly, so the submitted and indexed URLs were the same. But this week I noticed that about 20% are not indexed any more. That confused me, but the manual check showed that everything is OK.
However, I would just like to know why there is this difference in Webmaster Tools...
Cheers
-
That is clear, but it has nothing to do with my original question. I just wanted to know why Webmaster Tools doesn't display the right number of indexed pages from the sitemap. It would be the easiest way to notice when pages get de-indexed, for whatever reason.
-
Hi there
This is pretty common. The numbers Google shows in Webmaster Tools sometimes differ from what actually appears in the index. When did you submit your sitemap?
Here are some reasons that Google may not index all of your pages.
Check your robots.txt to be sure, but give yourself a bit of time for the indexing number in WMT to update. The good news is that you are seeing your pages in search - so that's a positive.
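To make the robots.txt check concrete, here is a minimal sketch using Python's standard-library robots.txt parser to flag any sitemap URL that Googlebot would be blocked from fetching (the URLs are placeholders):
```python
import urllib.robotparser

# Placeholder URLs -- substitute your own domain and sitemap entries.
ROBOTS_URL = "http://www.example.com/robots.txt"
sitemap_urls = [
    "http://www.example.com/",
    "http://www.example.com/some-page/",
]

parser = urllib.robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

# Report any sitemap URL that Googlebot is not allowed to fetch.
for url in sitemap_urls:
    if not parser.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)
```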
I would also check whether you have any duplicate or thin content on the website, whether there are dynamic URLs in your sitemap, how deep your pages go (especially important because of crawl budgets), and your website's canonical tag situation (a rough sketch for automating two of these checks follows below).
These are some things I would look into. Hope this helps! Good luck!
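Two of those checks, dynamic URLs and page depth, are easy to automate. A rough sketch, assuming a standard XML sitemap saved locally (the file name and depth threshold are placeholders):
```python
import urllib.parse
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
MAX_DEPTH = 3  # arbitrary threshold -- tune to your own site

# Parse a local copy of the sitemap; the file name is a placeholder.
tree = ET.parse("sitemap.xml")

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    parts = urllib.parse.urlparse(url)

    # Dynamic URLs: anything carrying a query string.
    if parts.query:
        print("Dynamic URL in sitemap:", url)

    # Depth: number of non-empty path segments.
    depth = len([seg for seg in parts.path.split("/") if seg])
    if depth > MAX_DEPTH:
        print("Deep page (%d levels): %s" % (depth, url))
```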
-
A sitemap does not ensure you are in the index; it just informs the search engine about your site.
In fact, Bing suggests you only put hidden pages and important pages in a sitemap.
IMO sitemaps are overrated. Unless you have something special to inform the engines of, or a very large site, they will find your pages by crawling your site normally.
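In that spirit, if you do keep a sitemap lean, something like this minimal sketch would generate one containing only the pages you care about (the URLs are placeholders):
```python
import xml.etree.ElementTree as ET

# Placeholder URLs -- only the pages you actually want surfaced.
important_pages = [
    "http://www.example.com/",
    "http://www.example.com/hidden-landing-page/",
]

urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)
for page in important_pages:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page

# Writes a minimal, standards-compliant sitemap.xml.
ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```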