Not All Submitted URLs in Sitemap Get Indexed
-
Hey Guys,
I just noticed that about 20% of the URLs submitted in my sitemap don't get indexed, at least when I check in Webmaster Tools. There is roughly a 20% gap between the submitted and indexed URLs. However, as far as I can see, Webmaster Tools doesn't tell me which specific URLs from the sitemap are not indexed, right?
So I checked every single page in the sitemap manually by searching site:URL on Google, and every single page from the sitemap shows up. In reality every page appears to be indexed, so why does Webmaster Tools show something different?
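Checking every sitemap URL by hand gets tedious. As a rough sketch (assuming a standard XML sitemap; the sample URLs below are hypothetical), you could extract the URLs from the sitemap file and generate the site: queries to paste into Google:

```python
import xml.etree.ElementTree as ET

# Sitemap files use this namespace for all their elements
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> URLs from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

# Hypothetical sitemap content for illustration
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    # Paste each query into Google to confirm the page is indexed
    print("site:" + url)
```

In practice you would read your real sitemap file instead of the inline sample string.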
Thanks for your help on this
Cheers
-
Thanks Dan, but I do have the right URL version (http) registered.
However, as of today 100% of the submitted URLs show as indexed again (I changed nothing). Really crazy.
Cheers,
Heiko
-
This can happen if you don't have the correct version of your URL registered in Webmaster Tools, so that's something to check.
-
Hi There
One thing to check: do you have the exact version of your domain registered in Webmaster Tools? That is, www vs. non-www, and http vs. https? This has to match exactly; Webmaster Tools treats them all as different sites, and you can get limited data if the wrong one is registered.
That would be the most likely cause of the discrepancy. If that's not the case, Webmaster Tools data can often lag behind, or differ from, the live index. I would treat what you see in actual Google searches as the "final answer", though.
-
I get the same thing. Nobody here seems to know the answer (I asked a similar question in the last week or so). If the pages are there when you do a manual search, I wouldn't sweat it. I've taken the view that it's not worth worrying about!
Good luck
Amelia
-
I haven't changed the sitemap in the last four months. At the beginning the numbers matched exactly, so submitted and indexed URLs were the same. But this week I noticed that about 20% are no longer showing as indexed. That confused me, but the manual check showed that everything is OK.
However, I would just like to know why there is this difference in Webmaster Tools...
Cheers
-
That's clear, but it has nothing to do with my original question. I just wanted to know why Webmaster Tools doesn't display the right number of indexed pages from the sitemap. It would be the easiest way to notice when pages get de-indexed, for whatever reason.
-
Hi there
This is pretty common. Google sometimes shows different numbers in Webmaster Tools than what actually appears in the index. When did you submit your sitemap?
There are several reasons Google may not index all of your pages.
Check your robots.txt to be sure, but give the indexing number in WMT a bit of time to update. The good news is that your pages are showing up in search, so that's a positive.
I would also check for duplicate or thin content on the website, dynamic URLs in your sitemap, how deep your pages go (this is especially important because of crawl budget), and your website's canonical tag situation.
These are some things I would look into. Hope this helps! Good luck!
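On the robots.txt point: one quick sanity check is to run your sitemap URLs through the robots rules and see whether any are disallowed for Googlebot. A minimal sketch (the robots.txt content and URLs below are hypothetical) using Python's standard library:

```python
from urllib import robotparser

def blocked_for_googlebot(robots_txt, urls):
    """Return the URLs that the given robots.txt disallows for Googlebot."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch("Googlebot", u)]

# Hypothetical robots.txt and sitemap URLs for illustration
robots_txt = """User-agent: *
Disallow: /private/
"""
urls = [
    "https://www.example.com/",
    "https://www.example.com/private/page.html",
]
print(blocked_for_googlebot(robots_txt, urls))
```

Any URL this flags is one you should either unblock or remove from the sitemap, since submitting blocked URLs just wastes crawl budget.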
-
A sitemap does not ensure you are in the index; it just informs the search engine about your site.
In fact, Bing suggests you only put hidden pages and important pages in your sitemap.
IMO sitemaps are overrated unless you have something special to inform the engines of, or a very large site. Otherwise they will find your pages by crawling your site normally.
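If you do want to submit just a handful of important or hard-to-crawl pages, the sitemap itself is simple to generate. A small sketch (hypothetical URL, not a definitive implementation) that builds a minimal, valid sitemap document:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap document from a list of URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for u in urls:
        # Each URL gets its own <url><loc>...</loc></url> entry
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://www.example.com/hidden-page.html"])
print(xml_out)
```

Optional tags like lastmod or changefreq can be added the same way, but only loc is required.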