Why extreme drop in number of pages indexed via GWMT sitemaps?
-
Any tips on why our GWMT Sitemaps indexed pages dropped to 27% of total submitted entries (2290 pages submitted, 622 indexed)? Already checked the obvious Test Sitemap, valid URLs etc. We had typically been at 95% of submitted getting indexed.
-
Thanks, that covers it!
-
Yes, this is the norm. You will generally have a variety of priority and update-frequency values in your XML sitemap. If you look at your sitemap you will usually see a priority value from 0.1 to 1.0, along with a changefreq value that suggests how often the page is updated. Googlebot will generally adhere to those hints and recrawl pages roughly when you tell it they change. If all of your pages are set to the same values, which they shouldn't be, Google will still only crawl a certain amount of your site on a given visit. So a slow increase in indexed pages is the norm.
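To see whether your sitemap really does carry varied values, you can count how many URLs share each priority. A minimal sketch with the Python standard library — the sitemap fragment and URLs below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap fragment with varied priority/changefreq values
# (the 0.1-1.0 numbers mentioned above). URLs are made up.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><priority>1.0</priority><changefreq>daily</changefreq></url>
  <url><loc>https://example.com/blog/</loc><priority>0.8</priority><changefreq>weekly</changefreq></url>
  <url><loc>https://example.com/archive/</loc><priority>0.3</priority><changefreq>monthly</changefreq></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def priority_counts(xml_text):
    """Return {priority: count}; a sitemap where every URL shares one
    priority shows up as a single dictionary key."""
    root = ET.fromstring(xml_text)
    counts = {}
    for url in root.findall("sm:url", NS):
        # The sitemap protocol's default priority is 0.5 when omitted.
        p = url.findtext("sm:priority", default="0.5", namespaces=NS)
        counts[p] = counts.get(p, 0) + 1
    return counts

print(priority_counts(SITEMAP))  # {'1.0': 1, '0.8': 1, '0.3': 1}
```

If the output is a single key covering all 2,290 URLs, every page is sending Google the same signal.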
-
Yes, looking back at change logs was helpful. Canonical tags were the culprit! We found a bug: the canonical tags were being truncated at 8 characters. The number of pages indexed has started to increase rather than decrease, so it appears the issue is resolved. But I would have thought the entire sitemap would get indexed once the issue was resolved, rather than small increases each day. Does it seem correct to see a slow climb back to normal, rather than getting back to nearly 100% indexed overnight?
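A truncation bug like this is easy to catch with a script that extracts each page's canonical tag and checks it. A minimal sketch using only the standard library — the HTML snippets simulate the bug and are not from the actual site:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the first <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def find_canonical(html):
    p = CanonicalFinder()
    p.feed(html)
    return p.canonical

# Simulated pages: one healthy canonical, one cut to 8 characters.
good = '<link rel="canonical" href="https://example.com/widgets/blue">'
bad  = '<link rel="canonical" href="https://">'

print(find_canonical(good))  # https://example.com/widgets/blue
print(find_canonical(bad))   # https:// -- truncated and useless to Google
```

Run `find_canonical` over fetched copies of your sitemap URLs and flag any canonical shorter than, say, your domain name; that would have surfaced the 8-character bug immediately.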
-
Do you have the date of the change? Try to pinpoint when the drop happened, because we might be able to figure it out that way too.
WMT > sitemaps > webpages tab
Once you find the date you may be able to go through your notes and see if you've done anything around that date or if Google had any sort of update (PageRank just updated).
I have had sites that had pages unindexed and then a few crawls later it got reindexed. I just looked at 20 sites in our WMT and all of our domains look good as far as percentage of submitted vs indexed.
The only other things I can think of are to check for duplicate content, canonical tags, noindex tags, and pages with little or no value (thin content). One more idea (I've done this before): keep your current sitemap structure but add an additional sitemap with all of your pages and posts in it. Don't break it down, just put it all in one sitemap. I've had that work for a similar issue, but that was back in 2010. Multiple sitemaps for that site never seemed to work out; having everything in one did the trick. The site was only about 4,000 pages at the time, but I thought I would mention it. I haven't been able to duplicate the error and no other site has had that problem, but that did do the trick.
Definitely keep an eye on it over the next few crawls. Please let us know what the results are and what you've tried so we can help troubleshoot.
-
We use multiple site maps.
Thanks, I had not thought about page load speed. But it turned up okay. Had already considered your other suggestions. Will keep digging. Appreciate your feedback.
-
Not sure why the drop but are you using just one sitemap or do you have multiple ones?
Check the sizes of your pages and the rate at which Google is crawling your site. If Google has trouble with the time it takes to crawl your sitemap, it will start to reduce the number of pages it indexes. You can check your crawl stats in WMT under Crawl > Crawl Stats. See if you notice any delays in the numbers.
Also, make sure that your robots.txt isn't blocking anything.
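You can test sitemap URLs against your robots.txt locally with Python's standard-library parser. A minimal sketch — the robots.txt rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; substitute the contents of your own.
ROBOTS = """User-agent: *
Disallow: /private/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Test a few URLs you expect Google to index.
for url in ["https://example.com/products/widget",
            "https://example.com/private/draft"]:
    status = "crawlable" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", status)
```

Feed it every URL from your sitemap; any line that comes back BLOCKED is a page you are submitting but telling Google not to crawl.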
Have you checked your site with a site: search?
These are pretty basic checks, but let us know what you've looked into so we can help you more. Thanks.