Webmaster Tools Indexed pages vs. Sitemap?
-
Looking at Google Webmaster Tools, I'm noticing a few things. For most sites I look at, the number of indexed pages in the Sitemaps report is usually less than 100% of what was submitted (e.g. 122 indexed out of 134 submitted), while the number of indexed pages in the Index Status report is usually higher. For example, one site shows over 1,000 pages indexed in the Index Status report, but the Sitemaps report says something like 122 indexed.
My question: Is the sitemap report always a subset of the URLs submitted in the sitemap? Will the number of pages indexed there always be lower than or equal to the URLs referenced in the sitemap?
Also, if there is a big disparity between the sitemap submitted URLs and the indexed URLs (like 10x) is that concerning to anyone else?
-
Is there a reliable way to determine which pages have not been indexed?
-
Unfortunately not. The closest you'll get is selecting a long period of time in Analytics and exporting all the pages that received organic search traffic. If you then cross-check those against the full list of URLs on your site, you'll be left with a short list of candidates. I would still check each of them in Google to make sure they aren't indexed. As I said, it's not the best way.
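The cross-check described in that answer can be sketched in a few lines: take a plain-text list of your site's URLs (e.g. extracted from your sitemap) and an Analytics CSV export of organic landing pages, and subtract one set from the other. The file names and the "Landing Page" column header are assumptions about how you exported the data, so adjust them to match your export.

```python
import csv


def unindexed_candidates(site_urls_file, analytics_export_file):
    """URLs on the site that never appeared as organic landing pages.

    These are only *candidates* for "not indexed" -- a page can be indexed
    and simply receive no traffic, so verify each one in Google manually.
    """
    with open(site_urls_file) as f:
        site_urls = {line.strip() for line in f if line.strip()}

    with open(analytics_export_file, newline="") as f:
        organic_pages = {row["Landing Page"].strip() for row in csv.DictReader(f)}

    return sorted(site_urls - organic_pages)
```

Calling `unindexed_candidates("sitemap_urls.txt", "organic_landing_pages.csv")` would print the short list the answer mentions, which you can then spot-check in Google.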
-
Great answer by Tom already, but I want to add that images and other types of content, which are mostly not included in sitemaps by default, could also be among the indexed 'pages'.
-
There's no golden rule that says your sitemap URL count must exceed your indexed page count, or vice versa.
If you have more URLs in your sitemap than you have indexed pages, you want to look at the pages that are not indexed to see why that is the case. It could be that those pages have duplicate and/or thin content, and so Google is ignoring them. A canonical tag might be instructing Google to ignore them. Or the pages might be outside the site navigation and more than 4 links/jumps away from the homepage or any other page on the site, making them hard to find.
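That click-depth check can be sketched with a simple breadth-first search over the site's link graph. The link graph below is a hypothetical example; in practice you would build it with a crawler. Pages deeper than a few clicks, or missing from the result entirely (orphans), are the ones worth investigating.

```python
from collections import deque


def click_depths(links, start="/"):
    """BFS from the homepage; returns {url: minimum number of clicks}."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths


# Hypothetical link graph: page -> pages it links to.
links = {
    "/": ["/category", "/about"],
    "/category": ["/category/page-2"],
    "/category/page-2": ["/deep-product"],
}

depths = click_depths(links)
deep_pages = [url for url, d in depths.items() if d > 4]
```

Here `/deep-product` ends up 3 clicks from the homepage; anything that only shows up past the 4-click mark, or never shows up at all, matches the hard-to-find scenario described above.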
Conversely, if you have lots more pages indexed than URLs in your sitemap, it could be a navigation or URL-duplication problem. Check whether any of the indexed pages are duplicate versions caused by things like dynamic URLs generated through on-site search or the site navigation. If the pages in your sitemap are the only physical pages you have created, and you know every single one has been submitted, then any other indexed URLs are unaccounted for. That may well be cause for concern, so check that nothing is being indexed multiple times.
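One quick way to spot the parameter-driven duplicates mentioned above is to strip query strings and group your indexed URLs by their base path; any group with more than one member is a candidate for canonicalization or parameter handling. The URLs here are hypothetical examples.

```python
from collections import defaultdict
from urllib.parse import urlsplit


def duplicate_groups(urls):
    """Group URLs that share a path but differ only in query parameters."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        base = parts.scheme + "://" + parts.netloc + parts.path
        groups[base].append(url)
    # Keep only the bases that resolve to more than one distinct URL.
    return {base: dupes for base, dupes in groups.items() if len(dupes) > 1}


urls = [
    "http://example.com/widgets",
    "http://example.com/widgets?sort=price",
    "http://example.com/widgets?sessionid=123",
    "http://example.com/about",
]

dupes = duplicate_groups(urls)
```

With the sample list, the three `/widgets` variants collapse into one group while `/about` drops out, which is exactly the kind of pattern that inflates the indexed count past the sitemap count.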
Just a couple of scenarios, but I hope it helps.