Webmaster Tools Indexed pages vs. Sitemap?
-
Looking at Google Webmaster Tools, I'm noticing a pattern: on most sites I look at, the number of indexed pages in the Sitemaps report is less than 100% of what was submitted (e.g., 122 indexed out of 134 submitted), while the number in the Index Status report is usually higher. For example, one site shows over 1,000 pages indexed in the Index Status report, but the Sitemaps report says only about 122 are indexed.
My question: is the indexed count in the Sitemaps report always a subset of the URLs submitted in that sitemap? Will the number of pages indexed there always be lower than or equal to the number of URLs in the sitemap?
Also, if there is a big disparity between the sitemap-submitted URLs and the indexed URLs (like 10x), is that concerning to anyone else?
-
Unfortunately not. The closest you'll get is selecting a long date range in Analytics and exporting all the pages that received organic search traffic. If you then cross-check those against the full list of URLs on your site, the pages with no organic traffic give you a short list of candidates. I would still check each of those in Google to make sure they aren't indexed. As I said, it's not the best way.
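One way to automate that cross-check is to diff the URLs in your sitemap against the landing pages in a GA organic-traffic export. A rough sketch in Python; the sitemap XML and CSV contents below are made-up samples standing in for your real files:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per sitemaps.org.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Collect every <loc> URL from a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iterfind("sm:url/sm:loc", NS)}

def ga_landing_pages(csv_text, domain):
    """Read the Landing Page column from a GA organic-traffic CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {domain + row["Landing Page"] for row in reader}

# Hypothetical sample data standing in for your real sitemap and export.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about/</loc></url>
  <url><loc>https://example.com/orphan/</loc></url>
</urlset>"""
ga_csv = "Landing Page,Sessions\n/,120\n/about/,45\n"

# Sitemap URLs with no organic traffic: the candidates to spot-check in Google.
candidates = sitemap_urls(sitemap_xml) - ga_landing_pages(ga_csv, "https://example.com")
```

As Tom says, a page in `candidates` isn't necessarily unindexed (it may just get no traffic), so treat this as a shortlist to verify manually.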
-
Is there a reliable way to determine which pages have not been indexed?
-
Great answer by Tom already, but I want to add that images and other types of content that usually aren't included in sitemaps by default could also be among the indexed 'pages'.
-
There's no golden rule that sitemap URLs > indexed pages, or vice versa.
If you have more URLs in your sitemap than you have indexed pages, look at the pages that aren't indexed to work out why. It could be that those pages have duplicate and/or thin content, and so Google is ignoring them. A canonical tag might be instructing Google to ignore them. Or the pages might sit outside the site navigation, more than 4 links/jumps away from the homepage or any other page on the site, making them hard to find.
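If you want to check that click-depth point programmatically, a breadth-first search over your internal-link graph gives the minimum number of clicks from the homepage to every page. A minimal sketch, assuming you've already crawled the internal links into a dict (the example graph here is hypothetical):

```python
from collections import deque

def page_depths(links, home="/"):
    """BFS over an internal-link graph {page: [pages it links to]}.
    Returns the minimum number of clicks from the homepage to each
    reachable page; pages missing from the result are orphans."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical crawl output.
links = {
    "/": ["/products/", "/blog/"],
    "/products/": ["/products/a/"],
    "/products/a/": ["/products/a/specs/"],
}
depths = page_depths(links)
# Pages more than 4 clicks deep (or absent entirely) are hard to find.
deep_pages = [p for p, d in depths.items() if d > 4]
```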
Conversely, if you have far more pages indexed than in your sitemap, it could be a navigation or URL duplication problem. Check whether any of the indexed pages are duplicate versions created by things like dynamic URLs generated through on-site search or the site navigation. If the pages in your sitemap are the only physical pages you have created, and you know every single one has been submitted, then any other indexed URLs are unaccounted for. That may well be cause for concern, so check that nothing is being indexed multiple times.
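One quick way to spot dynamic-URL duplicates in a list of indexed URLs is to normalise each URL (drop tracking/search parameters, fragments, and trailing slashes) and group by the result. A sketch; the `NOISE_PARAMS` set is an assumption you'd tune per site:

```python
from collections import defaultdict
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Query parameters that commonly spawn duplicate URLs (assumed; adjust per site).
NOISE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort", "q"}

def normalize(url):
    """Canonicalise a URL: lowercase host, strip noise params, fragment, trailing slash."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, urlencode(query), ""))

def duplicate_groups(urls):
    """Group URLs that collapse to the same canonical form."""
    groups = defaultdict(list)
    for url in urls:
        groups[normalize(url)].append(url)
    return {k: v for k, v in groups.items() if len(v) > 1}

# Hypothetical sample of indexed URLs.
indexed = [
    "https://example.com/widgets/",
    "https://example.com/widgets?utm_source=news",
    "https://example.com/widgets/?sort=price",
    "https://example.com/contact/",
]
dupes = duplicate_groups(indexed)
```

Each group in `dupes` is one physical page indexed under several URLs, which is exactly the situation a canonical tag (or parameter handling in Webmaster Tools) should clean up.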
Just a couple of scenarios, but I hope it helps.