Is site: a reliable method for getting a full list of indexed pages?
-
The site:domain.com search seems to show fewer pages than it used to (in both Google and Bing).
This isn't specific to one site; it happens across all sites. For example, I will get "page 1 of about 3,000 results", but by the time I've paged through the results it will end and change to "page 24 of 201 results". In that example, if I look in GSC it shows 1,932 indexed.
Should I now accept that the page count reported by site: is an unreliable metric?
-
Keep in mind that for a site:domain.com search, Google now includes pages from OTHER SITES that are using the canonical tag to point to your site. So, even though it says there are 300 pages indexed, 30 of those pages might be on other sites that use the canonical tag pointing to your site. The number of pages indexed that you're looking at may not be entirely accurate because of this.
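Building on the point above: if you suspect other hosts' pages are being folded into your site: counts, one quick sanity check is to copy the result URLs from a site: query into a list and count how many are actually served from another host. A minimal stdlib sketch (the URLs below are made-up placeholders):

```python
from urllib.parse import urlparse

def split_by_host(result_urls, your_host):
    """Partition result URLs into those on your host and those
    served from other hosts (e.g. pages elsewhere that point a
    canonical tag at your site)."""
    ours, others = [], []
    for url in result_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        (ours if host == your_host else others).append(url)
    return ours, others

urls = [
    "https://www.example.com/a",
    "https://example.com/b",
    "https://syndicator.net/copy-of-a",
]
ours, others = split_by_host(urls, "example.com")
```

This only compares hostnames after dropping a leading "www.", so subdomains of your own site would be counted as "other hosts"; adjust the comparison if that matters for your setup.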
-
I just haven't noticed the page counts drop, but I only use that operator for a general check. I have never paged through all the results, etc. For that I would use any of the crawler tools. It would be interesting to compare a download of the search results, GSC, and then something like Screaming Frog and see what we find.
As soon as I wrote that, I checked our site and saw what you mean. For Google we get "About 281 results," but as I go to the last page of results it changes to "page 13 of 126 results."
Then out of curiosity I tried Bing, and now I am scratching my head: "763 results." When I go to the last possible page I get "247-256 of 256 results." I think that means my 281 results from Google are mostly on Bing!!!! (In case someone does not catch my humor, that last statement can be defined as either jest or sarcasm.)
So, when doing the site: search I get 126 with Google, but Search Console shows 428...
Certainly interesting. I will keep playing with it.
Best
-
Hi Robert,
Thanks for your input.
The reason for doing it is that it's part of an SEO site review process: comparing the pages indexed in Google against a site crawl in a tool like Screaming Frog and against the indexed pages reported in GSC.
In terms of the "page 24 of 201 results" example: when you first run the site:domain.com search, Google gives you an estimated number of results, e.g. 3,000, but as you actually click through the pages you find that the real number of results is lower - sometimes significantly.
-
I am not sure I understand where you say "...it will end and change to 'page 24 of 201 results.'" I have used the site: operator for a long time and I think it is reasonably accurate. One thing I notice is the occasional "some pages have been ... duplicate" notice asking whether you want to see those. So, if you include all of those, what's the magic number?
Is there a reason you want data that demands an exact result? I am not sure of anything that would give you that. The question is what counts as "indexed" within the given search engine. If you crawl with Screaming Frog, etc., you may see pages that are not indexed, so the comparison is not apples to apples. Just curious as to why you want to know the exact number of indexed pages.
Interesting question.
-
Typically, the site: command in Google is unreliable. There are lots of reasons why; one is that there may be pages indexed that aren't "good enough", for whatever reason, to show up in the search results. When we look at the pages indexed for a site, we typically use the site: command, then click a few pages deep and look at the number it shows there (not the first estimate it shows).
For SEO auditing purposes, we're looking to see if there is a significant difference between the number of pages indexed and the number of pages that we find when we crawl the website ourselves.
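The comparison described above boils down to a set difference between two URL lists - say, a Screaming Frog export on one side and a GSC coverage export on the other. A minimal sketch (the example URLs are placeholders; real exports need the same normalization applied to both sides):

```python
def compare_coverage(crawled, indexed):
    """Compare URLs found by your own crawl against URLs a search
    engine reports as indexed. Trailing slashes are normalized so
    /page and /page/ are treated as the same URL."""
    norm = lambda u: u.rstrip("/")
    crawl_set = {norm(u) for u in crawled}
    index_set = {norm(u) for u in indexed}
    return {
        "crawled_not_indexed": sorted(crawl_set - index_set),
        "indexed_not_crawled": sorted(index_set - crawl_set),
    }

report = compare_coverage(
    ["https://example.com/a/", "https://example.com/b"],
    ["https://example.com/a", "https://example.com/c"],
)
```

"crawled_not_indexed" flags pages the engine may be skipping; "indexed_not_crawled" flags indexed URLs your crawl never reached (orphans, old URLs, or pages on other hosts).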
Related Questions
-
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
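One way to double-check this beyond eyeballing is to parse the submitted sitemap and assert the URL is actually listed. A stdlib-only sketch (the inline sitemap below is a stand-in for fetching the real file; a standard urlset namespace is assumed):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_in_sitemap(sitemap_xml):
    """Extract every <loc> value from a urlset sitemap."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)}

sitemap = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.teacherstoyourhome.co.uk/german-tutor/Egham</loc></url>
</urlset>"""
found = urls_in_sitemap(sitemap)
```

If the sitemap you submitted to GSC is actually a sitemap index, the check needs to recurse into each child sitemap; mismatches can also come from small URL differences (trailing slash, http vs https) between the sitemap entry and the indexed URL.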
Technical SEO
-
How to deal with pages no longer present on the site
Hi, we need to cut some destinations from our tour operator's catalog, so basically we need to deal with destination pages and tour pages that are no longer present on the site. What do you think is the best approach to handle these pages without losing ranking? Is it a good approach to 301-redirect these pages to the home page or to the general catalog page, or do you suggest another approach? Thanks for your help!
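One common pattern (a sketch, not the only answer): 301 each retired page to its closest surviving equivalent where one exists, and fall back to the general catalog rather than the home page. The paths below are hypothetical placeholders:

```python
# Hypothetical mapping from retired destination/tour pages to their
# closest surviving equivalents; anything unmapped falls back to the
# general catalog page instead of the home page.
REDIRECT_MAP = {
    "/destinations/atlantis": "/destinations/islands",
    "/tours/atlantis-weekend": "/destinations/islands",
}
CATALOG = "/catalog"

def redirect_target(path):
    """Return the 301 target for a retired page."""
    return REDIRECT_MAP.get(path, CATALOG)
```

Redirecting everything to the home page risks the redirects being treated as soft 404s; a relevant parent page preserves more topical relevance.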
Technical SEO
-
Page missing from Google index
Hi all, One of our most important pages seems to be missing from the Google index. A number of our collections pages (e.g., http://perfectlinens.com/collections/size-king) are thin, so we've included a canonical reference in all of them pointing to the main collection page (http://perfectlinens.com/collections/all). However, I don't see the main collection page in any Google search result. When I search using "info:http://perfectlinens.com/collections/all", the page displayed is our homepage. Why is this happening? The main collection page has a rel=canonical reference to itself (auto-generated by Shopify, so I can't control that). Thanks!
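In cases like this it's worth verifying what canonical the page actually serves, rather than what the platform is assumed to generate. A stdlib-only sketch that parses the canonical out of fetched HTML (the inline page below stands in for a real fetch):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and "canonical" in (a.get("rel") or "").lower().split():
            self.canonicals.append(a.get("href"))

def find_canonicals(page_html):
    parser = CanonicalFinder()
    parser.feed(page_html)
    return parser.canonicals

page = ('<html><head>'
        '<link rel="canonical" href="http://perfectlinens.com/collections/all">'
        '</head><body></body></html>')
canonicals = find_canonicals(page)
```

Finding zero or multiple canonicals, or a canonical pointing somewhere unexpected, would explain this kind of indexing behavior faster than search-result spelunking.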
Technical SEO
-
Test site got indexed in Google - what's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain our sitebuilder works on got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate-content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
Technical SEO
-
Is it bad to have your pages as .php pages?
Hello everyone, Is it bad to have your website pages indexed as .php? For example, the contact page is site.com/contact.php and not /contact. Does this affect your SEO rankings in any way? Is it better to have your pages without the extension? Also, if I'm working with a news site and the URLs are dynamic for every article (i.e. site.com/articleid=2323), should I change all of those dynamic URLs to static? Thank you.
Technical SEO
-
Getting Rid of Duplicate Page Titles After URL Structure Change
I've had all sorts of issues with Google since they dropped us on our head a few weeks ago. Google is crawling again after I made some changes, but they're still not ranking our content like they were, so I have a few questions. I changed our URL structure from /year/month/date/post-title to just /post-title and 301-redirected the old link structure to the new one. When I look, I see over 3,000 duplicate title errors listing both versions of the URL.
1. How do I get Google to crawl the old URL structure, recognize the 301 redirect, and update the index?
2. Google is crawling the site again, but they're not ranking us like they were before. We're in a highly competitive category and I'm aware of that, but we've always been an authority in our niche. We have plenty of quality backlinks, and often we're the originators of the content, which is then rewritten by a trillion websites everywhere. We're not the best at writing and titles, but we're working on it, and this did not matter much to Google previously, as it was ranking us pretty highly on the front page and certainly over many sites that are ranking above us today. Some backlinks: http://www.alexa.com/site/linksin/dajaz1.com A few examples: if you google "twista gucci louis prada" you'll see many of the sites that trackbacked to us since we premiered the song now rank much higher than us; 3 weeks ago we were ranking above them. http://dajaz1.com/twista-gucci-louis-prada/ Google-search "jadakiss consignment mixtape": 3 weeks ago we were ranking higher than all 4 sites now ranking above us. The sites ranking above us even link to us or mention us, yet they rank above us. Original content here: http://dajaz1.com/watch-jadakiss-confirms-cosignment-mixtape-2012-schedule/ I could throw out a ton of examples like this. How do we get Google to rank us again? It should be noted that I'm not using any SEO plugins on the site. I hand-coded what's in there, and I know I can probably do it better, so any tips or ideas are welcome.
I'm pretty sure our issues were caused by the Yoast SEO plugin, as when I search site:dajaz1.com, the pages and topics that display were all indexed while the plugin was active. I've since removed it and all calls to it in the database, but I'm pretty nervous about plugins right now. Which brings me to my third and final question: 3. How do I get rid of the category and topic pages that were indexed and seem to be ranking higher than the rest of our content? I lied, one more: for category URLs I've set it to remove the category base, so the URL is dajaz1.com/news or dajaz1.com/music - is that preferable, or is this causing me issues? Any feedback is appreciated. Also, Google is crawling again (see attached image) but the kilobytes downloaded per day haven't recovered. Should I be concerned about this?
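The 301 map for a structure change like this can be generated mechanically rather than by hand. A sketch, assuming the old permalinks were always numeric, zero-padded /year/month/day/ paths:

```python
import re

# Old permalink structure: /YYYY/MM/DD/post-title
DATED = re.compile(r"^/\d{4}/\d{2}/\d{2}/(?P<slug>[^/]+)/?$")

def new_path(old_path):
    """Return the new /post-title path for an old dated URL,
    or None if the path doesn't match the old structure."""
    m = DATED.match(old_path)
    return "/" + m.group("slug") if m else None

# The date here is a made-up placeholder for illustration.
mapped = new_path("/2012/05/14/twista-gucci-louis-prada/")
```

Running every URL from the old sitemap (or server logs) through a mapper like this and verifying each target returns 200 is a quick way to confirm the 301s cover everything before waiting on a recrawl.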
Technical SEO
-
How do I get rid of irrelevant backlinks pointing to missing pages on my site
Hi all, My site was hacked about a year ago, and as a result I now have a ton of backlinks from irrelevant sites pointing to pages on my site that no longer exist. The followed-backlinks section of the Competitive Domain Analysis tool shows about 3 pages' worth of these horrible links. I have 2 questions: how bad is this for my site's SEO (which isn't good anyway, PageRank 0), and how do I get rid of them? Any help would be much appreciated. Thanks, Andy
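If the linking sites won't remove the links, the usual fallback is a disavow file. A sketch that builds one in the format Google's disavow tool accepts (one URL per line, "domain:" lines to cover whole domains, "#" comments); the domains below are placeholders:

```python
def build_disavow(bad_domains, bad_urls=()):
    """Build the contents of a disavow file: 'domain:' lines for
    whole domains, plus any individual URLs, with a '#' comment."""
    lines = ["# links pointing at pages created during the hack"]
    lines += ["domain:" + d for d in sorted(bad_domains)]
    lines += list(bad_urls)
    return "\n".join(lines) + "\n"

disavow_txt = build_disavow({"spam-widgets.example", "linkfarm.example"})
```

Worth noting that links to pages that no longer exist (returning 404/410) already carry little weight, so disavowing is usually only warranted if the link profile looks manipulative at scale.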
Technical SEO