Indexed pages
-
Just started a site audit and trying to determine the number of pages on a client site and whether there are more pages being indexed than actually exist. I've used four tools and got four very different answers...
- Google Search Console: 237 indexed pages
- Google search using the site: command: 468 results
- Moz site crawl: 1013 unique URLs
- Screaming Frog: 183 page titles, 187 URIs (note: this is the free licence, which cuts off at 500 URLs, so the limit shouldn't be the issue here)
Can anyone shed any light on why they differ so much? And where lies the truth?
-
Another option: if the site uses a CMS, you can generate a sitemap of content pages, posts, etc.
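If you want to turn that sitemap into a plain URL list to compare against crawl data, here's a minimal Python sketch. The sitemap URL is a placeholder, and it assumes a single sitemap file rather than a sitemap index:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder - use the client's sitemap

# Sitemap elements live in the sitemaps.org namespace.
LOC_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

with urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

urls = [loc.text.strip() for loc in tree.iter(LOC_TAG) if loc.text]
print(len(urls), "URLs listed in the sitemap")
```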
Personally, I'm with Krzysztof Furtak on SF. Screaming Frog rocks. It'll find most pages, except perhaps orphan pages, since it has no link to crawl to discover them.
If it's really important to get as many pages as possible, I'd do the following (I've put an asterisk (*) next to the ones that some people may think are a tad extreme):
- Run a Screaming Frog crawl
- Grab a sitemap from your CMS
- Check any server-based analytics (AWSTATS etc)
- Check your access_log file & parse out the URLs in there (*) - see the sketch after this list
- site: queries, with & without www, and also using * as a subdomain (use something like Moz's toolbar to export)
- As Krzysztof suggests, Scrapebox would extract data too, but be careful scraping; you may get an IP slap. (*)
- Export crawl data from Moz & a tool such as Deep Crawl
- Throw the pages from all of the above into Excel and de-dupe.
- Once you have a de-duped list, as an optional last step, go back to Screaming Frog and enter list mode (I have the paid version, not sure if it's possible with the free one) and run a crawl over all the de-duped URLs to get status codes, etc.
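If you don't fancy doing the access_log parsing and de-duping by hand, here's a rough Python sketch of the idea. It assumes a standard combined-format Apache/Nginx log, and the file names (access_log, sitemap_urls.txt, crawl_export.txt, deduped_urls.txt) and domain are just placeholders:

```python
import re
from urllib.parse import urljoin

DOMAIN = "https://www.example.com"      # placeholder - use the client's domain
ACCESS_LOG = "access_log"               # combined-format Apache/Nginx log (assumption)
OTHER_SOURCES = ["sitemap_urls.txt", "crawl_export.txt"]  # one URL per line

urls = set()

# Pull requested paths out of the access log, e.g. "GET /about/ HTTP/1.1".
request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')
with open(ACCESS_LOG, encoding="utf-8", errors="ignore") as log:
    for line in log:
        match = request_re.search(line)
        if match:
            path = match.group(1).split("?")[0]   # drop query strings
            if not path.endswith((".jpg", ".png", ".gif", ".css", ".js")):
                urls.add(urljoin(DOMAIN, path))

# Merge in URL lists exported from the CMS sitemap, Moz, DeepCrawl, etc.
for source in OTHER_SOURCES:
    with open(source, encoding="utf-8", errors="ignore") as f:
        urls.update(line.strip() for line in f if line.strip())

# De-duped list, ready to paste into Screaming Frog's list mode.
with open("deduped_urls.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(urls)))

print(len(urls), "unique URLs collected")
```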
If you're going to do this sort of thing a fair bit - buy a Screaming Frog license, it's an awesome tool and can be useful in a multitude of situations.
-
The site: command is handy for asking Google what pages it knows about; however, if Muzzmoz wants to know the number of pages on a site, you'd need more than this.
Also, re: your different ways of querying, I like to use:
site:*.domain.com - this can show other subdomains too, which may otherwise be missed
-
OK, so try this on a site with under 1,000 pages: run a site: query and click through to the last page of results. You'll see that the number there differs from the estimate on the first page (in almost all cases).
-
I always prefer to check manually using the site: operator, which shows how many pages Google currently has indexed for the domain.
There will be a difference between the Index Status report in Search Console and the current index, as Search Console only updates its data every few days.
The number of indexed URLs is almost always significantly smaller than the number of crawled URLs, because the total indexed count excludes URLs identified as duplicates, non-canonical URLs, and those that contain a meta noindex tag.
Also, check which (preferred) version of your site is indexed, e.g. www vs non-www.
You can read more about this here: https://support.google.com/webmasters/answer/2642366?hl=en
-
Hi
The most accurate number comes from Screaming Frog (the free version if you have fewer than 500 pages, or the paid version if more).
Google indexes what it wants, and only if it thinks a page is good enough to show in its index. If some pages are similar, have quality issues, are blocked by robots.txt, etc., then it won't show them all. BTW, don't assume the number in GSC or in the Google index is right - check it manually, because it can say 468 when in fact only 200 are there.
Moz can include "historical" pages that no longer exist, and it doesn't take quality issues into account.
The truth is in Screaming Frog - it's the most accurate number. If you crawled with the Googlebot user agent, then that number is the maximum that can appear in the Google index. If you crawl with the Screaming Frog user agent and robots.txt obeying turned off, you'll see a bigger number (but Google won't show those pages because of the blocks).
If you want to check what's actually indexed, then use a tool like Scrapebox. First get all the URLs (maybe excluding images if you don't care about them), then check which are indexed with SB. Whatever isn't indexed may have some issues.
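As a complement to an index check, you can pre-screen the de-duped list for common technical reasons a URL might not be indexed (bad status codes, robots.txt blocks, noindex signals). A rough Python sketch, assuming the third-party requests library is installed and reusing the placeholder deduped_urls.txt file from earlier in this thread; it won't catch canonicalisation, duplication, or quality issues:

```python
import urllib.robotparser
from urllib.parse import urlparse

import requests  # third-party: pip install requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
_robots_cache = {}

def robots_blocked(url):
    """True if robots.txt disallows Googlebot from fetching this URL."""
    origin = "{0.scheme}://{0.netloc}".format(urlparse(url))
    if origin not in _robots_cache:
        parser = urllib.robotparser.RobotFileParser(origin + "/robots.txt")
        parser.read()
        _robots_cache[origin] = parser
    return not _robots_cache[origin].can_fetch(GOOGLEBOT_UA, url)

def check_url(url):
    resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
    # Look for noindex in the X-Robots-Tag header or near the top of the HTML
    # (a rough hint only - not a full meta robots parser).
    noindex = ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
               or "noindex" in resp.text[:5000].lower())
    return {"url": url, "status": resp.status_code,
            "robots_blocked": robots_blocked(url), "noindex_hint": noindex}

if __name__ == "__main__":
    with open("deduped_urls.txt") as urls:   # placeholder list built earlier
        for line in urls:
            if line.strip():
                print(check_url(line.strip()))
```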