Why do valid pages in GSC show 5x more pages than a site:domain.com search?
-
Hi mozzers,
When checking the Coverage report in GSC I am seeing over 649,000 valid pages (https://cl.ly/ae46ec25f494), but when performing a site:domain.com search I am only seeing about 130,000 pages.
Which one is the source of truth, especially since I have checked some of these "valid" pages and noticed they're not even indexed?
-
Hi there!
On the one hand, the number of results a site: search returns is neither exact nor current; it is only a rough indication of how many results there are. Google has said the operator is optimized for speed, which is why the count is an estimate. You should trust what's reported in Search Console.
On the other hand, it's possible that Google checked a URL, considered it valid at one point, and then didn't index it because of canonicals, similar content, or some other reason. If you find a URL that is not indexed yet is reported as indexed in the Coverage report, try checking it with the URL Inspection tool and then using the Test Live URL option. You may find some answers there.
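If you have a lot of URLs to spot-check, the Search Console URL Inspection API automates what the Inspect URL tool does in the interface. A minimal sketch (the OAuth token and property URL are placeholders you would supply yourself, and the API is quota-limited, so don't try to run it over all 649,000 pages):

```python
import json
import urllib.request

# Search Console URL Inspection API endpoint (mirrors the
# Inspect URL tool in the GSC interface).
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspection_request(page_url, site_url):
    """Build the JSON body the inspection endpoint expects."""
    return {"inspectionUrl": page_url, "siteUrl": site_url}

def coverage_state(page_url, site_url, oauth_token):
    """Return the coverage state Google reports for one URL,
    e.g. 'Submitted and indexed' or 'Crawled - currently not indexed'."""
    body = json.dumps(build_inspection_request(page_url, site_url)).encode()
    request = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {oauth_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    return result["inspectionResult"]["indexStatusResult"]["coverageState"]
```

Running this over a sample of the "valid" URLs would tell you how many of them Google actually reports as indexed right now.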
Hope it helps.
Best of luck.
Gaston
Related Questions
-
Webshop landing pages and product pages
Hi, I am doing extensive keyword research for the SEO of a big webshop. Since this shop sells technical books and software (legal books, tax software and so on), I come across a lot of very specific keywords for separate products. Isn't it better to try to rank in the SERPs with all the separate product pages, instead of with the landing (category) pages?
Intermediate & Advanced SEO | Mat_C
-
Google Indexed Site A's Content On Site B, Site C etc
Hi All, I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have the default SSL domain set on the server. You could access Site B with any page path from Site A and it would pull up properly. I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C and D to Site A for anything https, since Sites B, C and D are not using an SSL cert. My question is: how can I trigger Google to re-index all of the sites to remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly. I have resubmitted my sitemap but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great. Thanks
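One way to sanity-check that the fix is in place is to confirm that each duplicate host answers https requests with a 301 whose Location points at Site A. A minimal sketch of that check (the canonical host is taken from the question; how you fetch the status code and Location header is up to you):

```python
import urllib.parse

CANONICAL_HOST = "www.ericreynolds.photography"  # Site A, from the question

def is_301_to_canonical(status_code, location_header):
    """True if a response is a permanent redirect whose Location
    points at the canonical host (Site A)."""
    if status_code != 301:
        return False
    target_host = urllib.parse.urlsplit(location_header).hostname
    return target_host == CANONICAL_HOST
```

If any of Sites B, C or D still returns its own content over https instead of a 301, Google has little signal to drop the wrong listings, no matter how often the sitemap is resubmitted.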
Eric
Intermediate & Advanced SEO | cwscontent
-
Why does Google display the home page rather than a page which is better optimised to answer the query?
I have a page which (I believe) is well optimised for a specific keyword (URL, title tag, meta description, H1, etc.), yet Google chooses to display the home page instead of the page more suited to the search query. Why is Google doing this, and what can I do to stop it?
Intermediate & Advanced SEO | muzzmoz
-
Better Domain and Page Authority Than My Competitors
Hi All, I have a pretty extensive question but wanted a starting point, if you don't mind. I have a situation where I created 4 sites that I would say are almost identical, other than that I have loaned my other websites to other agents. My content is rewritten but it's still roughly the same. You will see, when I give the URLs, that they are similar and almost identical in templates. My question is going to be: since I have built some authority on all of these sites, is it wise to simply take them down, or just change the templates and take away the content and start over? If so, what do I do with the existing pages? Or is there a better idea I'm not thinking of? My other question concerns this site: goo.gl/Tf00rc. It is my main site. It has higher domain authority and page authority than any of my local competitors, yet I'm still ranked #13-15 for my main keywords. I will say, many of my competitors have older domains, and I'm sure they didn't try to manipulate the SERPs either. Thoughts and recommendations? Here are my other similar sites, which have almost identical templates and very similar (but not copied and pasted) content: 1. goo.gl/Wwb0Tg 2. goo.gl/3gpR1X 3. goo.gl/FwD8Bk 4. goo.gl/vpuQv2 My dilemma: I want to make sure that my other agents have a great site that can perform well too. If I completely remove these sites, they have no site. Right now the sites that get the most traffic are goo.gl/Tf00rc and goo.gl/Wwb0Tg, then goo.gl/3gpR1X, and lastly goo.gl/FwD8Bk; they get about 3k, 2k, 1k and 500 visits a month respectively. The total visits across all of these is pretty good; I feel the maximum in my market would be around 10k visits per month. Any help would be greatly appreciated, as I have spent a lot of time and money getting these sites where they are, only to be penalized, I'm sure, for duplicate content.
Intermediate & Advanced SEO | Veebs
-
Pages are being dropped from the index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/ Hi guys, About us: We are running an AngularJS SPA for property search.
Intermediate & Advanced SEO | emre.kazan
Being an SPA and an entirely JavaScript application has proven to be an SEO nightmare, as you can imagine.
We are currently implementing the _escaped_fragment_ approach and serving a pre-rendered version using PhantomJS.
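For context, under that (now-deprecated) AJAX crawling scheme, a crawler rewrites a hashbang (#!) URL into an _escaped_fragment_ URL before requesting it; a rough sketch of the mapping (an illustration of the scheme, not Plentific's actual code):

```python
import urllib.parse

def to_escaped_fragment(url):
    """Rewrite a hashbang (#!) URL into the _escaped_fragment_ form
    that crawlers requested under the deprecated AJAX crawling scheme."""
    if "#!" not in url:
        return url
    base, _, fragment = url.partition("#!")
    # Append to an existing query string if the base URL has one.
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + urllib.parse.quote(fragment, safe="/")

# e.g. "https://plentific.com/#!search/london"
#  ->  "https://plentific.com/?_escaped_fragment_=search/london"
```

The server is expected to answer the rewritten URL with a pre-rendered HTML snapshot, which is the step PhantomJS handles here.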
Unfortunately, pre-rendering of the pages takes some time and even worse, on separate occasions the pre-rendering fails and the page appears to be empty. The problem: When I manually submit pages to Google, using the Fetch as Google tool, they get indexed and actually rank quite well for a few days and after that they just get dropped from the index.
Not getting lower in the rankings but totally dropped.
Even the Google cache returns a 404. The questions: 1) Could this be because we serve an "escaped_fragment" version to the bots (bear in mind it is identical to the user-visible one)? 2) Could using an API to fetch our results lead to the pages being considered "duplicate content"? And shouldn't that just lower the SERP position rather than cause a complete drop? 3) Could this be a technical problem with how we serve the content, or does Google simply not trust sites served this way? Thank you very much! Pavel Velinov
SEO at Plentific.com
-
Other domains hosted on same server showing up in SERP for 1st site's keywords
For the website in question, the first domain alphabetically on the shared hosting space, strange search results are appearing in the SERP for keywords associated with the site. Here is an example: a search for "unique company name" shows www.uniquecompanyname.com as the top result. But on pages 2 and 3, we are getting results for the same content, but for domains hosted on the same server. Here are some examples with the domain names replaced:

UNIQUE DOMAIN NAME PAGE TITLE
ftp.DOMAIN2.com/?action=news&id=63
META DESCRIPTION TEXT

UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN3.com/?action=news&id=120
META DESCRIPTION TEXT 2

UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN4.com/?action=news&id=120
META DESCRIPTION TEXT 2

UNIQUE DOMAIN NAME PAGE TITLE 3
mail.DOMAIN5.com/?action=category&id=17
META DESCRIPTION TEXT 3

ns5.DOMAIN6.com/?action=article&id=27

There are more, but those are just some examples. The other domain names listed are other customers' domains on the same shared VPS. When clicking a result, the browser URL still shows the other customer's domain name, but the content is usually the 404 page. The page title and meta description on that page are not displayed the same as on the SERP. As far as we can tell, this is the only domain this is occurring for. So far, no crawl errors detected in Webmaster Tools, and the Moz crawl has not completed yet.
Intermediate & Advanced SEO | Motava
-
Should I start new domain and redirect site?
My rankings for http://www.top-10-dating-reviews.com (some adult content) recently dropped off a cliff. Google tells me there's no manual penalty, so it might be algorithmic. I don't know why my rankings went, but I think it could be that I added A LOT of category pages pulling the same content from posts, and this could have caused both duplicate content issues and too many on-page links, triggering an algorithmic penalty. I've deleted the categories and therefore fixed the duplicate content issue (perhaps you guys could check out the site and see whether you agree with me), but rankings have not improved even though most of the pages have been recrawled. I read somewhere it's extremely hard to recover from such a penalty, so should I move my site to a new domain and redirect all URLs? I can't think of another solution. Any help appreciated!
Intermediate & Advanced SEO | SamCUK
-
Does it make sense to combine multiple news sites under one domain?
My company operates multiple news site brands across several markets and typically operates one website per market/brand. We've been advised that in the case of one of the newer markets, where each brand might only be publishing 3 articles a week, that combining all that market's brands under one domain will drive higher traffic from search because Google values higher article flow. Getting this done will be a lot of work. Is the juice worth the squeeze? And assuming we get the UX right and execute 301 redirects correctly, are there major risks to search traffic?
Intermediate & Advanced SEO | AlecD