Sudden Change In Indexed Pages
-
Every week I check the number of pages indexed by Google using the "site:" operator. I have set up a permanent (301) redirect from all non-www pages to www pages.
Until recently, when I ran the operator:
non-www pages (i.e. site:mysite.com) returned about 12K results
www pages (i.e. site:www.mysite.com) returned about 36K results
Over the past few days this has reversed: I now get 12K for www pages and 36K for non-www pages.
Things I have changed:
I have added canonical URL links in the header; all of them use the www version of the URL, as sketched below.
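For clarity, the tag I added to each page's head looks roughly like this (mysite.com and the path are just placeholders for the real URLs):

```html
<!-- Placed in the <head> of every page; the href is a placeholder example -->
<link rel="canonical" href="http://www.mysite.com/example-page/" />
```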
My questions:
Is this cause for concern?
Can anyone explain this to me?
-
Maybe Google includes all sub-domains. I just tested my site, and got the following results.
site:handsomeweb.com 340 results (all have www)
site:www.handsomeweb.com 231 results (all have www)
The difference is that the first query includes pages located at blog.handsomeweb.com.
-
I'm not getting both to resolve (the 301 is in place), and yes, I did set a preferred domain in my Google account.
Any other ideas?
-
Have you set a preferred domain in Google Webmaster Tools?
I don't know how you could have done the 301s and still get both resolving. Have you run Xenu or some other program to ensure the 301s are there?
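For a quick spot check without a crawler (assuming you have curl available; mysite.com is a placeholder), this prints the status code and redirect target for the non-www homepage:

```sh
# Fetch only the response headers and print the HTTP status plus the redirect target
curl -sI -o /dev/null -w "%{http_code} %{redirect_url}\n" http://mysite.com/

# If the permanent redirect is in place you should see something like:
# 301 http://www.mysite.com/
```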
-
We did the 301 redirects from non-www to www when we launched the site.
I have another site where I did a 301 from www to non-www, and that one returns 0 results when you search "site:www.mysite.com".
They are both on the same platform, which makes it more confusing!!!
-
inhouseseo
I have looked at several of our sites and see no change in results for site: queries.
You stated: **Things I have changed:**
I have added canonical URL links in the header, all have www in the URL.
I believe what is happening (assuming you added the canonical URLs before the change in site: results) is that the shift is the result of the canonical tags you added. However, I am not sure how you could still have an aggregate of 48K pages; are you sure that figure is accurate?
If you are showing 12K www pages and 36K non-www pages, I would guess that the 12K are duplicated within the 36K, so you would have only 36K pages on your site in total.
Typically, when we encounter a site that has both www and non-www versions, we select a preferred domain in WMT and set up the 301 redirect in the .htaccess file. Once this is done, over a short period, only the version we have chosen as the preferred domain (www or non-www) remains in the index.
So, if we started with 1,000 www pages and 2,000 non-www pages, and the preferred choice is non-www, we will end up with 2,000 pages total.
My suggestion would be to go into WMT, select a preferred domain, and set up the 301 redirect in the .htaccess file (a rough sketch is below). Once that is done, I believe your problem will be resolved; rel=canonical will not accomplish this by itself. Give it a few weeks and check your results.
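As a rough sketch only (assuming an Apache server with mod_rewrite enabled; mysite.com is a placeholder), the non-www to www rule would look something like this in .htaccess:

```apache
# Redirect every non-www request to the www version with a single permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

Pair that with the preferred-domain setting in WMT and give Google time to recrawl.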
Best
-
This is something I've been noticing a lot over the past few months. I was literally just about to post this question:
site: command, y u no accurate?!
I feel like the site:domain.com command used to be quite accurate in showing the total number of pages indexed. Recently, I've seen wildly varied results returned.
Of course, it varies based on whether you include "www.", but even without it, I've seen results ranging anywhere from 193k to 8 million pages, and everything in between.
Why the variance? Has anyone else seen this recently?