Sudden Change In Indexed Pages
-
Every week I check the number of pages indexed by Google using the "site:" operator. I have set up a permanent (301) redirect from all the non-www pages to the www pages.
When I ran the query:
non-www pages (i.e. site:mysite.com) would return about 12K results
www pages (i.e. site:www.mysite.com) would return about 36K
Over the past few days this has reversed: I now get 12K for www pages and 36K for non-www pages.
Things I have changed:
I have added canonical URL links in the header; all of them use www in the URL.
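For reference, the tag in question looks like this (www.mysite.com is a placeholder for the preferred www host):

```html
<!-- Placed in the <head> of every page; each page points at its own www URL -->
<link rel="canonical" href="http://www.mysite.com/some-page/" />
```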
My questions:
Is this cause for concern?
Can anyone explain this to me?
-
Maybe Google includes all sub-domains. I just tested my site, and got the following results.
site:handsomeweb.com 340 results (all have www)
site:www.handsomeweb.com 231 results (all have www)
The difference is that the first query includes pages located at blog.handsomeweb.com.
-
I don't have both resolving, and yes, I did set a preferred domain in my Google account.
Any other ideas?
-
Have you set a preferred domain in Google Webmaster Tools?
I don't know how you could have done the 301s and still have both resolving. Have you run Xenu or some other tool to ensure the 301s are actually in place?
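Besides Xenu, a redirect can be checked with a short script. The sketch below is hypothetical: since mysite.com is a placeholder, it spins up a local server that answers every request with a 301 to the www host (the behaviour a correct .htaccess rule produces), then confirms the status code and Location header without following the redirect:

```python
import http.server
import threading
import urllib.error
import urllib.request

# Stand-in for the live site: every response is a 301 to the www host,
# which is what a correct non-www -> www .htaccess rule would send.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(301)
        self.send_header("Location", "http://www.mysite.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Disable redirect-following so the 301 itself can be inspected.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect)
request = urllib.request.Request(f"http://127.0.0.1:{port}/page", method="HEAD")
try:
    opener.open(request)
    status, location = None, None
except urllib.error.HTTPError as resp:
    # urllib surfaces the unfollowed 301 as an HTTPError
    status, location = resp.code, resp.headers["Location"]

print(status, location)  # 301 http://www.mysite.com/page
server.shutdown()
```

Against a real site, the same check is just the request-and-inspect half pointed at the non-www URL; anything other than a 301 with a www Location means the redirect isn't doing its job.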
-
We did the 301 redirects from non-www to www when we launched the site.
I have another site where I did a 301 from www to non-www, and you get 0 results when you search "site:www.mysite.com".
They are both on the same platform, which makes it more confusing!!!
-
inhouseseo
I have looked at several of our sites and see no change in results for site:
You stated: **Things I have changed:**
I have added canonical URL links in the header, all have www in the URL.
I believe what is happening (assuming you added the canonical URLs prior to the change in the site: results) is that the shift is a result of the canonical tags you added. However, I am not sure how you could still have an aggregate of 48K pages; are you sure this is accurate?
If you are showing 12K www pages and 36K non-www pages, I would guess that the 12K are duplicated within the 36K. Therefore, you would have only 36K pages on your site.
Typically, when we encounter a site that has both www and non-www we select a preferred domain in WMT and do the 301 redirect in .htaccess file. Once this is done, over a short period, we will have only what we have chosen as the preferred domain www or non www.
So, if we started with 1,000 www pages and 2,000 non-www pages, and non-www is the preferred choice, we will end up with 2,000 pages total.
My suggestion would be to go into WMT, select a preferred domain, and do the 301 redirect in the .htaccess file. Once that is done, I believe your problem will be resolved. rel=canonical will not accomplish this on its own. Give it a few weeks and check your results.
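For anyone looking for the .htaccess side of this, a common sketch looks like the following (assuming Apache with mod_rewrite enabled and www chosen as the preferred domain; mysite.com is a placeholder):

```apache
# Redirect all non-www requests to the www host with a permanent 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which is what lets search engines consolidate the two hostnames; the L flag stops further rule processing for the rewritten request.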
Best
-
This is something I've been noticing a lot over the past few months. I was literally just about to post this question:
site: command, y u no accurate?!
I feel like the site:domain.com command used to be very accurate in showing you total pages indexed. Recently, I've seen wildly varied results returned.
Of course, it varies based upon the inclusion of "www.", but even without it I've seen results ranging anywhere from 193K to 8 million pages, and everything in between.
Why the variance? Has anyone else seen this recently?