Sudden Change In Indexed Pages
-
Every week I check the number of pages indexed by Google using the "site:" operator. I have set up a permanent (301) redirect from all the non-www pages to www pages.
When I used to run the operator:
non-www pages (i.e. site:mysite.com) would return about 12K results
www pages (i.e. site:www.mysite.com) would return about 36K
Over the past few days, this has reversed: I now get 12K for www pages and 36K for non-www pages.
Things I have changed:
I have added canonical URL links in the header, all have www in the URL.
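For illustration (using mysite.com as a placeholder domain), a canonical link of this kind in the page head looks like:

```html
<head>
  <!-- Points every duplicate URL variant at the preferred www version -->
  <link rel="canonical" href="http://www.mysite.com/some-page/">
</head>
```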
My questions:
Is this cause for concern?
Can anyone explain this to me?
-
Maybe Google includes all sub-domains. I just tested my site, and got the following results.
site:handsomeweb.com 340 results (all have www)
site:www.handsomeweb.com 231 results (all have www)
The difference is that the first query includes pages located at blog.handsomeweb.com.
-
I don't have both resolving, and yes, I did set a preferred domain in my Google account.
Any other ideas?
-
Have you set a preferred domain in Google Webmaster Tools?
I don't know how you could have done the 301s and still get both resolving. Have you run Xenu or some other crawler to ensure the 301s are in place?
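If you don't have a crawler handy, a plain curl check works too. This is a sketch assuming a Unix shell with curl installed, with mysite.com as a placeholder domain; the live command is shown in a comment, and the status/Location parsing below runs against a canned response:

```shell
# With network access you would run, substituting your own domain:
#   curl -sI http://mysite.com/ | grep -iE '^(HTTP|Location)'
# A correct setup prints roughly:
#   HTTP/1.1 301 Moved Permanently
#   Location: http://www.mysite.com/

# The same check, against a canned response so it runs offline:
response='HTTP/1.1 301 Moved Permanently
Location: http://www.mysite.com/'

# First line, second field is the status code
status=$(printf '%s\n' "$response" | awk 'NR==1 {print $2}')
# The Location header names the redirect target
target=$(printf '%s\n' "$response" | awk 'tolower($1)=="location:" {print $2}')

if [ "$status" = "301" ]; then
  echo "301 in place -> $target"
else
  echo "redirect missing or wrong (status=$status)"
fi
```

Anything other than a 301 on the non-www host (a 302, a 200, or no Location header) means the redirect is not doing what you think it is.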
-
We did the 301 redirects from non-www to www when we launched the site.
I have another site that I have done a 301 from www to non-www, and you get 0 results when you search "site:www.mysite.com".
They are both on the same platform, which makes it more confusing!!!
-
inhouseseo
I have looked at several of our sites and see no change in results for site:
You stated: **Things I have changed:**
I have added canonical URL links in the header, all have www in the URL.
I believe what is happening (assuming you added the canonical URLs prior to the change in site: results) is that the shift is a result of the canonical tags you added. However, I am not sure how you could still have an aggregate of 48K pages. Are you sure this is accurate?
If you are showing 12K www pages and 36K non-www pages, I would guess that the 12K are duplicated within the 36K, so you actually have only 36K pages on your site.
Typically, when we encounter a site that has both www and non-www, we select a preferred domain in WMT and do the 301 redirect in the .htaccess file. Once this is done, over a short period, only what we have chosen as the preferred domain (www or non-www) remains in the index.
So, if we started with 1,000 pages of www and 2,000 of non-www, and the preferred choice is non-www, we will end up with 2,000 pages total.
My suggestion would be to go into WMT, select a preferred domain, and do the 301 redirect in the .htaccess file. Once that is done, I believe your problem will be resolved. rel=canonical will not accomplish this in and of itself. Give it a few weeks and check your results.
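For reference, a 301 rule of the kind described above, sketched for Apache's mod_rewrite in .htaccess (substitute your own domain; the exact syntax depends on your server setup):

```apache
RewriteEngine On
# Send every request for the bare domain to the www host, keeping the path
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```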
Best
-
This is something I've been noticing a lot over the past few months. I was literally just about to post this question:
site: command, y u no accurate?!
I feel like the site:domain.com command used to be very accurate in showing you total pages indexed. Recently, I've seen wildly varied results returned.
Of course, it varies based upon the inclusion of "www.", but even without it, I've seen results ranging anywhere from 193K to 8 million pages, and everything in between.
Why the variance? Has anyone else seen this recently?
Related Questions
-
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
I am cataloguing the pages on our website in terms of which focus keyword has been used with the page. I've noticed that some pages repeated the same keyword / term. I've heard that it's not really good practice, as it's like telling google conflicting information, as the pages with the same keywords will be competing against each other. Is this correct information? If so, is the alternative to use various long-winded keywords instead? If not, meaning it's ok to repeat the keyword on different pages, is there a maximum recommended number of times that we want to repeat the word? Still new-ish to SEO, so any help is much appreciated! V.
Intermediate & Advanced SEO | Vitzz1
-
Robots.txt Disallowed Pages and Still Indexed
Alright, I am pretty sure I know the answer is "Nothing more I can do here," but I just wanted to double check. It relates to the robots.txt file and that pesky "A description for this result is not available because of this site's robots.txt". Typically people want the URL indexed and the normal meta description to be displayed, but I don't want the link there at all. I purposefully am trying to robots that stuff outta there.
My question is, has anybody tried to get a page taken out of the index and had this happen: URL still there, but the pesky robots.txt message for the meta description? Were you able to get the URL to no longer show up, or did you just live with this? Thanks folks, you are always great!
Intermediate & Advanced SEO | DRSearchEngOpt
-
Should I set up no index no follow on low quality pages?
I know it is a good idea for duplicate pages, blog tags, etc., but I remember reading somewhere that you can help the overall link juice of a website by adding noindex,nofollow or noindex,follow to low-quality content pages of your website. Is it still a good idea to do this, or was it never a good idea to begin with? Michael
Intermediate & Advanced SEO | Michael_Rock0
-
Google de-indexed a page on my site
I have a site which is around 9 months old. For most search terms we rank fine (including top 3 rankings for competitive terms). Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared altogether from the rankings for over a week. As a test I added a similar page to one of my other sites and it ranks fine. I've checked Webmaster Tools and there is nothing of note there. I'm not really sure what to do at this stage. Any advice would be much appreciated!
Intermediate & Advanced SEO | deelo5550
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

```php
if ($city_page_removed) { // i.e., this city page should be removed
    header("HTTP/1.0 404 Not Found");
    header("Location: http://example.com/state-level-page");
    exit();
}
```

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
-
Any Issues with Changing a Page based on IP address?
Building a site and wondering, if we have one page that changes depending on where/how it is accessed, whether that is a good or bad idea. Thanks in advance!
Intermediate & Advanced SEO | nicole.healthline0
-
301 - should I redirect entire domain or page for page?
Hi, We recently enabled a 301 on our domain from our old website to our new website. On the advice of fellow Mozzers, we copied the old site exactly to the new domain, then did the 301 so that the sites are identical. Question is, should we be doing the 301 as a whole-domain redirect, i.e. www.oldsite.com is now > www.newsite.com, or individually setting each page, i.e. www.oldsite.com/page1 is now www.newsite.com/page1, etc., for each page in our site? Remembering that both old and new sites (for now) are identical copies. Also, we set the 301 about 5 days ago and have verified it's working, but haven't seen a single change in rank from either the old site or the new. Is this because Google likely hasn't re-indexed yet? Thanks, Anthony
Intermediate & Advanced SEO | Grenadi0
Which page to target? Home or /landing-page
I have optimized my home page for the keyword "computer repairs". Would I be better off targeting my links at this page or at an additional page (which already exists) called /repairs? It's possible to rename and 301 this page to /computer-repairs. The only advantage I can see from targeting /computer-repairs is that the keywords are in the target URL.
Intermediate & Advanced SEO | SEOKeith0