Google Webmaster Tools - spike in 'not selected' under Index Status
-
Hi fellow mozzers
Has anyone seen a huge shift in the number of pages 'Not Selected' under Index Status in Google WMT, and been able to identify what the problem has been?
My new client recently moved their site to WordPress, and in doing so the number of pages 'not selected' rose from ~200 to ~1100. It was high before, but it's ridiculous now.
I am thinking there must be a new duplicate content issue which should be cleaned up in my quest to improve their SEO.
Could it be the good old WP tag/category issue? In which case I won't worry, as Joost's plugin is doing its job of keeping that stuff out of the index.
There are loads of image pages which could well appear as duplicates, as they have no content on them (I do need to fix this), but Google is already indexing those, so that doesn't explain the ones 'not selected'.
I've tried checking for duplicate title tags, but there are very few of them, so that doesn't help.
Any other ideas on how to identify what these problem pages may be?
Thanks very much!
Wendy
-
Nice to see another ham on here! 33s and 73s from N6TME.
-
Thanks n1ar
There are only 54 crawl errors and 1,133 not selected, so unfortunately that's not the problem.
I thought it must be a duplicate content issue, but I cannot find a tool that will help me identify what the pages actually are! I wondered whether they are pages blocked by robots.txt; that count shows as 0, but I know a load of WP pages are blocked. How did you manage to identify the pages when it happened to you?
And thanks for that article Keri - always good to know you are not alone!
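For identifying the duplicates themselves, one low-tech option is a small script that groups pages by their title tag (swapping in a hash of the body text works the same way for true content duplicates). A minimal sketch in Python, assuming you have already fetched the HTML for each URL with urllib or requests:

```python
# Group a site's pages by <title> to surface duplicate-content candidates.
# Fetching is left out; `pages` maps each URL to its already-fetched HTML.
from collections import defaultdict
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def extract_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()


def duplicate_titles(pages):
    """pages: {url: html}. Returns {title: [urls]} for titles used 2+ times."""
    groups = defaultdict(list)
    for url, html in pages.items():
        groups[extract_title(html)].append(url)
    return {title: urls for title, urls in groups.items() if len(urls) > 1}
```

Any title that comes back mapped to more than one URL is a candidate for a canonical tag, a noindex, or a cleanup.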
-
Hi Wendy,
I have seen the same thing happen on several sites. In one case, it was an abundance of pages sneaking through that were not part of the site but had somehow been linked to, so the crawlers were finding them. Robots.txt entries did a lot to reduce the 'not selected' count in that case.
In another case it was a lot of duplicate content, but not on a WordPress site, so I'm not familiar with the tag/category issue.
What I would do is download the list of all crawl errors from WMT and do whatever needs doing to correct as many of them as you possibly can.
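A quick way to triage that crawl-error download is to tally it by response code so the biggest buckets are obvious. A minimal Python sketch; the column names here ("URL", "Response Code") are assumptions, so match them to the headers in your actual WMT export:

```python
# Summarise a crawl-error CSV export by response code.
import csv
from collections import Counter
from io import StringIO


def error_summary(csv_text):
    """Return a Counter of response codes from a crawl-error CSV."""
    reader = csv.DictReader(StringIO(csv_text))
    return Counter(row["Response Code"] for row in reader)


# Inline sample standing in for the downloaded export.
sample = """URL,Response Code
http://example.com/old-page,404
http://example.com/moved,404
http://example.com/broken,500
"""
print(error_summary(sample))  # -> Counter({'404': 2, '500': 1})
```

From there you can fix the biggest bucket first (redirects for 404s, server fixes for 5xx) rather than working through the list one row at a time.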
-
I don't have an answer for you, but you're not the only one seeing this. Barry writes about it on Search Engine Roundtable at http://www.seroundtable.com/google-not-selected-16007.html
Related Questions
-
Variables in Google Tag Manager
Last week I got a great answer here on how to implement GTM and cross-domain tracking. Now that we have that in place, I'm looking for some more advice 😉 I'm wondering: if we push variables into the GTM data layer, can we actually use those variables in reports as well? Do they get recorded in Google Analytics? I'd like to use some of the data we push to the data layer for segmentation purposes. Anybody have an idea how to achieve this?
Reporting & Analytics | jorisbrabants0
-
Google is not indexing all URLs
My website has company and event profiles from 200 countries, so it has a lot of URLs. Back in August 2014, Google used to crawl 90% of the URLs we submitted. Things went wrong when we shifted from http to https: we lost traffic. We are regaining it slowly, but the main concern is that Google still has not indexed all submitted URLs. It has crawled merely 8% of them. The site address is businessvibes.com. Any help would be appreciated.
Reporting & Analytics | irteam0
-
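One thing worth checking after an http-to-https move is that every old http URL answers with a single 301 to its exact https twin, since chained or 302 redirects are a common reason URLs drop out of the index after a protocol migration. A minimal Python sketch of the mapping step; actually issuing the requests and checking the status/Location header is left to urllib or requests:

```python
# Map each legacy http URL to the https address it should 301 to.
from urllib.parse import urlsplit, urlunsplit


def to_https(url):
    """Return the https twin of an http URL; leave anything else untouched."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url
    return urlunsplit(("https",) + tuple(parts[1:]))
```

In practice you would run every URL from the old sitemap through this, request the http version, and flag any that don't respond with exactly one 301 pointing at the computed https address.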
Discrepancy between FB PPC and Google Analytics
This question was answered in 2011 here: http://moz.com/community/q/facebook-ppc-number-of-clicks-according-to-fb-different-than-visits-in-analytics. I wanted to post it to the community to see if anyone has had any new thoughts in the last 3 years. I have been running campaigns on Facebook and seeing dramatic discrepancies between Facebook's and GA's numbers. For example, I ran a Facebook ad campaign for a chiropractor where FB shows 35 clicks to the website, but GA only shows 2! An attorney ran a Facebook promotion and got 4 clients who actually filled out a questionnaire online, but GA only showed 2 visitors exiting off the form-completion page. Is this because the users did not have JS/cookies enabled? Something else? What is the recommended workaround? A tracking URL?
Reporting & Analytics | aj6130
-
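The tracking-URL workaround mentioned at the end can be as simple as appending UTM parameters to every ad's destination URL, so GA attributes the visit to the campaign even when the referrer is lost. A minimal Python sketch; the domain and parameter values are illustrative:

```python
# Append UTM campaign parameters to a landing-page URL, preserving any
# query parameters already on it.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit


def add_utm(url, source, medium, campaign):
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query.update(utm_source=source, utm_medium=medium, utm_campaign=campaign)
    return urlunsplit(parts._replace(query=urlencode(query)))


tagged = add_utm("https://example-chiro.com/book?ref=fb",
                 source="facebook", medium="cpc", campaign="spring_promo")
```

Tagged visits show up under the named campaign in GA regardless of what the referrer says, which at least makes the two systems' numbers comparable even if they never match exactly.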
Pairing webmaster tools with analytics - user access
If you pair Webmaster Tools with an Analytics account, does this automatically allow access for any of the authorised Analytics users, or do they have to be granted access individually from within GWT? Cheers, Dan
Reporting & Analytics | Dan-Lawrence0
-
How does Google measure website bounce rate?
Bounce rate is an SEO signal, but how does Google measure it? Is there any explanation of this? Does Google use Analytics? Maybe the time between 2 clicks in the search results? Thanks
Reporting & Analytics | Max840
-
List all URLs indexed by Google
Hi all, I need a list of all the URLs Google has indexed from my site. I want this in Excel or CSV format. How do I go about getting this? Thanks in advance.
Reporting & Analytics | Will_Craig0
-
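Google doesn't provide a complete export of its index, but a practical approximation is to dump your sitemap URLs to CSV and then spot-check samples with site: queries or the index reports in Webmaster Tools. A minimal Python sketch with an inline sitemap for illustration; normally you would fetch your site's /sitemap.xml first:

```python
# Extract every <loc> URL from a sitemap and write the list to CSV.
import csv
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def sitemap_urls(xml_text):
    """Return the list of URLs in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]


def write_csv(urls, path):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url"])
        writer.writerows([u] for u in urls)


# Inline sample standing in for a fetched /sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
```

The resulting CSV opens directly in Excel. Bear in mind it lists the URLs you have submitted, not the ones Google has actually kept, so it's a starting point for index checks rather than the answer itself.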
Subdomains and SEO - Analytics & Webmaster Tools Setup Help
Any advice on the following greatly appreciated: How do we get multiple subdomains' data into one Google Analytics profile? Can we get multiple subdomains' data into Google Webmaster Tools (and if so, how?), or do we need to set GWT up per subdomain?
Reporting & Analytics | AndyMacLean0
-
Will Google start trimming 'stale' sites rank?
With Google's recent focus on reducing the rank of content farms and low-value sites, I am interested in the SEO view on whether Google will start devaluing stale sites. I find it a bit frustrating that in the top 5 for my main key phrase there is one site that has NO content, just an error, and another blog that has not updated its content in 2 years. How can blogs that do not blog be considered high enough value by Google to rank in the top 5? How can sites that just return 404 or 500 for ALL their pages even be considered a site, let alone rank 2nd? I am interested to see others' experiences and thoughts on 'user experience' clean-ups by Google, and why these types of sites get missed.
Reporting & Analytics | oznappies0