Anyone managed to decrease the "not selected" graph in WMT?
-
Hi Mozzers.
I am working with a very large e-commerce site that has a big issue with duplicate or near-duplicate content. The site actually received a message in WMT listing out pages that Google deemed it should not be crawling. Many of these were the usual pagination / category-sorting URL issues, etc.
We have since fixed the issue with a combination of site changes, robots.txt, parameter handling and URL removals; however, I was expecting the "not selected" graph in WMT to start dropping.
The number of roboted pages has increased by around 1 million (which was expected), and the number of indexed pages has actually increased despite our removing hundreds of thousands of pages. I assume this is due to freeing up some crawl bandwidth for more important pages like products.
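For anyone making similar robots.txt changes, here is a minimal sketch of how you could sanity-check the rules locally before deploying. The paths and parameters are invented for illustration, and note one caveat: Python's stdlib parser only does prefix matching, so Googlebot-style wildcard patterns like `/*?sort=` won't behave the same here.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for the kind of parameter duplicates described
# above (sort orders, pagination). urllib.robotparser only does
# prefix matching -- wildcards like /*?sort= won't work with it.
rules = """
User-agent: *
Disallow: /category?sort=
Disallow: /category?page=
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The duplicate-generating URLs are blocked...
print(rp.can_fetch("*", "http://example.com/category?sort=price"))  # False
# ...while the canonical category page stays crawlable.
print(rp.can_fetch("*", "http://example.com/category"))             # True
```

Running every URL pattern you intend to block (and a few you don't) through a check like this is a cheap way to catch a rule that accidentally blocks product pages.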
I guess my question is two-fold:
1. Is the "not selected" graph cumulative, as this would explain why it isn't dropping?
2. Has anyone managed to get this figure to significantly drop? Should I even care? I am relating this to Panda by the way.
Important to note that the changes were made around 3 weeks ago and I am aware not everything will be re-crawled yet.
Thanks,
Chris -
Very interesting. I'm also convinced the "not selected" graph is a big clue towards a Panda penalty. I guess I will have to wait another couple of weeks to see if our changes have affected the graph. Maybe this time lag is why it can take upwards of 6 months to recover from Panda!
-
Hi Chris
Here is some helpful information about the "Not Selected" data in WMT. I hope this post helps you understand the Not Selected graph: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2642366
-
The "Not Selected" count isn't cumulative. The "Ever Crawled" count is, though.
I have a large WordPress content site. It was hit by Panda on the very same day that my "not selected" count multiplied by 8. I don't think that was a coincidence, and I hadn't made any large changes to the site besides the regular addition of about 10 posts per week.
I've been able to effect a downward movement in the not-selected count by removing/redirecting things like "replytocom" variable URLs in the comments section, reworking the print and email versions of each article, etc. It's very slow though, only reducing by an average of 100 per week.
Needless to say, I think the not-selected metric means quite a lot.
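For anyone tackling the same "replytocom" duplicates, the canonicalization step might look something like this. This is only a hedged sketch, not the poster's actual implementation: it assumes the duplicates differ from the canonical page only by one query parameter, and just strips it so you can build a 301 redirect map.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_param(url, param="replytocom"):
    """Return the URL with the given query parameter removed,
    preserving any other parameters -- e.g. to map each duplicate
    comment URL to its canonical post for a 301 redirect."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_param("http://example.com/post/?replytocom=42"))
# http://example.com/post/
print(strip_param("http://example.com/post/?a=1&replytocom=42"))
# http://example.com/post/?a=1
```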
Related Questions
-
In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
Google Search Console is telling us that there are 5,193 sitemap "issues" - URLs that are present on the XML sitemap but blocked by robots.txt. However, there are only 1,222 total URLs submitted on the XML sitemap. I only found 83 instances of URLs that fit their example description. Why is the number of "issues" so high? Does it compound over time as Google re-crawls the sitemap?
Intermediate & Advanced SEO | FPD_NYC -
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages worth noindexing that will speed up the process but not potentially harm any valuable pages? Current plan of action is to pull data from analytics, and if a URL hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs. deleting them? What is the preferred method, and why?
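The plan described above can be sketched as a simple filter. This is a hypothetical example: the field names are invented, not taken from any particular analytics or link tool, but it shows the key idea of only flagging pages that have both zero traffic and zero external links, which addresses the link-equity concern.

```python
def prune_candidates(pages, min_sessions=1):
    """Pick noindex candidates: pages with no organic traffic in the
    lookback window AND no external links pointing at them. `pages`
    is a list of dicts merged from analytics and link-index exports;
    the field names here are illustrative only."""
    return [
        p["url"] for p in pages
        if p["sessions_12mo"] < min_sessions and p["external_links"] == 0
    ]

pages = [
    {"url": "/guide/widgets", "sessions_12mo": 840, "external_links": 3},
    {"url": "/tag/misc-47",   "sessions_12mo": 0,   "external_links": 0},
    {"url": "/old-promo",     "sessions_12mo": 0,   "external_links": 2},  # kept: has links
]
print(prune_candidates(pages))  # ['/tag/misc-47']
```

At 40k+ pages per site this stays fast, and anything excluded only because of inbound links can go into a separate pile for manual review or 301s.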
Intermediate & Advanced SEO | atomiconline -
Please select one, out of two
Which theme is more SEO friendly and Fast loading? Both on desktop and Mobile http://demo.mythemeshop.com/blogging/2014/03/26/age-steel/ Or http://demo.tagdiv.com/newsmag/td-post-cruise-2015-swim-trend-blurred-lines/
Intermediate & Advanced SEO | Hall.Michael -
H Tags Vs "H Style" Tags?
Hey everybody! So I was wondering what the difference is between H tags and "H style" tags. My first thought is that it's just the style guide, and not actually a meta tag, but before I go around changing all these styles I want to make sure my computer isn't going to explode SEO juice. Thanks!
Intermediate & Advanced SEO | HashtagHustler -
Google WMT Turning 1 Link into 4,000+ Links
We operate 2 ecommerce sites. The About Us page of our main site links to the homepage of our second site. It's been this way since the second site launched about 5 years ago. The sites sell completely different products and aren't related besides both being owned by us. In Webmaster Tools for site 2, it's picking up ~4,100 links coming to the home page from site 1. But we only link to the home page 1 time in the entire site and that's from the About Us page. I've used Screaming Frog, IT has looked at source, JavaScript, etc., and we're stumped. It doesn't look like WMT has a function to show you on what pages of a domain it finds the links and we're not seeing anything by checking the site itself. Does anyone have experience with a situation like this? Anyone know an easy way to find exactly where Google sees these links coming from?
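One way to script the check described above with nothing but the standard library: count anchors pointing at the target URL in each page's raw HTML. This is a rough sketch with invented HTML and URLs; it only sees links present in the source, so a link injected by JavaScript or repeated by a shared template (header, footer, widget) on every page - which would multiply the WMT count exactly like this - is precisely what running it across the whole site would surface.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> values pointing at a target URL in one page's
    HTML. Run over every page of site 1 to see where the thousands
    of links to site 2 actually come from."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "a" and (dict(attrs).get("href") or "").startswith(self.target):
            self.count += 1

# Invented example: a footer widget plus the About Us link means
# this one page already contributes 2 links, not 1.
html = """
<div id="footer"><a href="http://site2.example.com/">Our other store</a></div>
<p>About us... <a href="http://site2.example.com/">visit site 2</a></p>
"""
counter = LinkCounter("http://site2.example.com/")
counter.feed(html)
print(counter.count)  # 2
```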
Intermediate & Advanced SEO | Kingof5 -
'Select your country' page leading to high Temporary Redirects
Hello all, I manage an ecommerce website and product prices are shown depending on what country you select. When a user does a product search or lands on a product page, they are immediately redirected to a 'select your country' page. After selecting their option, the user is redirected back to the product or search result page. The problem I face is that this is leading to a high 'Temporary Redirects' count in my crawl diagnostics page. Looking at the list of temporary redirects, 90% are users being bounced to a 'select your country' page. Any advice on tackling this? Have you faced anything similar? Thanks Cyto
Intermediate & Advanced SEO | Bio-RadAbs -
How to remove "Results 1 - 20 of 47" from Google SERP Snippet
We are trying to optimise our SERP snippet in Google to increase CTR, but we have this horrid "Results 1 - 20 of 47" in the description. We feel this gets in the way of the message and so wish to remove it, but how?? Any ideas apart from removing the paging from the page?
Intermediate & Advanced SEO | speedyseo -
URL rewriting with "-" or with a space ?
Hi, which URL should I use for my website, and why? 1: http://www.test.com/how-are-you.html 2: http://www.test.com/how are you.html Thanks
Intermediate & Advanced SEO | nipponx