Could a large number of "not selected" pages cause a penalty?
-
My site was penalized for specific pages in the UK on July 28 (corresponding with a Panda update).
I cleaned up my website and wrote to Google and they responded that "no manual spam actions had been taken".
The only other thing I can think of is that we suffered an automatic penalty.
I am also having problems with my sitemap: it includes many error pages, empty pages, and so on. According to our Index Status report, we have 2,679,794 "not selected" pages and 36,168 total indexed.
Could this be what caused the penalty?
(If you have any articles to back up your answers, that would be greatly appreciated.)
Thanks!
-
Canonical tags pointing to what? Themselves? Or the page they should be? Are these pages unique by some URL variables only? If so, you can instruct Google to ignore specific GET variables to resolve this issue, but you would also want to fix your sitemap woes: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
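To make that concrete, here is a minimal sketch (using a made-up URL) of what the canonical tag should look like on a page that differs only by GET variables. Every parameterised variant points at the single clean version of the page, not at itself:

<!-- served at, e.g., http://www.example.com/widgets?sessionid=abc123&sort=price -->
<link rel="canonical" href="http://www.example.com/widgets" />

That tells Google to consolidate all the variants onto the one clean URL rather than treating each one as a separate near-duplicate page.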
This is where it gets sticky: these pages are certainly not helping and are not being indexed; Google Webmaster Tools shows us that. But if you have this problem, how many other technical problems could the site have?
We can be almost certain you have some kind of Panda filter, but to diagnose it further we would need a link and access to your analytics to determine what has gone wrong and to provide more detailed guidance on resolving the issues.
This could be a red herring and your problem could lie elsewhere, but with no examples we can only give very general responses. If this were my site, I would identify the most likely issues and work through them in a pragmatic way, eliminating possibilities one by one.
My advice would be to have the site analysed by someone with distinct experience with Panda penalties who can give you specific feedback on the problems and provide guidance to resolve them.
If the URL is sensitive and can't be shared here, I can offer this service and am in the UK. I am sure several other users at SEOmoz can also help: I know Marie Haynes offers this service, and I am sure Ryan Kent could help as well.
Shout if you have any questions or can provide more details (or a URL).
-
Hi,
Thanks for the detailed answer.
We have many duplicate pages, but they all have canonical tags on them... shouldn't that be solving the problem? Would pages with a canonical tag still be showing up here?
-
Yes, this can definitely cause problems. In fact, this is a common footprint on sites hit by the Panda updates.
It sounds like you have some sort of canonical issue on the site: multiple copies of each page are being crawled. Google is finding lots of copies of the same thing, crawling them, but deciding that they are not sufficiently unique or useful to keep in the index. I've been working on a number of sites hit by the same issue, and clean-up can be a real pain.
The best starting point for reading is probably this article here on SEOmoz: http://www.seomoz.org/learn-seo/duplicate-content . That article includes some useful links on how to diagnose and solve the issues as well, so be sure to check out all the linked resources.
-
Hey Sarah
There are always a lot of moving parts when it comes to penalties, but the very fact that you lost traffic on a known Panda date really points towards this being a Panda-style penalty. Panda is an algorithmic penalty, so you will not receive any kind of notification in Webmaster Tools, and likewise a re-inclusion request will not help: you have to fix the problem to resolve the issues.
The not selected pages are likely a big part of your problem. Google classes not selected pages as follows:
"Not selected: Pages that are not indexed because they are substantially similar to other pages, or that have been redirected to another URL. More information."
If you have the best part of 3 million of these pages that are 'substantially similar' to other pages, then there is every chance that this is a very big part of your problem.
Obviously, there are a lot of moving parts to this, but it is highly likely this is part of the issue. Just think how it looks to Google: 2.6 million pages that are duplicates. It is a low-quality signal, a possible attempt at manipulation, or who knows what else; what we do know is that those pages are unlikely to be a strong result for any search user, so they have been dropped.
What to do?
Well, firstly, fix your sitemap and sort out these duplication problems. It's hard to give specifics without a link to the site in question, but just get this sorted. Apply the noindex tag dynamically if need be (a minimal example is below), remove these duplicates from the sitemap, or heck, remove the sitemap altogether for a while until it is fixed. One way or another, sort these issues out.
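For reference, the tag itself is just a robots meta element in the head of each duplicate or thin page; a minimal sketch is below (how you detect which pages should carry it depends entirely on your CMS):

<meta name="robots" content="noindex, follow" />

Using "follow" rather than "nofollow" means crawlers can still pass through the links on those pages even though the pages themselves drop out of the index.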
Happy to give more help here if I can but would need a link or some such to advise better.
Resources
You asked for some links. I am not completely sure what to provide without seeing the site, but let me have a shot and offer some general resources:
1. Good General Panda Overview from Dr. Pete
http://www.seomoz.org/blog/fat-pandas-and-thin-content
2. An overview of canonicalisation from Google
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066
3. A way to diagnose and hopefully recover from Panda, from John Doherty at Distilled
http://www.distilled.net/blog/seo/beating-the-panda-diagnosing-and-rescuing-a-clients-traffic/
4. Index Status Overview from Google
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2642366
Summary
You have a serious problem here, but hopefully one that can be resolved. Panda is primarily focused on on-page issues, and this is an absolute doozy of an on-page issue, so sort it out and you should see a recovery. Keep in mind that your sitemap currently has roughly 75 times more problem pages than actual content pages, which may be the biggest case I have ever seen, so I would be very keen to hear how you get on when you resolve these issues, as I am sure would the wider SEOmoz community.
Hope this helps & please fire over any questions.
Marcus