Do Panda ranking factors still apply when Google deindexes a page?
-
Here are two scenarios.
Scenario 1
Let's say I have a site with a ton of pages (100,000+) that all have off-site duplicate content, and let's say that none of those pages carry a noindex meta tag.
Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty.
Since those pages are no longer indexed by Google, does the Panda penalty still apply even though they have been deindexed?
Scenario 2
I add a noindex meta tag to all those 100,000+ off-site duplicate content pages. Since Google sees that I have decided not to index them, does the Panda penalty come off?
What I am getting at is this: I have realized that I have a ton of pages with off-site duplicate content. Even though those pages are already not indexed by Google, does simply adding the noindex meta tag to them tell Google that I am trying to get rid of the duplicate content, so that the Panda penalty is lifted?
The pages are useful to my users, so I need them to stay.
Since in both scenarios the pages are not indexed anyway, will Google acknowledge the difference, namely that I am removing them myself, and lift the Panda penalty?
Hope this makes sense
-
I have over 800,000 pages total that contain duplicate content, "if" that is the issue with my definitions. I would assume that Panda would slap me hard for that, again, "if" that is the issue. Since I have never tried to deindex this many pages, I am hoping this works; I will take a few coffee breaks while I wait, because it's going to be a while, lol.
I have nothing to lose, and I feel like I have tried a ton. Thanks so much!
-
"Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty."
Panda will not deindex pages. It might move them to the supplemental index, but they're not deindexed. Technically, Panda is not a penalty. It's an algorithmic demotion. If you've got a bunch of duplicate content, Google may choose not to index some of that content, or, more likely, to just show users the most appropriate page of that content.
Now, if Panda has affected your site because Google feels that the site consists mostly of duplicate or thin content, then you'll need to noindex, significantly change, or remove that content in order for Google to see that the quality has improved. You can't just say that the content is essentially gone because Google is not showing it; that wouldn't change the factors that caused you to be affected by Panda. (Now, this is assuming that this is what the problem is, because we don't know that.)
"I add a rel="noindex" to all those 100,000+ off site duplicate content pages. Since Google sees that I have decided to not index them does the Panda penalty come off?"
If these pages were the primary reason for Panda to visit your site, then what would happen is that as Google recrawls your site they will start to recognize that the quality is improved. Then, at some point with a future Panda refresh (it may take several if there is a lot of content to crawl), you should see an increase in traffic. If the duplication was the only factor that Panda was concerned about then you'd likely see a dramatic improvement. If it was just one of the factors, you might see a smaller improvement. If you had a lot of factors, you may see very little or just some improvement.
If I understand the question right, I would say that the answer is to go ahead and add the noindex tag to these pages.
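For reference, "noindex" is not a rel attribute; it is normally set with a robots meta tag in each page's <head>, or with an X-Robots-Tag HTTP response header, along these lines:
<meta name="robots" content="noindex, follow">
or, as a response header:
X-Robots-Tag: noindex
Keep in mind that Google has to be able to crawl a page to see either directive, so don't also block those URLs in robots.txt.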
-
No offense, man; I really want to figure out what the heck happened with my site. I really feel like I was hit by unfortunate circumstances.
My website is http://www.freescrabbledictionary.com/
The duplicate content I am referring to is the word definitions, which I generate from an API provided by https://www.wordnik.com/
I do cite the source at the bottom of every definition page (which was required by https://github.com/wordnik); an example is http://www.freescrabbledictionary.com/dictionary/word/testing/
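Roughly, each definition page is generated from an API lookup along these lines (a simplified Python sketch rather than my actual code; it assumes Wordnik's v4 definitions endpoint and uses a placeholder API key, so check Wordnik's current docs for the exact path and fields):

import requests

API_KEY = "YOUR_WORDNIK_API_KEY"  # placeholder, not a real key

def fetch_definitions(word, limit=5):
    # Assumed Wordnik v4 definitions endpoint; verify against the current API docs.
    url = f"https://api.wordnik.com/v4/word.json/{word}/definitions"
    resp = requests.get(url, params={"limit": limit, "api_key": API_KEY}, timeout=10)
    resp.raise_for_status()
    # Each entry typically carries the definition text and its source dictionary,
    # which is what gets cited at the bottom of each page.
    return [(d.get("sourceDictionary"), d.get("text")) for d in resp.json()]

if __name__ == "__main__":
    for source, text in fetch_definitions("testing"):
        print(f"{source}: {text}")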
I have never had a manual penalty from Google; I check Google Webmaster Tools all the time. I also use tools like Google Analytics as well as moz.com, ahrefs.com, and monitorbacklinks.com.
I used to rank for the keyword "scrabble dictionary" in the top 4 spots on average. For a long time I was #2, and that keyword was my biggest source of traffic.
I remember that when the first Panda update came out I was not hit. I noticed the negative changes in my rankings after the second Panda update and onward. Since Penguin was in the mix as well, I can't even tell if I was hit with Penguin.
I never paid for or did black-hat backlinking.
Again, I was never hit with a manual penalty; this is 100% algorithmic.
If you search the keyword "scrabble dictionary", my homepage does not rank for it at all, not anywhere in the search results, where I used to hold a top-4 spot.
Since I have been hit so hard and have nothing to lose, I have noindexed 100% of the word definition, sentence example, and quote pages; even though those are not copied (except for the definitions), I did that just in case. This accounts for about 90% of my site's pages indexed by Google.
I have changed my site design to account for the "refresh" ranking factor, I have desperately combed through my site 1,000,000 times trying to figure out what happened, and I have disavowed links tenfold, yet nothing seems to affect my rankings. At this point I will try anything... I have nothing to lose.
-
Can you describe what happened to your site and why you believe you got a penalty?
Make sure to type site:www.example.com into Google; it will show you what is indexed.
Be certain that you do not have a robots.txt file or something similar blocking your website. Go to https://www.feedthebot.com/tools/, type your domain in, and it will tell you whether you are blocking anything with your robots.txt. Do this for the URLs that you think are not indexed.
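For example, a robots.txt rule like the following (a hypothetical illustration using the /dictionary/ path from your example URL, not something I know is in your file) would stop Googlebot from crawling those pages entirely, and a page that cannot be crawled will never have its noindex tag seen:

User-agent: *
Disallow: /dictionary/

So if you want Google to see and act on a noindex tag, those URLs must stay crawlable.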
If you were affected by a true Panda penalty, it would be a manual penalty, and you would receive word inside your Google Webmaster Tools account. If you do not have an account, set one up:
https://www.google.com/webmasters/tools/home?hl=en
If you think you've been hit by an algorithmic penalty rather than a manual one, you can check by using the tools listed at this URL:
http://www.iacquire.com/blog/5-tools-to-help-you-identify-a-google-slap
Now, obviously, you're talking about duplicate content that it seems you may have known existed somewhere else. Maybe, and please don't take offense, you copied it?
In that case, Google attributes the content to the domain with the most authority.
So time.com could probably take your entire site's content, and you would be the one who looked like you stole it.
Remember, Google does also consider when the content was first indexed; however, site authority trumps that.
Google will only acknowledge a difference if you actually have a manual penalty. If you received a manual penalty, it would come with instructions on what to do next.
My advice: if you have duplicate content that is taken from another website and is not yours, please remove it; the second choice is to noindex that content.
It could be that you have the misfortune of somebody finding out that you took their content and filing a Digital Millennium Copyright Act (DMCA) takedown, which in many cases would damage a domain beyond repair. You would know if this had occurred as well. I'm just letting you know that it's not smart to have someone else's content on your site. You should write content uniquely to meet your end users' needs, and if the current content is very helpful to them, I recommend you use it as a starting point to create your own unique content, not spun, but genuinely unique.
Please know that if you tell me you didn't take the content, I will apologize right away; I do not mean to imply anything.
Respectfully,
Tom