Do Panda ranking factors still apply when Google deindexes a page?
-
Here are two scenarios.
Scenario 1
Let's say I have a site with a ton of pages (100,000+) that all contain off-site duplicate content, and let's say none of those pages carry a noindex meta tag.
Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty.
Since all of those pages are no longer indexed by Google, does the Panda penalty still apply even though they have been deindexed?
Scenario 2
I add a noindex meta tag to all of those 100,000+ off-site duplicate content pages. Since Google sees that I have decided not to index them, does the Panda penalty come off?
What I am getting at is this: I have realized that I have a ton of pages with off-site duplicate content. Even though those pages are already not indexed by Google, does simply adding the noindex meta tag to them tell Google that I am trying to get rid of the duplicate content, so that the Panda penalty is lifted?
The pages are useful to my users, so I need them to stay.
Since in both scenarios the pages end up out of the index anyway, will Google acknowledge the difference that I am removing them myself and lift the Panda penalty?
Hope this makes sense.
-
I have over 800,000 pages total that contain duplicate content, if that is in fact the issue with my definitions. I would assume Panda would slap me hard for that, again, if that is the issue. Since I have never tried to deindex this many pages, I am hoping this works; I will take a few coffee breaks while I wait, because it is going to be a while.
I have nothing to lose, and I feel like I have already tried a ton. Thanks so much.
-
"Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty."
Panda will not deindex pages. It might move them to the supplemental index, but they're not deindexed. Technically, Panda is not a penalty. It's an algorithmic demotion. If you've got a bunch of duplicate content, Google may choose not to index some of that content, or, more likely, to just show users the most appropriate page of that content.
Now, if Panda has affected your site because Google feels that the site consists mostly of duplicate or thin content, then you'll need to noindex, significantly change, or remove that content in order for Google to see that the quality has improved. You can't say that the content is essentially gone just because Google is not showing it; that wouldn't change the factors that caused you to be affected by Panda. (This is all assuming that duplicate content is the problem, because we don't know that.)
"I add a noindex meta tag to all of those 100,000+ off-site duplicate content pages. Since Google sees that I have decided not to index them, does the Panda penalty come off?"
If these pages were the primary reason for Panda to hit your site, then as Google recrawls the site it will start to recognize that the quality has improved. Then, at some point with a future Panda refresh (it may take several if there is a lot of content to crawl), you should see an increase in traffic. If the duplication was the only factor Panda was concerned about, you'd likely see a dramatic improvement. If it was just one of several factors, you might see a smaller improvement. If you had a lot of factors, you may see very little or only some improvement.
If I understand the question correctly, the answer is to go ahead and add the noindex tag to these pages.
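For reference, the noindex directive itself is either a robots meta tag in the page head (<meta name="robots" content="noindex">) or an X-Robots-Tag HTTP response header. When rolling it out across 100,000+ templated pages it is easy to miss a template, so a quick spot check helps. Below is a minimal Python sketch, using placeholder URLs rather than any real site structure, that fetches a few pages and reports whether each one actually serves a noindex directive in either place:

```python
import re
import urllib.request

def serves_noindex(url):
    """Fetch a URL and report whether it carries a noindex directive,
    either in an X-Robots-Tag response header or in a robots meta tag."""
    req = urllib.request.Request(url, headers={"User-Agent": "noindex-spot-check/1.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        header = resp.headers.get("X-Robots-Tag", "") or ""
        html = resp.read(200_000).decode("utf-8", errors="ignore").lower()
    # Crude but fine for a spot check: find meta tags and keep any that
    # mention both "robots" and "noindex".
    meta_tags = re.findall(r"<meta[^>]*>", html)
    in_meta = any("robots" in tag and "noindex" in tag for tag in meta_tags)
    return "noindex" in header.lower() or in_meta

# Hypothetical sample of templated pages to check.
for url in [
    "http://www.example.com/dictionary/word/testing/",
    "http://www.example.com/dictionary/word/example/",
]:
    try:
        status = "serves noindex" if serves_noindex(url) else "NO noindex found"
    except Exception as err:
        status = f"error: {err}"
    print(url, "->", status)
```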
-
No offense meant; I really want to figure out what the heck happened with my site, and I really feel like I was hit by unfortunate circumstances.
My website is http://www.freescrabbledictionary.com/
The duplicate content I am referring to is the word definitions, which I generate from an API provided by https://www.wordnik.com/.
I do cite the source at the bottom of every definition page (which was required by https://github.com/wordnik); an example is http://www.freescrabbledictionary.com/dictionary/word/testing/
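For context, generating a definition page boils down to a call roughly like the sketch below. The exact Wordnik v4 endpoint, parameters, and response field names here are written from memory, so treat them as assumptions and check the current API documentation; the attribution line at the end is the part their terms required me to show.

```python
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_WORDNIK_API_KEY"  # placeholder; a real key is required

def fetch_definitions(word, limit=3):
    """Fetch definitions for a word from the Wordnik v4 API.
    Endpoint and response field names are assumptions; verify against
    the current Wordnik documentation."""
    url = (
        "https://api.wordnik.com/v4/word.json/"
        + urllib.parse.quote(word)
        + "/definitions?"
        + urllib.parse.urlencode({"limit": limit, "api_key": API_KEY})
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Render each definition with the source attribution Wordnik requires.
for d in fetch_definitions("testing"):
    text = d.get("text", "")
    source = d.get("attributionText") or d.get("sourceDictionary") or "Wordnik"
    print(f"- {text} ({source})")
```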
I have never had a manual penalty from Google; I check Google Webmaster Tools all the time. I also use tools like Google Analytics, as well as moz.com, ahrefs.com, and monitorbacklinks.com.
I used to rank in the top 4 spots on average for the keyword "scrabble dictionary". For a long time I was #2, and that keyword drove my biggest share of traffic.
I remember that when the first Panda update came out I was not hit. I noticed the negative changes in my rankings after the second Panda update and so on. Since Penguin was in the mix as well, I can't even tell if I was hit by Penguin.
I never paid for links or did any black-hat backlinking.
Again, I was never hit with a manual penalty; this is 100% algorithmic.
If you search the keyword "scrabble dictionary", my homepage does not rank for it at all, not anywhere in the search results, whereas I used to hold one of the top 4 spots.
Since I have been hit so hard I have nothing to lose, so I have noindexed 100% of the word definition, sentence example, and quote pages. Even though those are not copied (except for the definitions), I did it just in case. That accounts for about 90% of my site's pages indexed by Google.
I have changed my site design to account for the "refresh" (freshness) ranking factor, I have desperately combed through my site 1,000,000 times trying to figure out what happened, and I have disavowed links tenfold, yet nothing seems to affect my rankings. At this point I will try anything... I have nothing to lose.
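For anyone following along, the disavow file I uploaded is just a plain-text list submitted through Google's disavow tool: one rule per line, lines starting with # are comments, domain: disavows an entire domain, and a bare URL disavows a single page. A made-up example:

```
# Disavow file example (made-up domains)
domain:spammy-directory.example
domain:link-farm.example
http://blog.example.net/low-quality-links-page/
```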
-
Can you describe what happened to your site and why you believe you got a penalty?
First, type site:www.example.com into Google; it will show you what is actually indexed.
Be certain that you do not have a robots.txt file (or something similar) blocking your website. Go to https://www.feedthebot.com/tools/, type your domain in, and it will tell you if you are blocking anything with your robots.txt. Do this for the URLs that you think are not indexed.
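If you would rather check this yourself than rely on a third-party tool, here is a minimal Python sketch (the domain and paths are placeholders) that reads a site's live robots.txt and reports whether Googlebot is allowed to crawl specific URLs:

```python
from urllib import robotparser

# Placeholder domain and paths; substitute the real site and the URLs
# you suspect are being blocked.
rp = robotparser.RobotFileParser("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt

for url in [
    "http://www.example.com/dictionary/word/testing/",
    "http://www.example.com/some-other-page/",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "-> crawlable" if allowed else "-> blocked by robots.txt")
```

Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still sit in the index, and a noindex tag can only be honored if Googlebot is allowed to crawl the page and see it.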
If you had received a true manual penalty (as opposed to an algorithmic demotion such as Panda), you would get a message inside your Google Webmaster Tools account. If you do not have an account, set one up:
https://www.google.com/webmasters/tools/home?hl=en
If you think you've been hit by an algorithmic penalty rather than a manual one, you can check by using the tools listed at this URL:
http://www.iacquire.com/blog/5-tools-to-help-you-identify-a-google-slap
Now, obviously, you're talking about duplicate content that you seem to have known existed somewhere else. Maybe, and please don't take offense, you copied it?
In that case, Google generally credits the content to the domain with the most authority.
So time.com could probably copy your entire site, and you would be the one who looked like you stole their content.
Remember, Google does also consider which version was indexed first; however, site authority tends to trump that.
Google will only explicitly acknowledge a change if you actually have a manual penalty; a manual penalty comes with instructions on what to do next.
My advice to you: if you have duplicate content that was taken from another website and is not yours, please remove it; as a second choice, noindex that content.
It could be that you have the misfortune of somebody finding out that you took their content and filing a DMCA (Digital Millennium Copyright Act) takedown, which in many cases would damage a domain beyond repair. You would know if that had occurred. I'm just letting you know it's not smart to have someone else's content on your site. You should write content uniquely to meet your end users' needs, and if the current content is very helpful to them, I recommend you use it as a starting point to create your own unique content, not spun, but genuinely unique.
Please know that if you tell me you didn't take the content, I will apologize right away. I do not mean to imply anything.
respectfully,
Tom