Google Penguin 2.0 - How To Recover?
-
Hi all,
Last year we engaged an SEO company that promised to get us onto the first page of Google. But after 4 months we discovered they had been using low-quality mass link building tactics, and the rankings of all 3 sites we gave them dropped overnight on 22nd May 2013, after Google Penguin 2.0 rolled out. Is there anything we can do to recover?
-
Exactly. Because they take this stuff quite seriously. And they're not just going to do a 10-second review if you've got 50,000 links, let alone take your word for it.
And since we're now in the age of "Google needs to teach people a lesson and create an atmosphere of deterrence", they no longer hesitate to take action when they believe it will be a better motivator.
-
Yeah, the worst thing you can do is remove 5 links, then go to Google and say "Hey guys, is that enough?", then 5 more links - "How about now, guys?", etc. You're wasting somebody's manual labor at Google, and believe me, it does piss them off.
-
And I've got a new client who had not received a manual penalty notice, yet lost rankings from Penguin 1. They did a disavow, then a reconsideration request, after cleaning up only a fraction of the mess first. A week later, they were manually penalized and got the dreaded notice.
This is why it's so important to be willing to do a real clean-up. Personally, I just don't see the overwhelming majority of sites being trusted enough as a brand (based on brand-like signals) to get away with doing things half-assed or in reverse order.
-
So, here's the problem - it depends on how big you are. I've seen companies use reconsideration as a back-channel in some cases where the penalty seemed algorithmic, and they were big enough for Google to communicate with them. I suspect it's not the "approved" method and it won't work for most of us.
What's irritating is that some Google reps have said that disavow is applicable to Penguin, but others have said that disavow doesn't work without reconsideration. So, if Penguin is algorithmic AND we're supposed to disavow links BUT disavow only works with reconsideration AND you can't use reconsideration for algorithmic penalties, then pardon my French, but WTF? Some piece of "official" information is wrong - we just don't know which one.
The picture from SEOs I've talked to over the last couple of years is much murkier than the official advice, as usual.
-
Interesting reply, Dr. Pete. I had not heard that reconsideration could be at all useful for Penguin. In this article (http://searchengineland.com/penguin-update-recovery-tips-advice-119650), Danny Sullivan said he was told by Google,
"Within Google Webmaster Central, there’s the ability to file a reconsideration request. However, Google says this is an algorithmic change — IE, it’s a penalty that’s applied automatically, rather than a human at Google spotting some spam and applying what’s called a manual penality.
Because of that, Google said that reconsideration requests won’t help with Penguin. I was told:
Because this is an algorithmic change, Google has no plans to make manual exceptions. Webmasters cannot ask for reconsideration of their site, but we’re happy to hear feedback about the change on our webmaster forum."
-
Good discussion here.
I'd like to echo Dr. Pete's point that we have not seen many credible cases of Penguin recovery. It has been several days since Penguin 2.0 and I have yet to see a single credible recovery. I really thought that with the advent of the disavow tool we would see a good number of recovery cases, but that has not happened as far as I can see. As such, I think anyone who tells you what you need to do in order to recover is just taking their best guess.
When the disavow tool came out, I had a few people give me some Penguin-hit domains. I disavowed a large number of domains and fully expected to see a boost in rankings after 2.0; instead, some of those sites dropped even further.
My gut instinct is that in order to recover, sites will need to remove a large number of unnatural links and then do a FANTASTIC job of attracting new links. The problem is that sites that previously ranked well on the power of spammy links probably weren't doing a great job of attracting links naturally. Plus, newly attracted links are not likely to be exact anchor text links, so ranking high for a particular keyword is going to be a challenge.
What I don't know is whether Penguin just devalues all of the spammy links or actually applies some type of negative ranking factor to them.
I have many questions, and no one I have seen so far really knows the answer to recovering from Penguin.
-
Well I originally wasn't going to comment anymore, but...
-
Karl: "Reconsideration request and the disavow tool DO work and we have used them on 2 clients with proof. It can take anything from 4-12 months for you to actually see the positive results, they do work" **-- Correlation does not equal causation. Waiting 4-12 months and then thinking that was the cause is pure guesswork. **
-
Dr. Pete: First of all, I enjoyed your write-up, and you seem to be giving more realistic advice on what can happen. One thing stands out in your comment: "Disavow can work, but Google needs to see a clear removal effort and it almost always has to be paired with reconsideration"
-- Reconsideration requests = a reconsideration of manual penalties = no change for algorithmic penalties
So of course it's possible that the disavow tool works, but it seems to be so rare that any time it does, a specific thread gets started somewhere about it.
- Dr. Pete: Creative 301s DO work - I have numerous sites built on just that. You are correct in saying that they do not work like they did 2 years ago. There need to be "padded" links to help counteract the bad ones and maximize trust, in my opinion. At best, you will actually see a long-lasting site without the penalty, not just a temporary uptick (although that is still possible, of course). I have done it multiple times; it's not theory.
Everything I have mentioned thus far is under the assumption that 2.0 is similar in nature to 1.0 and is just an extension of it.
Lastly, it should be obvious at this point that I like Grey Hat for some projects. I try not to just accept the information that is fed to the herd without testing it myself to see if it's true. Through testing I have found what works and what does not for my needs, and have also discovered that a lot of what they tell us is in fact just another way to deter what works. I have big rankings to back up everything I say.
-
Even Google's reps don't seem to agree on whether reconsideration works for Penguin, but I've seen a fair amount of evidence that disavow won't solve any problems without reconsideration, so I actually think you do have to file reconsideration in these cases.
"Creative" 301-redirects are very dangerous and do not work like they did 2+ years ago. At best, you'll see a temporary uptick and end up in a worse position down the road. I've even seen some folks suggesting (on limited evidence) that Penguin 2.0 clamped down harder on bad, redirected links. We've absolutely seen 301s carry penalties, both manual and algorithmic, over the past couple of years.
-
Just wrote up some data on Penguin 2.0:
http://www.seomoz.org/blog/penguin-2-were-you-jarred-and-or-jolted
I just want to add, though, that I'm not speculating about the new ranking factors yet, because we just don't have that information. No one has specifically recovered from Penguin 2.0, and I don't think anyone can tell you exactly what changed.
By the very fact that it's called "Penguin", though, I think it's safe to assume that these new factors are an extension of the old philosophy. I generally back Alan's procedure, because I've talked to reputable SEOs who have had success with it. That success often comes after a hard-fought battle, though. The number of Penguin 1.0 recovery stories that I can document is fairly small.
If you know for a fact you have bad links, you do need to try to remove them first. Disavow can work, but Google needs to see a clear removal effort and it almost always has to be paired with reconsideration, from what I'm seeing. Unfortunately, 2-3 Google reps have given us 2-3 stories on the process, so I'm going by what I've seen work for SEOs who I trust (who have shared details privately, in most cases).
-
Actually, I would agree with Alan. It would be best to try to get links removed first and then use disavow. As for the reconsideration requests, I am picking up on a great deal of cynicism about them. Maybe this is just a strange coincidence, but nowadays it seems that people always assume their loss in traffic is Penguin or Panda. I actually had a situation where a site lost a bunch of traffic in late April of last year. Of course no one thought it was a manual penalty, but in the end it was. After reviewing the information, we didn't believe the drop came from the algorithm changes but from a penalty. We had done very little work because we weren't really aware of any wrongdoing. Then we submitted for reconsideration, and 3 days later received notice that there had been a manual penalty and that it had been removed.
Maybe that was a poor example, but I do believe that many people try to connect every loss of traffic to Panda or Penguin.
-
100% in agreement with Alan here. Reconsideration requests and the disavow tool DO work - we have used them on 2 clients, with proof. It can take anywhere from 4-12 months to actually see the positive results, but they do work. Try to get the links removed first BEFORE using the disavow tool, because Google wants to see that you have made an effort to get them removed rather than just taking the easy way out!
It is true that you won't get responses from them all, especially from article websites where the webmaster rarely does anything on the site itself. That is when you use the disavow tool - just make sure that you are 100% certain that the links are doing your website harm.
Be honest, though: look at which links are spammy and do your utmost to get them removed first. It takes time and a lot of effort, but it will work... eventually!
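In case it helps anyone here, the disavow file itself is just a plain UTF-8 text file that you upload through Webmaster Tools: one entry per line, either a full URL or a whole domain prefixed with "domain:", with "#" lines treated as comments. A made-up example (all of these domains are placeholders, obviously):

```text
# Spammy article directories - emailed owners twice, no response
domain:cheap-article-directory.example
domain:bulk-link-network.example

# Individual pages we could not get taken down
http://spam-blog.example/posts/buy-widgets-cheap
http://forum-spam.example/profile/seocompany123
```

The "domain:" form is the safer bet for bulk platforms, since disavowing individual URLs leaves every other link from that domain still counting against you.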
-
Travis,
Please don't use this system to go on a political rant. If you personally have not had any positive results from something up to this point, it does not automatically mean that the "solution" is invalid, fake, or provided purely for conspiratorial reasons.
-
Google Best Practices = Propaganda to keep people poor.
The entire point of the spam team is to keep you from manipulating the rankings. They do this by any means necessary, including misleading propaganda.
Disavow tool = A tool for the Spam Team to gather information on platforms.
-
< sigh > and Travis is also not quite accurate. Disavow and Resubmit requests DO work when they're done properly.
-
Actually, the first recommendation you got in this answer thread is both backward and flawed and does not follow best practices. No offense to Brad, but it's just outright wrong.
The first step should be to clean up all the link mess, documenting the process - noting which sites were contacted and how they were contacted. Only after that is done should a disavow be submitted with all the links you couldn't get cleaned up.
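If you want a concrete way to keep that documentation, even a small script that appends each contact attempt to a CSV will do. A minimal sketch in Python - the file name, columns, and status values are just my own convention, nothing Google prescribes:

```python
import csv
from datetime import date
from pathlib import Path

# File name and columns are my own convention - Google doesn't prescribe a format.
LOG_FILE = Path("link_removal_log.csv")
FIELDS = ["date", "linking_domain", "linking_url", "contact_method",
          "contact_address", "status"]

def log_contact(linking_domain, linking_url, contact_method,
                contact_address, status):
    """Append one outreach attempt so the clean-up effort is documented."""
    write_header = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()  # first run only
        writer.writerow({
            "date": date.today().isoformat(),
            "linking_domain": linking_domain,
            "linking_url": linking_url,
            "contact_method": contact_method,
            "contact_address": contact_address,
            "status": status,  # e.g. "requested", "removed", "no response"
        })

# Example: record an email sent to a (made-up) spammy directory.
log_contact("spammy-directory.example",
            "http://spammy-directory.example/links/42",
            "email", "webmaster@spammy-directory.example", "requested")
```

When you later file the disavow (and a reconsideration request, if a manual penalty applies), this log is your evidence of a genuine removal effort.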
And a resubmission request should only be made if a manual penalty was assessed, not if it was an algorithmic penalty. So unless you got a manual penalty notice in Google Webmaster Tools, resubmission requests are not going to help.
-
Disagreeing here,
Following that advice will most likely not do anything except keep you in the dog house.
Let's go over it:
-
The disavow tool is complete rubbish and barely does anything (IF anything)
-
If your crappy SEO company is like most of the other crappy ones, they were simply building bulk links on platforms that can be posted to for free. No one who owns any of these sites is going to care, read a request, or even be able to act on it. A lot of these sites get x,xxx+ links/posts added to them daily. Your chances are slim to none, especially if there are a lot of links.
Asking for your links to be removed will only ever work on smaller blogs where someone actually cares about what gets posted. Most of those links you would probably want to keep anyway.
-
Don't bother with a resubmission request. Again, they are rubbish unless you have a squeaky clean link profile. More importantly, as Brad pointed out, Penguin is an algorithm update, NOT a manual penalty. Reconsideration requests only work with manual penalties. IMO, a reconsideration request will only get Google spam team employees' eyes on your website so they can actually see your spam. The chances of them coming to the site otherwise are one in a million.
-
Can't argue with the comment about adding good content.
How To Actually Recover
-
Hopefully you were being smart and not doing the link building to your home page.
-
There are all kinds of creative 301-redirects that can be done to possibly shake the penalty without losing all your link juice. You have to create proper buffer links on the new pages.
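The redirect mechanics themselves are trivial - the "creative" part is all in how you structure the targets and the buffer links. For reference only, a plain permanent redirect in an Apache .htaccess (assuming mod_alias is enabled; the paths and domain are placeholders):

```apache
# Permanently redirect a penalized page to a page on the new domain
Redirect 301 /old-money-page.html http://www.new-site.example/new-money-page.html
```

Others in this thread dispute whether the penalty follows that redirect, so treat this as mechanics only, not an endorsement of any particular scheme.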
-
In general, new pages that you create will not carry the penalty. Penguin is a page-by-page penalty, not a site-wide one. So if you start with new pages, you should be fine. It sounds like your link building company was crap anyway, so it shouldn't be hard to replicate the results of your old campaign.
-
If you want some hands-on advice, I can make you a case study in recovery if your site fits the right criteria. Message me your details if interested.
Cheers
-
-
Thank you so much for your tips!
I will surely be doing that, Brad!
-
My recommendation would be to do the following:
1. Disavow all the links that you believe came from this practice (a rough sketch for short-listing those links follows below).
2. After the disavow, contact all the sites and ask them to remove the links to your site.
3. Submit a resubmission request through Webmaster Tools. Penguin 2.0 is not a manual penalty, but in this case it would be good to alert Google that your site was hit hard - and you may also have a manual penalty. If this was in fact a manual penalty with strange timing rather than Penguin 2.0, that is worth fighting directly.
4. Change your strategy and start working on creating good content and earning good-quality links.
Good luck!
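As for short-listing the links in step 1: one rough approach is to pull your latest links export from Webmaster Tools and count how many times each referring domain links to you, since bulk link building tends to leave a heavy-repeater footprint. A quick Python sketch - the file name, column name, and threshold are my own assumptions, so adjust them to whatever your export actually looks like:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# All three values below are assumptions - match them to your actual export.
EXPORT_FILE = "links_export.csv"  # CSV with one linking URL per row
URL_COLUMN = "linking_url"        # name of the column holding the URL
THRESHOLD = 20                    # domains linking this many times get flagged

def suspicious_domains(path):
    """Count backlinks per referring domain and flag heavy repeaters."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row[URL_COLUMN]).netloc.lower()
            if domain:
                counts[domain] += 1
    # most_common() sorts by count, descending
    return [(d, n) for d, n in counts.most_common() if n >= THRESHOLD]

for domain, n in suspicious_domains(EXPORT_FILE):
    print(f"{domain}: {n} links - review before disavowing")
```

This only produces a shortlist - a human still needs to eyeball each domain before it goes anywhere near a disavow file.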