Google Penguin 2.1 Penalty - Recoverable?
-
Hello,
I have a client who was hit very hard by the Google Penguin 2.1 update. He mentioned that he did an intensive link analysis and removed all the bad links; however, there were a lot of them (around 6,000). His domain has a decent Domain Authority of 30/100.
I'm wondering whether it's worth trying to save his domain, or better to start fresh with a new one. Given the high number of links, I'm not 100% confident that all the bad links were taken care of, and I've heard that even if you remove the links, Google won't lift the penalty.
What would you do... get a new domain, or risk the next couple of months trying to save the existing one?
-
Glad to hear it, Chris, and welcome to the community!
-
AMAZING HELP!!! This is good stuff here. I am in the same boat, and already started taking these steps before I even read this. I will say this is all working for me.
-
THIS.
Until more people understand what disavow actually does, it's going to be rare that we hear good recovery stories. Disavow gets rid of the rotten wood - it doesn't build you new foundations.
So if the disavow is done correctly and thoroughly, go build a new foundation. You can do that on the current site, but yes, sometimes a new domain is the fastest and easiest way.
-
Beyond what Travis communicated, I'll add this:
If a site was "artificially" ranking because of bad links, and those links are removed, why would it somehow rank again once they're gone?
Think of it this way: You have a house built on a foundation made out of wood. Building codes require your foundation be built out of concrete. Take away the wood. Why would the house remain elevated at that point?
Without replacing the "artificial" signals with new, more trustworthy signals, a site isn't going to recover.
That raises the question: how does the site get those new signals? More questionable links? Not advisable. Better to spend that energy generating higher-quality reasons to rank.
And that brings us back to Travis' question about whether the cost and effort are worth it.
-
Have you looked at his disavow file? There might be a problem with it, and the disavow may never actually have gone through. I know it seems pretty basic, but I've solved a lot of problems just by asking a prospect whether they've checked their robots.txt configuration. I'm sure you understand what I'm talking about.
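For reference, a valid disavow file is just plain text, one entry per line: lines starting with # are comments, a domain: prefix disavows an entire domain, and a bare URL disavows a single page. Something like this (the domains below are made-up placeholders):

# Spammy directories identified in the link audit
domain:spammy-directory.example
domain:cheap-links.example
# Individual pages we could not get removed
http://blog.example/post-with-paid-link.html

If the file deviates from that format, Search Console may reject it or silently ignore entries, which would explain a "thorough" cleanup that never took effect.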
Perhaps you're right and not enough bad links were removed. That said, I've heard of recoveries taking up to a year even when the disavow was thorough, with no new links earned and no more old links removed in the meantime. Pull links from multiple sources and check again. I've manually plowed through thousands of links in a few days. Sometimes you just have to dig in, but it would be wiser to confirm the disavow was properly formatted and submitted first.
If it was properly formatted and submitted, then you have to make a decision based on the client's business. At what point does the time and money being poured in stop paying for itself? Could you rebuild that traffic and profit faster by starting over? It's risky either way, but if you can find that something went wrong in the first round, there's still a fighting chance.
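If you do end up re-auditing, here's a rough sketch of how you might merge the exports and build a candidate list in Python. It assumes each link source has been exported as a plain-text file of full URLs, one per line (the filenames are placeholders), and that the output still gets a manual review so the good domains are deleted before anything is uploaded:

from urllib.parse import urlparse

# Exported backlink lists, one full URL per line (filenames are placeholders).
SOURCES = ["search_console_links.txt", "moz_links.txt", "ahrefs_links.txt"]

def referring_domains(paths):
    """Collect the unique referring domains across all exported link lists."""
    domains = set()
    for path in paths:
        with open(path) as f:
            for line in f:
                url = line.strip()
                if not url or url.startswith("#"):
                    continue
                # Assumes full URLs with a scheme; lines without one are skipped.
                host = urlparse(url).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]  # collapse www/non-www into one entry
                if host:
                    domains.add(host)
    return sorted(domains)

def write_disavow(domains, out_path="disavow-candidates.txt"):
    """Write a candidate disavow file; good domains must be deleted by hand."""
    with open(out_path, "w") as f:
        f.write("# Candidate disavow list built from merged link exports\n")
        for domain in domains:
            f.write(f"domain:{domain}\n")

if __name__ == "__main__":
    candidates = referring_domains(SOURCES)
    write_disavow(candidates)
    print(f"{len(candidates)} referring domains written for review")

That turns days of manual list-wrangling into a deduped candidate file in minutes; the judgment calls about which domains actually stay in the disavow are still yours to make.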