Large-Scale Penguin Cleanup - How to prioritize?
-
We are conducting a large-scale Penguin cleanup / link-cleaning exercise across 50+ properties, most of which have been on the market for 10+ years. There is a lot of link data to sift through, and we are wondering how we should prioritize the effort.
So far we have collected backlink data for all properties from Ahrefs, Google Webmaster Tools, Majestic SEO and Open Site Explorer (OSE), and consolidated the data using home-grown tools.
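For context, the consolidation step can be as simple as normalizing URLs and deduplicating on (source, target) pairs. A minimal sketch, assuming each tool's export has already been reduced to a CSV with `source_url` and `target_url` columns (those column names are hypothetical, not any tool's actual export format):

```python
import csv
from urllib.parse import urlsplit

def normalize(url):
    """Lowercase the host, drop www. and trailing slashes so duplicates collapse."""
    parts = urlsplit(url.strip())
    host = parts.netloc.lower().removeprefix("www.")
    return f"{host}{parts.path.rstrip('/')}"

def consolidate(csv_files):
    """Merge backlink exports, deduplicating on (source, target) pairs.

    Returns a dict mapping each unique link to the set of files that reported it.
    """
    seen = {}
    for path in csv_files:
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                key = (normalize(row["source_url"]), normalize(row["target_url"]))
                seen.setdefault(key, set()).add(path)
    return seen
```

Tracking which tools reported each link is useful later, since no single source sees the whole profile.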
As a next step we are obviously going through the link-cleaning process. We are interested in feedback on how we plan to prioritize the link-removal work. Put another way, we want to vet whether the community agrees with what we consider the most harmful types of links for Penguin.
- Priority 1: Clean up site-wide links with money-words; if possible keep a single-page link
- Priority 2: Clean up or rename all links whose anchor text is a money keyword appearing in the top 10 of the anchor-text distribution
- Priority 3: Clean up no-brand sitewide links; if possible keep a single-page link
- Priority 4: Clean up low-quality links (other niche or no link juice)
- Priority 5: Clean up multiple links from same IP C class
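If it helps to mechanize the triage, the five tiers above could be expressed as a simple classifier. A sketch only, with hypothetical field names and a placeholder money-term list (the answer below questions whether these tiers are the right ones at all):

```python
# Hypothetical link-record fields: anchor, is_sitewide, contains_brand, quality.
MONEY_TERMS = {"cheap", "buy", "discount"}  # placeholder money keywords

def priority(link):
    """Map a link record to the question's priority tiers (1 = most urgent)."""
    money_anchor = any(t in link["anchor"].lower() for t in MONEY_TERMS)
    if link["is_sitewide"] and money_anchor:
        return 1  # site-wide link with money anchor
    if money_anchor:
        return 2  # money-keyword anchor
    if link["is_sitewide"] and not link["contains_brand"]:
        return 3  # non-brand site-wide link
    if link["quality"] == "low":
        return 4  # low-quality / off-niche link
    return 5  # everything else, e.g. same-C-class duplicates
```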
Does this sound like a reasonable approach? Would you prioritize this list differently?
Thank you for any feedback /T
-
Your data sources are good (Ahrefs, GWT, OSE & Majestic), but I recommend including Bing Webmaster Tools as well. The data is free and you will find at least some links not shown in the other sources.
The link prioritization you shared is absolutely incorrect.
"Priority 1: Clean up site-wide links with money-words; if possible keep a single-page link"
While it is true that site-wide links are commonly manipulative, removing the site-wide link and keeping a single one does not necessarily make it less manipulative. You have only removed one of the elements often used to identify manipulative links.
"Priority 2: Clean up or rename all money keyword links for money keywords in the top 10 anchor link name distribution"
A manipulative link is still manipulative regardless of the anchor text used. Back in April 2012, Google used anchor text as a means to identify manipulative links. That was over 18 months ago, and Google's link identification process has evolved substantially since that time.
"Priority 3: Clean up no-brand sitewide links; if possible keep a single-page link"
Same response as #1 & 2
"Priority 4: Clean up low-quality links (other niche or no link juice)"
See below
"Priority 5: Clean up multiple links from same IP C class"
The IP address should not be given any consideration whatsoever. You are using a concept that had validity years ago and is completely outdated.
bonegear.net IP address 66.7.211.83
vitopian.com IP address 64.37.49.163
There are no commonalities between the above two IP addresses, be it C block or otherwise, yet they are both hosted on the same server.
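To illustrate why the check is meaningless, the old "C class" test compares only the first three octets of an IPv4 address, which tells you nothing about shared hosting:

```python
def same_c_class(ip_a, ip_b):
    """True if two IPv4 addresses share their first three octets (a /24 block)."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

# The two domains above sit on the same server, yet the check sees no relation.
print(same_c_class("66.7.211.83", "64.37.49.163"))  # False
# Conversely, two addresses in one /24 may belong to completely unrelated sites.
print(same_c_class("66.7.211.83", "66.7.211.90"))   # True
```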
You have identified the issue affecting your site (Step 1) and collected a solid list of your backlinks using multiple sources (Step 2). The backlink report is an excellent step which places you well above most site owners and SEOs in your situation.
Step 3 - Identify links from every linking domain.
a. Have an experienced, knowledgeable human visit each and every linking domain. Yes, that is a lot of work but it is what's necessary if you are going to accurately identify all of the manipulative links. Prior to beginning this step, be absolutely sure the person can accurately identify manipulative links with AT LEAST 95% accuracy, although 100% is strongly desired.
b. Document the effort. I have had 3 clients who approached me with a Penguin issue, we confirmed there was not any manual action in place at the time we began the clean up process, but before we finished the sites incurred a manual penalty. Solid documentation of the clean up effort is required by Google in case the Penguin issue morphs into a manual penalty. Also, it just makes sense. You mentioned 50+ web properties so clearly others will be performing these tasks.
c. Audit the effort. A wise former boss once stated "You must inspect what you expect". Unless you carefully audit the work, the process will fail. Evaluators will mis-identify links. You will lose some quality links and manipulative links will be missed as well.
d. While you are on the site, capture the manipulative site's e-mail address and contact form URL (if any). This information is helpful when contacting site owners to request link removal.
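One way to enforce the 95% accuracy bar before the review (step a) and during the audit (step c) is to score each evaluator against a hand-verified "gold" sample of domains. A minimal sketch, assuming both inputs are dicts of domain → is-manipulative judgments (the domain names are invented):

```python
def evaluator_accuracy(gold, judged):
    """Fraction of gold-sample domains the evaluator classified correctly."""
    checked = [d for d in gold if d in judged]
    if not checked:
        return 0.0
    correct = sum(1 for d in checked if gold[d] == judged[d])
    return correct / len(checked)

gold = {"spamdir.example": True, "news.example": False, "blogfarm.example": True}
judged = {"spamdir.example": True, "news.example": True, "blogfarm.example": True}
# 2 of 3 correct — well below the 95% bar, so this evaluator needs re-training.
assert evaluator_accuracy(gold, judged) == 2 / 3
```

An evaluator who marks everything manipulative will destroy quality links; one who marks nothing will leave the penalty in place, which is why both error directions count against the score.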
Step 4 - Conduct a Webmaster Outreach Campaign. Each manipulative domain needs to be contacted in a comprehensive manner. In my experience, most SEOs and site owners do not put in the required level of effort.
a. Send a professional request to the site's WHOIS e-mail address.
b. After 3 business days if no response is received, send the same letter to the site's e-mail address found on the website.
c. After another 3 business days, if no response is received, submit the e-mail via the site's contact form. Take a screenshot of the submission (no documentation is strictly required for Penguin, since it is algorithmic, but it is helpful for the process).
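The three-touch schedule above is easy to track per domain. Rmoov (mentioned below) manages this for you, but the underlying logic is just a business-day calculation; a sketch:

```python
from datetime import date, timedelta

# Ordered outreach touches, three business days apart.
TOUCHES = ["whois_email", "site_email", "contact_form"]

def add_business_days(start, days):
    """Advance a date by N business days (Mon-Fri)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 = Monday-Friday
            days -= 1
    return current

def next_touch(last_touch_index, last_touch_date, today):
    """Return the next touch to send, or None if we must still wait or are done."""
    if last_touch_index >= len(TOUCHES) - 1:
        return None  # all three touches sent; move on to "extra measures"
    due = add_business_days(last_touch_date, 3)
    return TOUCHES[last_touch_index + 1] if today >= due else None
```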
All of the manipulative link penalties (Penguin and manual) I have worked with have been cleaned up manually. With that said, we use Rmoov to manage the Webmaster Outreach process. It sends and maintains a copy of every e-mail sent. It even has a place to add the Contact Form URL. A big time saver.
If a site owner responds and removes the link, that's great. CHECK IT! If there are only a few links, manually confirm link removal. If there are many URLs, use Screaming Frog or another tool to confirm link removal.
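If you would rather script the spot-check than run a full crawl, the core test is simply "does this page still contain an anchor pointing at my domain". A sketch using only the standard library (fetching the page is left out; feed it the downloaded HTML):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect href targets from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs += [v for k, v in attrs if k == "href" and v]

def still_links_to(page_html, my_domain):
    """True if the page still contains a link whose href mentions my_domain."""
    finder = LinkFinder()
    finder.feed(page_html)
    return any(my_domain in href for href in finder.hrefs)
```

Remember that some sites "remove" a link by nofollowing it or hiding it behind a redirect, so a negative result from a naive check still deserves a human glance.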
If a site owner refuses or requests money, you can often achieve link removal by having further respectful conversations.
If a site owner does not respond, you can use "extra measures". Call the phone number listed in WHOIS. Send a physical letter to the WHOIS address. Reach out to them on social media sites. Is it a .com domain with missing WHOIS information? You can report them to InterNIC. Is it a spammy wordpress.com or blogspot site? You can report that as well.
When Matt Cutts introduced the Disavow Tool, he clearly said "...at the point where you have written to as many people as you can, multiple times, you have really tried hard to get in touch and you have only been able to get a fraction of those links down and there is still a small fraction of those links left, that's where you can use our Disavow Tool".
The above process satisfies that requirement. In my experience, not much less than the above process meets that need. The overwhelming majority of those tackling these penalties try to perform the minimal amount of work possible, which is why forums are flooded with complaints about numerous attempts to remove manipulative link penalties and failing.
Upon completion of the above, THEN upload a Disavow list of the links you could not remove after every reasonable human effort. In my experience you should have removed at least 20% of the linking DOMAINS (with rare exceptions).
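The disavow file itself is just a UTF-8 text file: lines beginning with # are comments, domain: entries disavow every link from that domain, and bare URLs disavow a single page. For example (the domains are placeholders; the comments are also a good place to keep your outreach notes):

```text
# Site refused removal after three contact attempts, 2013-11-04
domain:spamdirectory.example
# Single page we could not get taken down
http://blogfarm.example/widgets-post.html
```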
It can take up to 60 days thereafter, but if you truly cleaned up the links in a quality manner, then the Penguin issues should be fully resolved.
The top factors in determining whether you succeed or fail are:
1. Your determination to follow the above process thoroughly
2. The experience, training and focus of your team
You can resolve the issue in one round of effort and have the Penguin issue resolved within a few months....or you can be one of those site owners who thinks it is impossible and be struggling with the same issue a year later. If you are not 100% committed, RUN AWAY. By that I mean change domain names and start over.
Good Luck.
TLDR - Don't try to fool Google. Anchor text and site wide links are part of the MECHANISM used to identify manipulative links. Don't confuse the mechanism with the message. Google's clear message: EARN links, don't "build" links. Polishing up the old manipulative links is a complete waste of your time. AT BEST, you will enjoy limited success for a period of time until Google catches up. Many site owners and SEOs have already been there, and it is a painful process.
-
When you say "clean up" do you mean removing the links or disavowing them?
You will never be able to get them all removed, so in the end you will need to do a Disavow anyway. If your time frame is short, you may want to make Priority One doing a Disavow for each of the 50+ sites you are working with. Then you can proceed with attempting to get the links removed. I have not heard that there is any downside to having a link removed that already appears in your disavow file...
As for the order of the Priorities, you may want to shuffle them a bit depending on the different situations on the different websites. I suggest you read this Moz Blog article called It's Penguin-Hunting Season: How to Be the Predator and Not the Prey
...and then test a few of your sub-pages that used to rank well with the program used in this article, which is called the Penguin Analysis Tool. I say sub-page because it needs a single keyword phrase you want to rank that particular page for so it can do the anchor text analysis. And that works better on focused sub-pages than on general homepages. $10 per website will let you fully evaluate two typical pages on each site and see which facet of the link profile is most valuable to attack first.
-
Have you read the post at http://moz.com/blog/ultimate-guide-to-google-penalty-removal? Matt Cutts even called it out on Twitter as a good post. That's where I'd first look for ideas.