Really, is there much difference between an unnatural links warning and Penguin?
-
We know that the unnatural links warnings are manual and that Penguin is algorithmic. (I'm not talking about the latest round of confusing unnatural links warnings, but the ones sent out months ago that eventually resulted in a loss of rankings for those who didn't clean their link profiles up.)
Is there much difference in the recovery process for either? From what I can see, both are about unnatural/spammy linking to your site. The only difference I can see is that once you feel you've cleaned up after getting an unnatural links warning, you can file a reconsideration request. But if you've cleaned up after a Penguin hit, you need to wait for the next Penguin refresh to see if you've recovered.
Are there other differences that I am not getting?
-
Thank you.
-
Yes, I would say so.
-
Thanks Ruth, so would you agree that the cleanup is the same? Whether you had a manual warning, or you got hit with Penguin, the way you would recover is the same (other than filing for reconsideration request with the former)?
-
The main difference between the two is that a reconsideration request is more likely to work with a link warning than with a regular Penguin hit. Penguin is algorithmic, whereas the link warnings were usually triggered by/resulted in manual penalties. Either way, it's a good idea to try to get as many spammy links removed/updated as possible, as well as build some new, non-spam links to increase the percentage of your links that are not spammy.
I wouldn't suggest building more spammy links to drown out the Penguin-targeted links - why not spend that time and effort building natural links instead? They will last longer, and if you do have to file a reconsideration request, you're not running the risk that Google will also see your brand-new spam links.
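The "percentage of your links that are not spammy" point above is simple arithmetic, and a quick sketch (with purely hypothetical numbers - any real audit would come from a link-research tool export) shows why removing bad links and earning clean ones both move the ratio:

```python
def spam_ratio(spammy, clean):
    """Fraction of a link profile that is spammy."""
    total = spammy + clean
    return spammy / total if total else 0.0

# Hypothetical profile before cleanup: 600 spammy links, 400 clean ones.
before = spam_ratio(600, 400)                  # 0.60

# After removing 250 spammy links and earning 100 natural ones,
# the same profile is 350 spammy out of 850 total.
after = spam_ratio(600 - 250, 400 + 100)       # ~0.41

print(f"spam ratio before: {before:.0%}, after: {after:.0%}")
```

Both levers matter: removals shrink the numerator, while new natural links grow the denominator.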
-
Haha, been hit with a penalty because of spammy links? Spam more links to your site, that will fix everything! Crazy.
-
Is there much difference in the recovery process for either [Penguin or manual link penalty]?
Theoretically no, practically yes.
A manual penalty will be reviewed by the Google Spam Team. If you are not successful at removing the links, you will need to provide extensive documentation of the steps you took to get them removed. When Google manually reviews links, they will not lift the penalty simply because you adjusted anchor text. If a link is spammy, it needs to be removed regardless of its anchor text.
A Penguin penalty can be algorithmically removed. Many SEO companies are simply manipulating the anchor text rather than removing the spammy links, and they are getting away with it to at least some degree... for now. Another tactic is to "drown out" the links penalized by Penguin with other spammy links which do not use anchor text. These solutions are quite bad, as these sites are subject to future penalties as Google improves its algorithms.
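As a rough sketch of the removal documentation mentioned above - the format here is an assumption (Google doesn't mandate one); what matters is recording which links you tried to remove, how many times, and the outcome, so the Spam Team can see a genuine effort rather than anchor-text shuffling:

```python
from collections import Counter

# Hypothetical outreach log for a reconsideration request: each entry
# records a spammy link, the number of removal requests sent, and the
# outcome of that outreach. All URLs are made-up examples.
outreach = [
    ("http://spammy-directory.example/site", 3, "removed"),
    ("http://article-farm.example/post-42",  2, "no response"),
    ("http://blog-network.example/guest",    1, "removed"),
    ("http://paid-links.example/footer",     2, "refused"),
]

# Summarize outcomes for the write-up that accompanies the request.
outcomes = Counter(outcome for _, _, outcome in outreach)
removed = outcomes["removed"]
print(f"{removed}/{len(outreach)} links removed; outcomes: {dict(outcomes)}")
```

Even for links you couldn't get taken down, a log like this demonstrates the attempt, which is exactly the evidence a manual review asks for.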
Related Questions
-
Reasonable to Ask URL of Link from SEO Providing New Links before Link Activation?
My firm has hired an SEO to create links to our site. We asked the SEO to provide a list of domains that they are targeting for potential links. The SEO did not agree to this request on the grounds that the list is their unique intellectual property. Alternatively I asked the SEO to provide the URL that will be linking to our site before the link is activated. The SEO did not agree to this. However, they did say we could provide comments afterwards so they could tweak their efforts when the next 4-5 links are obtained next month. The SEO is adamant that the links will not be spam. For whatever it is worth the SEO was highly recommended. I am an end user; the owner and operator of a commercial real estate site, not an SEO or marketing professional. Is this protectiveness over process and data typical of link building providers? I want to be fair with the provider and hope I will be working with them a long time, however I want to ensure I receive high quality links. Should I be concerned? Thanks,
Intermediate & Advanced SEO | Kingalan1
Does adding more outgoing links on a high PA page decrease the juice passed to previous links?
Hi, I'm not sure exactly how PA/DA work when the goal is to create backlinks that pass the most PA/DA juice (if there is such a thing) to one's money site. For example, let's say you have a blog with PA 40 and DA 30. Say I create a backlink on the homepage of this blog pointing to a site I want to rank better, and there are only 1-3 outgoing links in this post, which is on the homepage. Then say, a month later, I want to add another post to the homepage (so the PA 40 and DA 30 stay the same) with a backlink to another of my money sites. Does adding this second round of backlinks result in sending less juice to the first? This is what I want to know. Thank you!
Intermediate & Advanced SEO | z8YX9F80
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate - overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment / overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity, whatever that capacity is. Questions to enterprise SEOs:
* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. This indicates there is some upper limit - which we perhaps haven't reached - but which would stabilize once reached.
* We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks
Intermediate & Advanced SEO | lzhao
Penguin Apply To Internal Linking?
Is Penguin focused primarily on backlinks, or does it also assess internal linking/anchor text? We've lost about 3,000 visitors a month since the rolling updates were implemented. I'm always careful not to over-react to algo updates, but enough time has passed that I think the dust has settled. I try to stay white-hat in all I do, but if I've overdone anything, it's the internal linking of related products/categories with exact-match anchor text. My backlink profile also has an over-abundance of affiliate links, but that's kind of out of my hands, isn't it?
Intermediate & Advanced SEO | AWCthreads
Linking across categories
On a website, when I link across pages in the same category, should all the category links appear on each page? Let's say I have 6 categories and 6 pages. Should I have all 6 links on all the pages (such as A, B, C, D, E on page 1, where page 1 is page F, then on page A have links B, C, D, E, F, and so on for the 6 pages, meaning all the links appear on all the pages across the category)? Or should I just have, say, 3 links on page 1 (links A, B, C), then links D, E, F on page 2, then A, E, F on page 3, links B, C, F on page 4, and so on (which means I vary the links that appear, and that it is naturally, at least I think, going to boost the link that appears most often across the 6 pages)? I hope this is not too confusing. Thank you,
Intermediate & Advanced SEO | seoanalytics
How much (%) of the content of a page is considered too much duplication?
Google is not fond of duplication, I have been very kindly told. So how much would you suggest is too much?
Intermediate & Advanced SEO | simonberenyi
HELP - got the following message - Google Webmaster Tools notice of detected unnatural links
Hi All, While trying to grow we used several freelancers and small companies for guest blogging, article submissions, etc. We lost about 90% of traffic from our peak in December. We don't know if it is related, but we got the following message last week:
"Google Webmaster Tools notice of detected unnatural links to www.domain.com" Is it related (getting this message after two months of losing traffic)? What to do? (P.S. We fired most of the companies we used months ago, since we noticed they used bad methods. We didn't believe it could hurt us; we just thought it would be useless...) Please help...
Intermediate & Advanced SEO | BeytzNet
Value of Newspaper Comment Links
Although most newspaper comment sections are a no-follow zone, I have noticed that some comments I have posted with links end up being followed. The comments are participatory, and the links are relevant and even add to the conversation. My theory is that some comments are monitored, and if the editors are looking to encourage discussion and don't feel like you're spamming, why not take the nofollow off? I do plan on doing some testing with poor, spammy comments on the same papers, but am encouraged and would like to know what other people have found.
Intermediate & Advanced SEO | phogan