Google Reconsideration - Denied for the Third Time
-
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started.
Since then we have been manually removing links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines."
So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website. We decided to just delete those pages, serve a 404 error in their place, and create new pages at new URLs. At first everything was looking good: the new pages were ranking and gaining page authority, and the old pages were gone from the indexes. So we submitted for reconsideration a third time, and we got the exact same response!
I don't know what else to do. I've done everything I could think of, with the exception of deleting the whole site.
Any advice would be greatly appreciated.
Regards - Kyle
-
Kyle, interesting... I thought that wasn't possible. Do you have a lot of high-quality backlinks left, or did you start building new ones?
-
Hi Traian - I would have to disagree.
With the advice mentioned above from both Cyrus and Marie, I was able to get the penalty lifted, and whether it was normal or not, my rankings and traffic have bounced back to exactly where they were, if not higher than before the penalty.
-
When you remove those backlinks, you don't get to rank as you did before the penalty. Those bad links were keeping you up, and now you need to work your way up again, only this time way, way more carefully than before...
-
Good job Kyle! I would take things one step further though. It is likely not enough to just disavow the domains. If you haven't already done so, make sure that you make efforts to manually get these links removed and then communicate this to Google.
-
Wow, thank you so much for all of your help, guys!
I just spent all day digging through the full link profile (combined from GWT and OSE), and I have more than doubled the number of domains in my disavow list. I have been surprised by the number of domains that have simply expired over the course of this project.
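The merge-and-dedupe step described above can be sketched in a short script. This is a rough sketch only: the CSV column names passed in are assumptions about the GWT and OSE export formats, so adjust them to match your actual files.

```python
# Hypothetical sketch: combine backlink CSV exports (e.g. from Google
# Webmaster Tools and Open Site Explorer) into one deduplicated disavow file.
import csv
from urllib.parse import urlparse

def extract_domains(csv_path, url_column):
    """Pull the host out of each backlink URL in a CSV export."""
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(url_column, "").strip()
            if not url:
                continue
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return domains

def write_disavow(domains, out_path="disavow.txt"):
    """Write one domain: line per host, UTF-8, as the disavow tool expects."""
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("# Domains compiled from combined backlink exports\n")
        for domain in sorted(domains):
            f.write(f"domain:{domain}\n")
```

Usage would be something like `write_disavow(extract_domains("gwt_links.csv", "Links") | extract_domains("ose_links.csv", "URL"))`, after which you still review the resulting list by hand before disavowing anything.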
Also, thank you for the pointer on hosting the process files on Google Docs for the reconsideration request; I was wondering where I would keep that!
I'll keep you all up to date, and I'd be glad to contribute to a blog post if you guys would like, too.
Thanks again - Kyle
-
Hi Kyle,
First of all, I can't wait for Marie's book! I don't want to recover any ground that she's already gone over, so I'll just share a few thoughts.
1. Has Google verified it's a link penalty? "Site violates Google's quality guidelines" could also refer to on-site issues like hidden text or doorway pages. Given the information you provided, it's most likely a link based penalty, but you never know.
2. I'm not sure from your description, but I almost always disavow entire domains using the domain: command instead of individual URLs. I've seen requests rejected because the site owner didn't disavow enough URLs when they should have blocked the entire domain.
3. I agree with Marie. If you've been penalized, it's generally safer to err on the side of disavowing too many domains than too few. This isn't to say you should disavow known good links, but if links are questionable, why take a chance?
4. Also agree with Marie on submitting documentation about your removal efforts. Wrote a post about it here: http://cyrusshepard.com/penalty-lifted/
(they tend to like everything in Google Docs files. Cuts down the risk of spam)
5. Minor point, but Google likes everything formatted in a UTF-8 encoded .txt file. I've never seen one rejected because of this, but I hear it happens.
6. I'm turning into a fan of Link Detox for toxic link discovery. Instead of running it in standard mode, upload a file of your complete backlink profile from Webmaster Tools and have Link Detox check those links. Sort the final list by hand - this means check each link! For hints, read Paddy Moogan's post about low quality links: http://www.stateofsearch.com/step-by-step-guide-finding-low-quality-links/
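Along the lines of points 2 and 5, a minimal disavow file might look like this (the domains and URL below are made up for illustration; comments start with #):

```
# Removal requested via email 2013-03-12, no response from webmaster
domain:spammydirectory.example
domain:linkfarm.example

# Single bad URL; the rest of the domain looks legitimate
http://blog.example/old-sponsored-post
```

Save it as a plain UTF-8 .txt file before uploading it to the disavow tool.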
Damn, we should turn this into a blog post!
Hope this helps!
-
You know what, Cyrus? I kid you not... this afternoon I suddenly had the thought that I should send you a copy once it's finished. I know you have been involved in unnatural link cleanups. I'll be in touch!
-
Hi Marie,
Please let us all know when you finish the book!
-
Hey Marie - thanks for the details, I will let you know what else I find!
-
Yes, definitely. These need to be removed if possible and disavowed.
-
I generally create a Google Doc spreadsheet with my links and then have columns where I enter email addresses found on the site, WHOIS addresses, and the URL of the contact form. Then, I have columns next to those for reference numbers. Those reference numbers refer to a separate document in which I include the original source code of each email sent, as well as a document with screenshots of contact forms submitted. It's a pain to do all of this, but I have been successful in every single attempt at reconsideration using this method.
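For illustration only, a few rows of such a tracking spreadsheet might look like this (every address and reference number here is invented):

```
Link URL                       | Email on site          | WHOIS email          | Contact form URL                | Email ref | Form ref
http://linkfarm.example/page1  | admin@linkfarm.example | (privacy-protected)  | http://linkfarm.example/contact | E-014     | F-003
http://spamdir.example/widgets | (none found)           | host@spamdir.example | (no form)                       | E-015     | -
```

The reference columns point into the separate documents of sent-email source code and contact-form screenshots.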
If you're interested, I am 95% finished writing a book on the process that I use to get rid of unnatural links penalties. You can contact me via my profile and I can send you my almost finished book at a discounted price. It does include a link to an example of the spreadsheet that I use.
-
Do you think this type of microblogging URL would be considered spam?
http://olcine.com/index.php/steffcolbere
Should I disavow these sites as well?
-
Marie, thanks for the tip on the total number of disavowed domains; I am currently in the process of downloading my link profiles from OSE and GWT to look over them again.
As for communicating, I haven't submitted a document saying who I have contacted. How would you suggest documenting that? Do you have an example document to share?
-
Nick, what did your request look like when you got approved? Any specifics you can share?
-
Thanks for the pointers on the number of links in GWT; I will dig in deeper and look at the trends over the last few months. As for the reconsideration request, do you have any examples of what people submitted that got approved?
-
Hi Kyle. The process is frustrating, isn't it?
I have a few thoughts for you. You mentioned that you disavowed over 40 different domains. That doesn't sound like many. Many sites that I have worked on have had hundreds and hundreds of domains that needed disavowing. It's possible that you haven't identified enough links as unnatural. In other words, it may be that Google wants you to address some links that you think could be natural but actually do go against the quality guidelines.
I've also seen sites fail at reconsideration because the disavow file was not properly formatted.
How well did you communicate your attempts at link removal to Google? If you have contacted webmasters and failed to get links removed then you need to document that well to Google.
-
I also agree with highland, you have to submit a file that shows all your work.
-
We recovered from a manual link spam penalty about 2 weeks ago, after 5 months of cleaning up. I cleaned up about 85% of the links and submitted a reconsideration request, and 3 days later I got a message that said the MANUAL penalty was revoked.
Even though it's been 2 weeks, we still have not seen any improvement in the rankings. We are now working on getting quality links.
I felt the same way back in January but kept at it. So clean up more if you can and give it another shot.
Good luck. I feel your pain.
Nick
-
Make it a point to check Google WMT to see if the number of external links is declining. If the number is rising or staying constant, I would check the disavow file to make sure you are indeed capturing all the spam domains.
I have found that a great reinclusion request can do the trick. The request should note what your wrongdoings were, what your remediation was, the time spent, and your percentage of success. You should also apologize and promise to be good.
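To give a sense of the structure (all the numbers and details below are invented for illustration), such a request might be outlined as:

```
1. What went wrong: hired an SEO firm in 2011 that built paid blog-network links.
2. Remediation: contacted 212 webmasters by email and contact form (documented in
   the attached Google Doc); roughly 60% of the links were removed, and the rest
   were disavowed at the domain level.
3. Time spent: about 5 months of ongoing cleanup.
4. Apology: we understand these links violate the quality guidelines, and we will
   not acquire links this way again.
```

Keep the supporting evidence in a linked Google Doc rather than pasting everything into the request itself.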
-
Hmm. Well, the only other thing I could recommend would be reading this post on the matter. To summarize a bit, Google wants to see what you've done to fix the problem. Document what you've done and plead your case.
-
Highland, yes, we have utilized it quite heavily, submitting over 40 different FULL domains, not just URLs.
-
Have you tried the disavow links tool? I know many people who have fought manual penalties and they have expressed that it's invaluable in getting rid of them.
-
I just don't think that is the right move. We still hold rankings for other pages; it just seems to be keyword/page specific somehow.
-
I know this may not be what you want to hear, but it might make sense to start over: new domain and new website. Getting completely rid of the old site is a hard but necessary move.