Google Reconsideration - Denied for the Third Time
-
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received the penalty in February 2012, before I started.
Since then we have been manually removing links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines."
So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website. We decided to delete those pages, serve a 404 in their place, and create new pages with new URLs. At first everything was looking good: the new pages were ranking and gaining page authority, and the old pages were gone from the indexes. So we submitted for reconsideration a third time and got the exact same response!
I don't know what else to do. I have done everything I could think of, with the exception of deleting the whole site.
Any advice would be greatly appreciated.
Regards - Kyle
-
Kyle, interesting... I thought that wasn't possible. Do you have a lot of high-quality backlinks left, or did you start building new ones?
-
Hi Traian - I would have to disagree.
With the advice mentioned above from both Cyrus and Marie, I was able to get the penalty lifted, and whether it was typical or not, my rankings and traffic have bounced back to exactly where they were before the penalty, if not higher.
-
When you removed those backlinks, you gave up the ability to rank as you did before the penalty. Those bad links were keeping you up, and now you need to work your way up again, only this time way, way more carefully than before...
-
Good job Kyle! I would take things one step further though. It is likely not enough to just disavow the domains. If you haven't already done so, make sure that you make efforts to manually get these links removed and then communicate this to Google.
-
Wow, thank you all so much for your help!
I just spent all day digging through the full link profile (combined from GWT and OSE), and I have more than doubled the number of domains in my disavow list. I have been surprised by how many domains have simply expired over the course of this project.
Also, thank you for the pointer on hosting the process files on Google Docs for the reconsideration request; I was wondering where I would keep those!
I'll keep you all up to date, and I'd be happy to contribute to a blog post if you guys would like, too.
Thanks again - Kyle
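For anyone curious, combining the two exports and deduping down to domains boils down to something like this. This is just a rough Python sketch, not any particular tool; the simple hostname normalization (lowercasing, stripping "www.") is an assumption, and the URLs are made up:

```python
# Hypothetical sketch: merge backlink URL lists (e.g. exported from GWT
# and OSE), reduce them to unique linking domains, and emit one
# "domain:" disavow line per domain.
from urllib.parse import urlparse


def unique_domains(urls):
    """Return the sorted set of hostnames found in a list of backlink URLs."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # treat www.foo.example and foo.example as one
        if host:
            domains.add(host)
    return sorted(domains)


def disavow_lines(urls):
    """Format one 'domain:' line per unique linking domain."""
    return ["domain:" + d for d in unique_domains(urls)]


# Example with a couple of made-up backlink URLs:
links = [
    "http://www.spamdir.example/page1",
    "http://spamdir.example/page2",
    "http://blognetwork.example/post/123",
]
print("\n".join(disavow_lines(links)))
```

The point is simply that the same domain usually shows up many times across both exports, so deduping first keeps the disavow file short and readable.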
-
Hi Kyle,
First of all, I can't wait for Marie's book! I don't want to retread ground that she's already covered, so I'll just share a few thoughts.
1. Has Google verified it's a link penalty? "Site violates Google's quality guidelines" could also refer to on-site issues like hidden text or doorway pages. Given the information you provided, it's most likely a link-based penalty, but you never know.
2. Not sure from your description, but I almost always disavow entire domains using the domain: command instead of individual URLs. I've seen requests rejected because not enough URLs were disavowed when the entire domain should have been blocked.
3. I agree with Marie. If you've been penalized, it's generally safer to err on the side of disavowing too many domains than not enough. This isn't to say you should disavow known good links, but if links are questionable, why take the chance?
4. Also agree with Marie on submitting documentation about your removal efforts. Wrote a post about it here: http://cyrusshepard.com/penalty-lifted/
(they tend to like everything in Google Docs files. Cuts down the risk of spam)
5. Minor point, but Google likes everything formatted in a UTF-8 encoded .txt file. I've never seen one rejected because of this, but I hear it happens.
6. I'm turning into a fan of Link Detox for toxic link discovery. Instead of running it in standard mode, upload a file of your complete backlink profile from Webmaster Tools and have Link Detox check those links. Sort the final list by hand - this means check each link! For hints, read Paddy Moogan's post about low quality links: http://www.stateofsearch.com/step-by-step-guide-finding-low-quality-links/
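To make points 2 and 5 concrete, here's a minimal sketch of what such a file might look like. The domain names are made up, and this is my rough understanding of the format rather than an official template:

```
# Disavow file - save as a UTF-8 encoded .txt file.
# Lines starting with # are comments.
# Block an entire domain (usually safer than listing individual URLs):
domain:spamdirectory.example
domain:articlefarm.example
# Disavow a single URL only when the rest of the domain is fine:
http://mostlygoodsite.example/paid-links-page.html
```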
Damn, we should turn this into a blog post!
Hope this helps!
-
You know what Cyrus? I kid you not...this afternoon I suddenly got this thought that I should send you a copy once I got it finished. I know you have been involved in unnatural links cleanup. I'll be in touch!
-
Hi Marie,
Please let us all know when you finish the book!
-
Hey Marie - thanks for the details, I will let you know what else I find!
-
Yes, definitely. These need to be removed if possible and disavowed.
-
I generally create a Google Docs spreadsheet with my links, and then have columns where I enter email addresses found on the site, WHOIS addresses, and the URL of the contact form. Then I have columns next to those for reference numbers. Those reference numbers refer to a separate document in which I include the original source code of each email sent, as well as a document with screenshots of contact forms submitted. It's a pain to do all of this, but I have been successful in every single attempt at reconsideration using this method.
If you're interested, I am 95% finished writing a book on the process that I use to get rid of unnatural links penalties. You can contact me via my profile and I can send you my almost finished book at a discounted price. It does include a link to an example of the spreadsheet that I use.
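If it helps, the skeleton of that tracking sheet can be sketched in a few lines of Python as a CSV. The column names, reference-number scheme, and sample row here are just an approximation of what I described, not my actual template:

```python
# Hypothetical sketch of a link-removal tracking sheet, written as CSV text.
# Column names and the sample row are assumptions, not a real template.
import csv
import io

COLUMNS = [
    "link_url", "site_email", "whois_email", "contact_form_url",
    "email_ref", "screenshot_ref",
]


def build_tracking_sheet(rows):
    """Write removal-outreach records to CSV text, one row per link."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()


sheet = build_tracking_sheet([{
    "link_url": "http://spamdir.example/widgets",
    "site_email": "admin@spamdir.example",
    "whois_email": "owner@spamdir.example",
    "contact_form_url": "http://spamdir.example/contact",
    "email_ref": "E-001",        # points at the saved email source
    "screenshot_ref": "S-001",   # points at the contact-form screenshot
}])
print(sheet)
```

The reference-number columns are what tie each row back to the separate documents of email source code and contact-form screenshots.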
-
Do you think this type of microblogging URL would be considered spam?
http://olcine.com/index.php/steffcolbere
Should I disavow these sites as well?
-
Marie, thanks for the tip on the total disavowed; I am currently in the process of downloading my link profiles from OSE and GWT to look them over again.
As for communicating, I haven't submitted a document saying whom I have contacted. How would you suggest documenting that? Do you have an example document to share?
-
Nick, what did your request look like when you got approved? Any specifics you can share?
-
Thanks for the pointers on the number of links in GWT; I will dig in deeper and look at the trends over the last few months. As for the reconsideration request, do you have any examples of what people submitted that got approved?
-
Hi Kyle. The process is frustrating, isn't it?
I have a few thoughts for you. You mentioned that you disavowed over 40 different domains. That doesn't sound like many. Many sites that I have worked on have had hundreds and hundreds of domains that needed disavowing. It's possible that you haven't identified enough links as unnatural. In other words, it may be that Google wants you to address some links that you think could be natural but actually do go against the quality guidelines.
I've also seen sites fail at reconsideration because the disavow file was not properly formatted.
How well did you communicate your attempts at link removal to Google? If you have contacted webmasters and failed to get links removed then you need to document that well to Google.
-
I also agree with Highland: you have to submit a file that shows all your work.
-
We recovered from a manual link spam penalty about 2 weeks ago, after 5 months of cleaning up. I cleaned up about 85% of the links and submitted a recon request, and 3 days later I got a message that said the MANUAL penalty was revoked.
Even though it's been 2 weeks, we still have not seen any improvement in the rankings. We are now working on getting quality links.
I felt the same way back in January but kept at it. So clean up more if you can and give it another shot.
Good luck. I feel your pain.
Nick
-
Make it a point to check Google WMT to see if the number of external links is declining. If the number is rising or staying constant, I would check the disavow file to make sure you are indeed capturing all spam domains.
I have found that a great reinclusion request can do the trick. The request should note what your wrongdoings were, what your remediation was, the time spent, and your percentage of success. You should also apologize and promise to be good.
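One rough sketch of that disavow-coverage check, assuming you've already got the exported backlink URLs as a plain list (parsing the actual CSV export is left out, and the hostname handling is simplified):

```python
# Hypothetical sketch: list the linking domains from a fresh backlink
# export that are not yet blocked by a "domain:" line in the disavow file.
from urllib.parse import urlparse


def disavowed_domains(disavow_text):
    """Parse 'domain:' lines out of a disavow file, ignoring comments."""
    domains = set()
    for line in disavow_text.splitlines():
        line = line.strip()
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].lower())
    return domains


def _host(url):
    """Hostname of a URL, lowercased, with a leading 'www.' dropped."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host


def uncovered(backlink_urls, disavow_text):
    """Return linking domains not yet blocked by a 'domain:' line."""
    blocked = disavowed_domains(disavow_text)
    return sorted({_host(u) for u in backlink_urls} - blocked)


disavow = "# my disavow file\ndomain:spamdir.example\n"
links = ["http://spamdir.example/a", "http://www.newspam.example/b"]
print(uncovered(links, disavow))
```

If this list keeps growing between exports, new spam domains are still appearing faster than you are disavowing them.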
-
Hmm. Well, the only other thing I could recommend would be reading this post on the matter. To summarize a bit, Google wants to see what you've done to fix the problem. Document what you've done and plead your case.
-
Highland, yes, we have used it quite heavily, submitting over 40 different FULL domains, not just URLs.
-
Have you tried the disavow links tool? I know many people who have fought manual penalties and they have expressed that it's invaluable in getting rid of them.
-
I just don't think that is the right move; we still hold rankings for other pages. It just seems to be keyword/page specific somehow.
-
I know this may not be what you want to hear, but it might make sense to start over: new domain and new website. Being completely rid of the old site is a hard but sometimes necessary move.