Google Reconsideration - Denied for the Third Time
-
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received the penalty in February 2012, before I started.
Since then we have manually removed links, contacted webmasters for link removal, blocked over 40 different domains via the disavow tool, and requested reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines."
So we regrouped and did more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website. We decided to delete those pages, serve a 404 error in their place, and create new pages with new URLs. At first everything looked good: the new pages were ranking and gaining page authority, and the old pages were gone from the index. So we submitted for reconsideration a third time and got the exact same response!
I don't know what else to do. I have done everything I could think of, with the exception of deleting the whole site.
Any advice would be greatly appreciated.
Regards - Kyle
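A side note on the page-removal step described above: if the old URLs are meant to stay dead, the server can say so explicitly. A sketch assuming Apache with mod_alias (the paths are placeholders, not the site's real URLs); a 410 Gone tends to drop pages from the index a little faster than a plain 404:

```apache
# Serve "410 Gone" for the deleted pages so crawlers know the
# removal is permanent (the paths below are placeholders):
Redirect gone /old-spammed-page-1
Redirect gone /old-spammed-page-2
```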
-
Kyle, interesting... I thought that wasn't possible. Do you have a lot of high-quality backlinks left, or did you start building new ones?
-
Hi Traian - I would have to disagree.
With the advice mentioned above from both Cyrus and Marie, I was able to get the penalty lifted, and whether it was normal or not, my rankings and traffic have bounced back to exactly where they were before the penalty, if not higher.
-
When you remove those backlinks, you don't get to rank as you did before the penalty. Those bad links were propping you up, and now you need to work your way up again, only this time way, way more carefully than before...
-
Good job Kyle! I would take things one step further though. It is likely not enough to just disavow the domains. If you haven't already done so, make sure that you make efforts to manually get these links removed and then communicate this to Google.
-
Wow, thank you all so much for your help!
I just spent all day digging through the full link profile (combined from GWT and OSE), and I have more than doubled the domains in my disavow list. I have been surprised by the number of domains that have simply expired over the course of this project.
Also, thank you for the pointer on hosting the process files on Google Docs for the reconsideration - I was wondering where I would keep them!
I'll keep you all up-to-date, and I'd be happy to contribute to a blog post if you guys would like, too.
Thanks again - Kyle
-
Hi Kyle,
First of all, I can't wait for Marie's book! I don't want to recover any ground that she's already gone over, so I'll just share a few thoughts.
1. Has Google verified it's a link penalty? "Site violates Google's quality guidelines" could also refer to on-site issues like hidden text or doorway pages. Given the information you provided, it's most likely a link-based penalty, but you never know.
2. Not sure from your description, but I almost always disavow entire domains using the domain: command instead of individual URLs. I've seen requests rejected because they disavowed too few URLs when they should have blocked the entire domain.
3. I agree with Marie. If you've been penalized, it's generally safer to err on the side of disavowing too many domains than not enough. This isn't to say you should disavow known good links, but if links are questionable, why take a chance?
4. Also agree with Marie on submitting documentation about your removal efforts. Wrote a post about it here: http://cyrusshepard.com/penalty-lifted/
(They tend to like everything in Google Docs files - it cuts down the risk of spam.)
5. Minor point, but Google likes everything formatted in a UTF-8 encoded .txt file. I've never seen one rejected because of this, but I hear it happens.
6. I'm turning into a fan of Link Detox for toxic link discovery. Instead of running it in standard mode, upload a file of your complete backlink profile from Webmaster Tools and have Link Detox check those links. Sort the final list by hand - this means check each link! For hints, read Paddy Moogan's post about low quality links: http://www.stateofsearch.com/step-by-step-guide-finding-low-quality-links/
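Pulling points 2 and 5 together: a disavow file is just a UTF-8 plain-text .txt file with one rule per line. The domains below are placeholders, not real sites:

```text
# Lines starting with "#" are comments.
# Block an entire domain (usually safer than individual URLs):
domain:spammy-directory.example
domain:cheap-links.example
# Disavow a single URL only when the rest of the domain is fine:
http://borderline-site.example/paid-links-page.html
```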
Damn, we should turn this into a blog post!
Hope this helps!
-
You know what, Cyrus? I kid you not... this afternoon I suddenly got the thought that I should send you a copy once I get it finished. I know you have been involved in unnatural link cleanup. I'll be in touch!
-
Hi Marie,
Please let us all know when you finish the book!
-
Hey Marie - thanks for the details, I will let you know what else I find!
-
Yes, definitely. These need to be removed if possible and disavowed.
-
I generally create a Google Doc spreadsheet with my links and then have columns where I enter email addresses found on the site, WHOIS addresses, and the URL of the contact form. Then I have columns next to those for reference numbers. Those reference numbers refer to a separate document in which I include the original source code of each email sent, as well as a document with screenshots of contact forms submitted. It's a pain to do all of this, but I have been successful in every single attempt at reconsideration using this method.
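The tracking sheet described above can be sketched as a CSV. A minimal sketch in Python; the column names follow the description, and the file name and example rows are placeholders, not the actual template:

```python
# Sketch of a link-removal tracking sheet as a CSV.
# Column names follow the description above; adapt as needed.
import csv

COLUMNS = [
    "link_url",            # the unnatural link
    "email_on_site",       # contact email found on the linking site
    "whois_email",         # email from the WHOIS record
    "contact_form_url",    # URL of the site's contact form, if any
    "email_ref_no",        # reference number into the sent-emails doc
    "screenshot_ref_no",   # reference number into the screenshots doc
]

def write_tracking_sheet(path, rows):
    """Write one row per link you attempted to get removed.

    Missing columns in a row are written as empty cells.
    """
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)
```

The reference-number columns are what tie each row back to the separate documents of sent emails and contact-form screenshots.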
If you're interested, I am 95% finished writing a book on the process that I use to get rid of unnatural links penalties. You can contact me via my profile and I can send you my almost finished book at a discounted price. It does include a link to an example of the spreadsheet that I use.
-
Do you think this type of microblogging URL would be considered spam:
http://olcine.com/index.php/steffcolbere
Should I disavow these sites as well?
-
Marie, thanks for the tip on the total disavowed. I am currently in the process of downloading my link profiles from OSE and GWT to look over them again.
As for communicating, I haven't submitted a document saying who I have contacted. How would you suggest documenting that? Do you have an example document to share?
-
Nick, what did your request look like when you got approved? Any specifics you can share?
-
Thanks for the pointers on the # of links in GWT; I will dig in deeper and look at the trends over the last few months. As for the reconsideration request, do you have any examples of what people submitted that got approved?
-
Hi Kyle. The process is frustrating, isn't it?
I have a few thoughts for you. You mentioned that you disavowed over 40 different domains. That doesn't sound like many. Many sites that I have worked on have had hundreds and hundreds of domains that needed disavowing. It's possible that you haven't identified enough links as unnatural. In other words, it may be that Google wants you to address some links that you think could be natural but actually do go against the quality guidelines.
I've also seen sites fail at reconsideration because the disavow file was not properly formatted.
How well did you communicate your attempts at link removal to Google? If you have contacted webmasters and failed to get links removed then you need to document that well to Google.
-
I also agree with Highland: you have to submit a file that shows all your work.
-
We recovered from a manual link spam penalty about 2 weeks ago, after 5 months of cleaning up. I cleaned about 85% of the links and submitted a reconsideration request, and 3 days later I got a message that said the MANUAL penalty was revoked.
Even though it's been 2 weeks, we still have not seen any improvement in the rankings. We are now working on getting quality links.
I felt the same way back in January but kept at it. So clean up more if you can and give it another shot.
Good luck - I feel your pain.
Nick
-
Make it a point to check Google WMT to see if the number of external links is declining. If the number is rising or staying constant, I would check the disavow file to make sure you are indeed capturing all spam domains.
I have found that a great reinclusion request can do the trick. The request should note what your wrongdoings were, what your remediation was, the time spent, and the percentage of success. You should also apologize and promise to be good.
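The check described above, making sure the disavow file actually captures every spam domain in your link export, is easy to script. A minimal sketch in Python; the domain: matching here mirrors how a domain rule is generally understood to cover subdomains, and the exact file layouts are assumptions to adapt to your exports:

```python
# Cross-check a backlink export against a disavow file and report
# linking domains that no "domain:" rule covers yet.
from urllib.parse import urlparse

def disavowed_domains(disavow_lines):
    """Collect domains from 'domain:' rules in a disavow file."""
    domains = set()
    for line in disavow_lines:
        line = line.strip()
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].lower())
    return domains

def uncovered(backlink_urls, disavow_lines):
    """Return linking hosts not covered by any 'domain:' rule."""
    covered = disavowed_domains(disavow_lines)
    missing = set()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        # Treat 'www.example.com' as covered by 'domain:example.com'.
        if not any(host == d or host.endswith("." + d) for d in covered):
            missing.add(host)
    return missing
```

Feed it the lines of your disavow file and the URL column of a GWT or OSE export; anything it returns is a domain you still need to either get removed, disavow, or consciously keep.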
-
Hmm. Well, the only other thing I could recommend would be reading this post on the matter. To summarize a bit, Google wants to see what you've done to fix the problem. Document what you've done and plead your case.
-
Highland, yes, we have used it quite heavily, submitting over 40 different FULL domains, not just URLs.
-
Have you tried the disavow links tool? I know many people who have fought manual penalties and they have expressed that it's invaluable in getting rid of them.
-
I just don't think that is the right move. We still hold rankings for other pages; it just seems to be keyword/page-specific somehow.
-
I know this may not be what you want to hear, but it might make sense to start over: new domain and website. Getting completely rid of the old site is a hard but necessary move.