5th Reconsideration Request, Have i missed anything...
-
Hi Guys,
I wonder if any of you can help me out. I'll shortly be submitting another reconsideration request to Google. I've been working on removing bad/spammy links to our site http://goo.gl/j7OpL over the past six months, and so far every reconsideration request I have submitted has been knocked back with the following message:
---
Dear site owner or webmaster of http://goo.gl/j7OpL,
We received a request from a site owner to reconsider http://goo.gl/j7OpL for compliance with Google's Webmaster Guidelines.
We've reviewed your site and we still see links to your site that violate our quality guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
---
I've removed over 70% of all our links. We had some large sitewide links on big sites with exact-match anchor text for our main money keyword, and I've also removed a large link network that our previous SEO company set up.
Today I completed an overhaul of all our internal links; nearly every blog post we added to the site had a link back to the home page with exact-match money-keyword anchor text.
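For anyone facing a similar internal-link cleanup, the audit step can be scripted. This is only a rough sketch: the money keyword and home-page URLs are hypothetical placeholders, and it assumes each post is available as raw HTML.

```python
from html.parser import HTMLParser

MONEY_KEYWORD = "blue widgets"  # hypothetical example keyword
HOME_URLS = {"/", "http://www.example.com/"}  # hypothetical home-page URLs


class AnchorAudit(HTMLParser):
    """Collect (href, anchor text) pairs for every <a> tag."""

    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []  # list of (href, text) tuples

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None


def exact_match_home_links(html):
    """Return links whose anchor text is exactly the money keyword
    and whose href points at the home page."""
    parser = AnchorAudit()
    parser.feed(html)
    return [(h, t) for h, t in parser.links
            if t.lower() == MONEY_KEYWORD and h in HOME_URLS]


post = '<p>See our <a href="/">blue widgets</a> and <a href="/blog/">blog</a>.</p>'
print(exact_match_home_links(post))  # [('/', 'blue widgets')]
```

Running it over every post's HTML gives you a checklist of the anchors that still need rewriting or removing.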
One thing I did notice: when we got hit by the penalty, it didn't affect every keyword we target, just our main and most competitive keyword. Some of our other keywords took a dip in rankings, but not as much as our main keyword.
When I submit our next reconsideration request I'll also attach a spreadsheet of links that I can't remove, either because I can't find any contact details (the WHOIS data is hidden behind a privacy service) or because I'm simply not getting a response when I email the site owners.
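That spreadsheet can double as an outreach log. A minimal sketch of generating it as a CSV attachment, with entirely made-up example links and outcomes:

```python
import csv
import io

# Hypothetical outreach records: (link URL, removal attempts, outcome)
unremoved = [
    ("http://spammy-directory.example/widgets", 3, "no response to e-mail"),
    ("http://old-network.example/page", 1, "WHOIS contact hidden by privacy service"),
]


def write_outreach_log(rows, fh):
    """Write a reconsideration-request attachment listing each link we
    could not get removed, how often we tried, and why it failed."""
    writer = csv.writer(fh)
    writer.writerow(["Link URL", "Removal attempts", "Outcome"])
    writer.writerows(rows)


buf = io.StringIO()
write_outreach_log(unremoved, buf)
print(buf.getvalue().splitlines()[0])  # Link URL,Removal attempts,Outcome
```

One row per unremovable link, kept up to date as each contact attempt is made, gives the reviewer concrete evidence of effort rather than a bare claim.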
If anyone can point out anything else I might have missed, that would be great.
Thanks,
Scott
-
Ryan's given you a super generous answer! I wanted to add a couple of things:
You mentioned that you will attach a list of links that you couldn't get removed. It may help to go even further. What I usually do is attach a document that contains a copy of each email that I have sent for sites that I was unsuccessful with. And, if I got a negative response back I would include that email as well.
I also include screenshots of every contact form that I have submitted. It may be overkill but from Google's perspective if you just say, "I tried to contact them" that's not enough.
You're probably already doing this, but be super humble in your request and make sure that you tell Google you are committed to following the quality guidelines from this point on. I think part of the reason Google makes webmasters go through this is that they want to be sure webmasters understand the gravity of trying to game the system with SEO tactics.
And like Ryan said...be really tough on yourself when it comes to links. I have seen a number of webmasters who say, "NO! That's not an unnatural link! It came from an article that I wrote", or something like that. But in reality, almost every link that you have had a hand in creating is one that Google considers unnatural.
Good luck! If you are successful, it would be great for you to post about your success here in the Q&A to encourage others.
Marie
-
Great answer yet again Ryan.
Thanks for your detailed response.
Thanks,
Scott
-
Hi Scott,
Removing a manual penalty for manipulative links is a complex task. The result for most people is to repeatedly have the Reconsideration Request declined. If you try another five times, the results are not likely to change. At a high level, there is likely an error in one of three areas:
1. You need to use a comprehensive list of all known backlinks to your site. Using the list from Google alone is not even close to enough. I use Google WMT + OSE + Raven (Majestic) + Ahrefs + SEMrush + Bing. If you do not start with a comprehensive list of links, you will continue to miss manipulative links and Google will not pay any attention to your Reconsideration Request.
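Consolidating exports from several tools is tedious by hand, so here is a rough sketch of the merge step. Assumptions: each tool's CSV export lists the linking URL in its first column (real export formats differ per tool), and the root-domain extraction is deliberately naive (it gets suffixes like .co.uk wrong; a public-suffix list would be needed in practice).

```python
import csv
from urllib.parse import urlparse


def load_links(path):
    """Read one tool's CSV export; assumes the linking URL is in the first column."""
    with open(path, newline="") as fh:
        reader = csv.reader(fh)
        next(reader, None)  # skip the header row
        return [row[0] for row in reader if row and row[0].startswith("http")]


def root_domain(url):
    """Naive registrable domain: the last two host labels.
    (Wrong for suffixes like .co.uk -- use a public-suffix list in practice.)"""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])


def merge_exports(urls_per_tool):
    """Union the link lists from every tool, deduplicated by URL,
    and grouped by linking root domain for manual review."""
    by_domain = {}
    for urls in urls_per_tool:
        for url in urls:
            by_domain.setdefault(root_domain(url), set()).add(url)
    return by_domain


wmt = ["http://a.example.com/page1", "http://a.example.com/page2"]
ose = ["http://a.example.com/page1", "http://b.example.org/post"]
merged = merge_exports([wmt, ose])
print(sorted(merged))  # ['example.com', 'example.org']
```

Grouping by domain matters because review and outreach happen per site, not per URL, and it makes it obvious which domains dominate the profile.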
2. You need to ensure your idea of a manipulative link is calibrated with Google's. The process begins with being intimately familiar with Google's guidelines. A few questions to ask for each link:
- If search engines did not exist, would this link be here?
- Who created the link and the content? If the link was created by the site owner, it would likely be considered manipulative.
- How credible is the site? The web page? The content? Is it focused on a specific topic or a grab bag?
- What value does this link and page offer to users?
The above list is not comprehensive, and there are other factors to weigh. There are corner cases as well. What I can share is the PA and DA of the pages involved should not be given any consideration at all. Additionally, there is not any automated tool which can be used for making an organic vs manipulative link determination. I have reviewed several and, to put it nicely, they seem to offer completely false hope to desperate site owners.
3. You need to make a solid, good-faith effort to contact linking sites to request the links be removed. Do not simply change anchor text as that does not make the link any less manipulative. Don't give up simply because the WHOIS e-mail is not valid. Try the WHOIS e-mail, the site e-mail and the Contact Form (if any) on the site. If a site owner denies your link removal request the first time, respond to them in a very polite manner and ask in a different way.
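To help with that outreach, the contact-discovery step can be partly scripted. A rough stdlib-only sketch that pulls candidate e-mail addresses out of a page's fetched HTML; the regex is a loose pattern, the example addresses are invented, and real pages often obfuscate addresses ("name [at] domain"), so treat it only as a starting point before falling back to contact forms:

```python
import re

# Loose e-mail pattern; intentionally permissive, not RFC-complete.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")


def find_contact_emails(html):
    """Return unique e-mail addresses found in page HTML (including those
    inside mailto: links), in order of first appearance."""
    seen, found = set(), []
    for match in EMAIL_RE.findall(html):
        addr = match.lower()
        if addr not in seen:
            seen.add(addr)
            found.append(addr)
    return found


page = '<a href="mailto:owner@example.net">Contact</a> or editor@example.net'
print(find_contact_emails(page))  # ['owner@example.net', 'editor@example.net']
```

Anything it finds still needs a hand-written, polite request; the script only saves the time spent hunting for an address.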
I have been involved with the Reconsideration Request for numerous clients in your situation. Items 1 & 2 are the most common issues and they are show stoppers.
Good Luck.
-