Google Reconsideration Request - Most Efficient Process
-
Hi,
I'm working on a Google reconsideration request for a site with a longstanding penalty.
Here's what I did:
Round 1
- Downloaded a CSV of all the domains and all the pages linking to the site. Went through the lot manually and sorted them into three types: Disavow Domain, Disavow Page, Keep
- All low-quality domains were disavowed; for places like Blogspot, where only certain blogs carried low-quality links, individual pages were disavowed. I then submitted the disavow file and sent a detailed reconsideration request that included a link to it. (A rough sketch of how I build the disavow file is below.)
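For reference, here is a minimal sketch of how the disavow file could be generated from the classified CSV. The column names ("url", "action") are just placeholders for whatever your spreadsheet uses; the disavow syntax itself (comment lines starting with #, "domain:" entries for whole domains, bare URLs for single pages) is Google's documented format.

```python
# Rough sketch: build disavow.txt from the manually classified CSV.
# Assumed columns: "url" (full URL with scheme) and "action"
# (one of: Disavow Domain / Disavow Page / Keep).
import csv
from urllib.parse import urlparse

def build_disavow(csv_path, out_path="disavow.txt"):
    domains, pages = set(), []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            action = row["action"].strip().lower()
            if action == "disavow domain":
                domains.add(urlparse(row["url"]).netloc.lower())
            elif action == "disavow page":
                pages.append(row["url"].strip())
            # "keep" rows are simply skipped
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# Disavow file generated from manual link review\n")
        for d in sorted(domains):
            out.write(f"domain:{d}\n")  # disavow every link from this domain
        for p in pages:
            out.write(f"{p}\n")         # disavow this specific page only

build_disavow("linking_pages.csv")
```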
The reconsideration request was not successful. Google gave two examples of links I should remove; bizarrely, the examples they gave were already disavowed, which seemed a bit odd. I took this to mean that Google Webmaster Tools and a disavow file were not, in themselves, enough. The links I kept were largely from PRWeb syndication, which seems legitimate.
Round 2
Here's what I'm doing now. If you have any ideas for how the process below can be improved to give the maximum chance of a successful request, please let me know.
- Get all linking pages from Webmaster Tools as before, and also from MajesticSEO's Historic Index. This gave me around three times more domains to deal with. The additional domains from Majestic that weren't in Webmaster Tools all went straight into the disavow file (a quick set-difference sketch is below the list).
- Conduct a manual link removal email campaign. I've got around 2,500 domains to go through, so how can I best do this? My process at the moment is:
- Use software to get email addresses from WHOIS records (rough sketch below the list)
- Send them an email
- Make a spreadsheet of responses
- Include a link to the spreadsheet in Google Docs as well as a link to the new disavow file
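As mentioned above, here is a minimal sketch of the set-difference step for finding the Majestic domains that Webmaster Tools didn't report. The file names and the one-domain-per-line format are just assumptions about how the exports were saved.

```python
# Rough sketch: find Majestic domains not already covered by the WMT export.
# Assumes each file contains one root domain per line (adjust to your exports).
def load_domains(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

wmt = load_domains("wmt_domains.txt")
majestic = load_domains("majestic_domains.txt")

extra = sorted(majestic - wmt)  # only the domains WMT didn't report
with open("extra_disavow_domains.txt", "w", encoding="utf-8") as out:
    out.writelines(f"domain:{d}\n" for d in extra)

print(f"{len(extra)} additional domains found in Majestic only")
```

And a rough sketch of the WHOIS email step. This assumes the third-party python-whois package (pip install python-whois) purely for illustration, not any particular commercial tool; many registrars redact contact details, so expect plenty of gaps.

```python
# Rough sketch: pull registrant/contact emails from WHOIS for each domain.
import csv
import whois  # third-party "python-whois" package (assumed installed)

def harvest_emails(domains, out_csv="whois_emails.csv"):
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["domain", "emails"])
        for domain in domains:
            try:
                record = whois.whois(domain)
                emails = record.emails or []
                if isinstance(emails, str):
                    emails = [emails]
            except Exception:
                emails = []  # lookup failed or no public WHOIS data
            writer.writerow([domain, "; ".join(emails)])

harvest_emails(["example.com", "example.org"])
```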
Should I research each site manually to get email addresses? It seems rather a waste of an offshore contractor's time; from what I've seen, some people use offshore help and others have used software tools successfully. The other thing is actually sending the emails: how can I do this? No SMTP email campaign service will let me use them, because the emails are not opt-in, so they classify it as spam. Does anyone know a way to send 2,500 emails legitimately, from a webmail account for example? The irony is that I'm having to send bulk email to get rid of spam links.
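For what it's worth, here is a hedged sketch of how the sending could be scripted over plain SMTP with throttling. The host, credentials, daily cap and CSV columns are all placeholders; webmail providers have their own daily sending limits and terms, so check those before running anything like this at scale.

```python
# Rough sketch: send outreach emails in small, throttled daily batches via SMTP.
# SMTP_HOST, credentials, DAILY_CAP and the CSV columns are placeholders only.
import csv
import time
import smtplib
from email.message import EmailMessage

SMTP_HOST, SMTP_PORT = "smtp.example.com", 587
USER, PASSWORD = "me@example.com", "app-password"
DAILY_CAP, PAUSE_SECONDS = 400, 30  # stay well under the provider's limits

def send_batch(contacts_csv):
    sent = 0
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
        server.starttls()
        server.login(USER, PASSWORD)
        with open(contacts_csv, newline="", encoding="utf-8") as f:
            # assumed columns: email, domain, link_url
            for row in csv.DictReader(f):
                if sent >= DAILY_CAP:
                    break
                msg = EmailMessage()
                msg["From"] = USER
                msg["To"] = row["email"]
                msg["Subject"] = f"Link removal request for {row['domain']}"
                msg.set_content(
                    f"Hi,\n\nCould you please remove the link to our site on "
                    f"{row['link_url']}? Thank you.\n"
                )
                server.send_message(msg)
                sent += 1
                time.sleep(PAUSE_SECONDS)  # throttle between messages
    return sent
```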
Most of the offending links have keyword anchor text from spun articles, and I've deleted the articles from all the sites except EzineArticles. Would you delete those too? It's an awful site, but the client is hung up on it. The EzineArticles links may have some value; on the other hand, it's more of the same keyword-rich anchor text articles. Keep them, or disavow the individual pages?
Finally, is there anything else I've missed? Anything to add? Thanks for all your help.
-
I personally do everything manually. I think the link removal tools can work great for some sites, but your best chance at identifying the bad links and keeping the good ones is to look at them manually. 2,500 domains is a lot, but not impossible. I'm currently working on an account of about that size, and it will take me about 10-14 days to go through them all. Once you get going you will recognize patterns and it will go faster.
I used to track down email addresses on my own, but I have just hired someone to do it for me; I find that the automated tools miss a lot of them. I considered hiring from oDesk or Mechanical Turk, but in my situation, because my business is expanding and most of what I do is penalty removal, it's worth my while to hire and train someone in-house.
By the way, if you've got 2,500 domains, you won't need to send 2,500 emails. Many of the links will be offline or nofollowed, or perhaps even natural.
EzineArticles links definitely need to be removed if they are followed links. Oftentimes those links are nofollowed, but if you have a high enough account level there, then they are followed and need to go.
A few other points:
- Yes, you're right: it's not enough to just disavow. Google is going to want to see evidence that you've tried hard to remove links.
- Lately I have only been using links from WMT and not other sources like Majestic and Ahrefs. That may cut down on the number of domains you have to deal with. So far it is working for me.
Hope that helps!