Google Reconsideration Request - Most Efficient Process
-
Hi,
I'm working on a Google reconsideration request for a site with a longstanding penalty.
Here's what I did:
Round 1
- Downloaded a CSV of all the domains and all the pages linking to the site. Went through the lot manually and sorted each into three types: Disavow Domain, Disavow Page, or Keep.
- All low-quality domains were disavowed, as were individual pages (for example on Blogspot) where only certain blogs carried low-quality links. I submitted the disavow file, then sent a detailed reconsideration request including a link to it.
The reconsideration request was not successful. Google gave two examples of links I should remove; bizarrely, both examples were already in the disavow file, which seemed a bit odd. I took this to mean that Google Webmaster Tools and disavow files are not, in themselves, enough. The links I kept were largely from PRWeb syndication, which seems legitimate.
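For anyone doing a similar triage, here is a rough sketch of how that classified CSV can be turned into a disavow file programmatically. The `disavow_domain`/`disavow_page`/`keep` labels and the example URLs are my own naming, not anything Google requires; the `domain:` prefix and `#` comment lines are the documented disavow-file syntax.

```python
from urllib.parse import urlparse

def build_disavow(rows):
    """Turn (url, decision) pairs into disavow.txt lines."""
    lines = ["# Disavow file built from manual link review"]
    seen = set()
    for url, decision in rows:
        if decision == "disavow_domain":
            entry = "domain:" + urlparse(url).netloc
        elif decision == "disavow_page":
            entry = url
        else:  # "keep" - leave the link alone
            continue
        if entry not in seen:  # avoid duplicate lines in the file
            seen.add(entry)
            lines.append(entry)
    return "\n".join(lines)

# Hypothetical classified rows, as they might come out of the CSV review
rows = [
    ("http://spammy-directory.example/page1", "disavow_domain"),
    ("http://someblog.blogspot.com/bad-post.html", "disavow_page"),
    ("http://www.prweb.example/release", "keep"),
]
print(build_disavow(rows))
```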
Round 2
Here's what I'm doing now. If you have any ideas for how the process below can be improved to give the maximum chance of a successful request, please let me know.
- Get all linking pages from Webmaster Tools as before, and also from MajesticSEO's Historic Index. This gave me around three times more domains to deal with. The additional domains from Majestic that weren't in Webmaster Tools all went straight into the disavow file.
- Conduct a manual link removal email campaign. I've got around 2,500 domains to go through, so how can I best do this? My process at the moment is:
- Use software to get email addresses from WHOIS records
- Send each site a removal request email
- Make a spreadsheet of responses
- Include a link to the spreadsheet in Google Docs, as well as a link to the new disavow file
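The Majestic/WMT merge in the first step above boils down to a set difference once each export has been reduced to a plain list of root domains; a minimal sketch (the domain names are made up):

```python
def extra_domains(wmt_domains, majestic_domains):
    """Domains the Majestic export found that WMT did not report."""
    return sorted(set(majestic_domains) - set(wmt_domains))

# Hypothetical exports, already reduced to root domains
wmt = {"linker-a.example", "linker-b.example"}
majestic = {"linker-a.example", "linker-b.example", "hidden-spam.example"}

# Only the Majestic-exclusive domains need to be appended to the disavow file
print(extra_domains(wmt, majestic))  # ['hidden-spam.example']
```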
Should I research each site manually to get email addresses? That seems rather a waste of an offshore worker's time; from what I've seen, some people use offshore staff and others have used software tools successfully. The other question is sending the emails: how can I do this? No SMTP email campaign service will let me use their platform, because the addresses are not opt-in and they classify it as spam. Does anyone know a way to send 2,500 emails legitimately, from a webmail account for example? I'm only sending bulk email to get rid of spam links.
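On the sending problem: whatever account you use, you will need to throttle to stay under the provider's daily cap. Here is a hedged sketch of the batching side, assuming a Gmail-style limit of roughly 500 messages a day; the cap, addresses, and message text are placeholders, the send helper is defined but never called here, and none of this says anything about whether a given provider's terms of service allow it.

```python
import smtplib
from email.message import EmailMessage

DAILY_CAP = 400  # placeholder: stay safely under the provider's daily limit

def batches(addresses, per_day=DAILY_CAP):
    """Split the outreach list into one batch per sending day."""
    return [addresses[i:i + per_day] for i in range(0, len(addresses), per_day)]

def send_removal_request(server, sender, recipient, site):
    """Send one removal request over an already-authenticated SMTP session."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Link removal request for %s" % site
    msg.set_content("Hi, could you please remove the link to %s? ..." % site)
    server.send_message(msg)  # server would be an smtplib.SMTP_SSL session

# Hypothetical 2,500-address outreach list
contacts = ["webmaster%d@example.invalid" % i for i in range(2500)]
days = batches(contacts)
print(len(days), "sending days at", DAILY_CAP, "per day")  # 7 sending days
```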
Finally, most of the offending links have keyword anchor text from spun articles. I've deleted all the source sites except EzineArticles. Would you delete this one too? It's an awful site, but the client is hung up on it. The EzineArticles links may have some value; on the other hand, they're more of the same keyword-rich anchor-text articles. Keep them, or disavow the individual pages?
Lastly, is there anything else I've missed or anything to add? Thanks for all your help!
-
I personally do everything manually. I think the link removal tools can work great for some sites, but your best chance at identifying the bad links while keeping the good ones is to look at them manually. 2,500 domains is a lot, but not impossible. I'm currently working on an account of about that size, and it will take me about 10-14 days to go through that many. Once you get going you will recognize patterns, and it will go faster.
I used to gather email addresses on my own, but I have just hired someone to do this for me; I find that the automated tools miss a lot of them. I considered hiring from oDesk or Mechanical Turk, but in my situation, because my business is expanding and most of what I do is penalty removal, it's worth my while to hire and train someone to do this for me.
By the way, if you've got 2,500 domains, you won't end up with 2,500 emails to send. Many of the links will be offline or nofollowed, or perhaps even natural.
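That point is worth automating: before emailing anyone, each linking page can be checked to see whether the link is even still there, and whether it is followed. A deliberately naive sketch that classifies a link inside already-fetched HTML (a real run would need the HTTP fetch, redirects, and error handling first, and a regex is no substitute for a proper parser):

```python
import re

def link_status(html, target):
    """Return 'absent', 'nofollow', or 'followed' for links to target."""
    anchors = re.findall(r"<a\s[^>]*>", html, flags=re.I)
    status = "absent"
    for a in anchors:
        if target in a:
            # A nofollowed link needs no removal email
            if re.search(r'rel=["\'][^"\']*nofollow', a, flags=re.I):
                return "nofollow"
            status = "followed"
    return status

# Hypothetical fetched page containing an already-nofollowed backlink
page = '<p><a href="http://mysite.example/" rel="nofollow">link</a></p>'
print(link_status(page, "mysite.example"))  # nofollow
```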
EzineArticles links definitely need to be removed if they are followed links. Often those links are nofollowed, but if you have a high enough account level there, they are followed and need to go.
A few other points:
- Yes, you're right. It's not enough just to disavow. Google is going to want to see evidence that you've tried hard to remove the links.
- Lately I have only been using links from WMT and not other sources like Majestic and Ahrefs. That may cut down on the number of domains you have to deal with. So far it is working for me.
Hope that helps!