Manual Removal Request Versus Automated Request to Remove Bad Links
-
Our site has several hundred toxic links. We would prefer that the webmasters remove them rather than submitting a disavow file to Google.
Are we better off writing webmasters repeatedly to get the links removed? If someone monitors the removals and keeps writing the webmasters, will that ultimately get better results than using an automated program like Link Detox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask?
I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome.
Does anyone have experience with this? Basically, how do you get the highest compliance rate for link removal requests?
Thanks, Alan
-
I agree with Moosa here. When we went through this, we used Link Detox to help identify the links we wanted to remove or disavow and RMOOV to send an automated email campaign. The response rate was less than 5% as I recall, and it usually took multiple emails to get any response at all.
That's the nice thing about the tools: they track success for you. It's also a really good idea to use a "throwaway" email address, as many of these messages may be reported by the recipients as spam and get your email account added to spam filters. I think the personal touch is more for outreach. Not worth the effort here.
Best!
-
Alan, if I were in your place, I would go with a program like Link Detox instead of manual labor, and here are some reasons why:
- You are emailing real people, so no matter what approach you use, there is a chance you will fail, especially if they have decided not to remove the links.
- The removal rate can increase dramatically if you offer a small payment to remove a link, but again, disavow is an easier option that will save you time and money.
- Manual labor on a task that might or might not work is a bad investment in my opinion; it will also be much more expensive than a tool like Link Detox.
Link Detox will identify the bad links, email the webmasters, and give you a list of the bad links pointing to your site. You can take that data, create a disavow file, and submit it to Google.
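For reference, the disavow file mentioned here is just a plain UTF-8 text file in the format Google documents for its Disavow Links tool: one entry per line, either a full URL or a `domain:` entry, with `#` marking comments. The domains below are made-up examples, not real links:

```text
# Disavow file for example-site.com
# Webmaster at spammy-directory.example never replied to three removal requests
domain:spammy-directory.example

# Single bad page; the rest of this site is fine
http://some-blog.example/spun-article-123.html
```

A `domain:` line disavows every link from that host, while a plain URL line disavows only links from that one page. You upload the finished file through the Disavow Links tool in Webmaster Tools.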
All in all, I understand your point, but in my opinion the manual route is not a very good investment.
Hope this helps!
-
Hi Alan
When I pull links, I do so from WMT, Majestic, OSE, and Ahrefs.
Reason being, you're going to see different links from different tools. No one source covers them all, so it's best to get as much data as you can from different places.
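To illustrate combining exports from several tools, here is a minimal Python sketch. The function names and the toy URL lists are mine, not from any of these tools; the idea is just to union the lists and collapse trivial duplicates (scheme/host case, trailing slash) so each link appears once:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Lowercase the scheme and host (case-insensitive per RFC 3986)
    and drop a trailing slash so trivial variants collapse together."""
    parts = urlsplit(url.strip())
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path,
                       parts.query, parts.fragment))

def merge_link_sources(*sources):
    """Union the URL lists exported from several backlink tools, deduplicated."""
    return sorted({normalize(url) for urls in sources for url in urls})

# Toy data standing in for WMT / Ahrefs / Majestic exports.
gwt = ["http://spam.example/page1/", "HTTP://Spam.Example/page1"]
ahrefs = ["http://spam.example/page1", "http://other.example/dir"]
print(merge_link_sources(gwt, ahrefs))
```

In practice you would load each tool's CSV export into a list of URLs first; the merge itself is the same.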
I will read into Link Detox and tell you if anything is a red flag to me, but again, your statement from the other question thread seems like a lot of money for automation and "too good to be true".
Please let me know if you have any more questions or comments - would love to help where I can and see you through! Best of luck!
-
Hi Patrick:
Thanks for your in-depth response!! The expedited-processing tool in Link Detox is described here: http://www.linkdetox.com/boost.
But if Google will now process disavow files within a few months, as the Moz blog post you refer to states, I guess there is no point in using Boost.
Our site never received a manual penalty from Google but did drop in ranking after the first Penguin update in April 2012. Recovery since then has been sporadic and uneven despite a major investment in SEO.
I have pretty much followed the procedure you describe. The only deviation is that I compiled the links from Google Webmaster Tools plus the Link Detox database. I wonder if we are missing a significant number of links by not sourcing Ahrefs and Moz. If I can identify 80-90% of the bad links, I think that is sufficient. I don't expect to remove 100% of them.
Thanks again for your assistance!!
Alan
-
Hi there
Based on some previous work I have done, webmasters are substantially more responsive to manual outreach and can definitely tell the difference.
Always include:
- Their name, both in the subject line and the greeting (I like "Attn: (name) / Link Removal Request")
- Their site's domain name
- Links to pages with examples of your link
- A thank-you for their time
- A signature with proper contact information
Always respond to emails - good, bad, or indifferent - people respond to a real human being. Thank them for removals, kindly respond to apprehension or irritability, and answer (within reason) any questions they may have. Do not be hostile back. I would usually send three emails:
1. Stating my reason for reaching out and where my link is located.
2. If I didn't hear back, about four days later I would follow up, again letting them know where my link is located.
3. If I didn't hear back, about 3-5 days later I would let them know that this would be my last email before disavowing their link. Usually, I didn't make it to three. Remember to document and keep records of your outreach in case you somehow get a manual action - you'll need it.
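The three-email cadence above is easy to track in a spreadsheet or a small script. Here is a minimal Python sketch; the function name, the four-day gaps, and the action strings are my assumptions based on the schedule described, not part of any tool:

```python
from datetime import date, timedelta

# Waits before sending email 2 and email 3 (assumed from the cadence above).
FOLLOW_UP_GAPS = [timedelta(days=4), timedelta(days=4)]

def next_action(emails_sent, last_sent, today=None):
    """Return the next step for one webmaster contact in the
    three-email outreach sequence, then fall back to the disavow file."""
    today = today or date.today()
    if emails_sent == 0:
        return "send initial request"
    if emails_sent >= 3:
        return "add to disavow file"
    if today - last_sent >= FOLLOW_UP_GAPS[emails_sent - 1]:
        return f"send follow-up {emails_sent + 1}"
    return "wait"
```

Running this over a log of (URL, contact, emails sent, last-sent date) rows each morning tells you exactly who is due for a follow-up, and the log itself doubles as the outreach documentation you would need for a reconsideration request.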
Here is a great link removal resource: Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz).
Always consider disavow files a tool and a friend - they do work. If you can't get links removed and you fear a manual action, they will be your next line of defense - especially if you are dealing with hundreds of bad links.
Take the time to manually reach out to webmasters if you can - it will pay off. I also want to suggest LinkRisk as another tool to look into for your link audits and outreach. It has been a big help for me.
Hope this helps! Good luck!