Manual Removal Request Versus Automated Request to Remove Bad Links
-
Our site has several hundred toxic links. We would prefer that the webmaster remove them rather than submitting a disavow file to Google.
Are we better off writing webmasters over and over again to get the links removed? If someone is monitoring the removal and keeps writing the webmasters, will this ultimately get better results than using an automated program like Link Detox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask?
I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome.
Does anyone have experience with this? Basically how to get the highest compliance rate for link removal requests?
Thanks, Alan
-
I agree with Moosa here. When we went through this, we used Link Detox to help identify the links we wanted to remove/disavow and RMOOV to send an automated email campaign. The response rate was less than 5% as I recall, and it usually took multiple emails to get a response at all.
This is the nice thing about the tools: they track success for you. It's also a really good idea to use a "throwaway" email address, as many of these emails may be reported by the recipients as spam and get your account added to spam filters. I think the personal touch is more for outreach. Not worth the effort here.
Best!
-
Alan, if I were in your place, I would go with a program like Link Detox instead of manual labor, and here are some reasons why:
- You are emailing real people, so no matter what approach you use, there is a chance you will fail, especially if they have decided not to remove the links.
- The removal rate can increase dramatically if you offer a small payment to remove a link, but again, disavowing is an easier option that will save you time and money.
- Manual labor on something that might or might not work is a bad investment in my opinion; on top of that, it will be much more expensive compared to a tool like Link Detox.
Link Detox will find bad links, email the webmasters, and give you a list of the bad links pointing to your site. You can take that data, create a disavow file, and submit it to Google.
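For reference, a disavow file is just a plain-text file of `domain:` lines and/or individual URLs, with `#` comments. Here is a minimal sketch of generating one from an audit list; the domain and URL values are placeholders, not real audit data:

```python
# Sketch: build a Google disavow file from a link audit.
# The domains/URLs below are placeholders; substitute your own audit results.
toxic_domains = ["spammy-directory.example", "link-farm.example"]
toxic_urls = ["http://blog.example/bad-page.html"]

lines = ["# Disavow file generated from link audit"]
lines += [f"domain:{d}" for d in toxic_domains]  # disavow an entire domain
lines += toxic_urls                              # disavow a single URL

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting `disavow.txt` is what you upload through Google's disavow links tool.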
All in all, I understand your point but in my opinion it is not a very good investment.
Hope this helps!
-
Hi Alan
When I pull links, I do so from WMT, Majestic, OSE, and Ahrefs.
Reason being, you're going to see different links from different tools. No one source covers them all, so it's best to get as much data as you can from different places.
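Combining the exports can be sketched roughly like this, assuming each tool gives you a CSV with a URL column (the column name varies by tool, so it's passed in):

```python
import csv
from urllib.parse import urlparse

def load_urls(path, url_column):
    """Read one tool's CSV export; the URL column name varies by tool."""
    with open(path, newline="") as f:
        return [row[url_column] for row in csv.DictReader(f)]

def merged_domains(url_lists):
    """Deduplicate linking domains across all sources by hostname."""
    domains = set()
    for urls in url_lists:
        for url in urls:
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]          # treat www and bare host as one domain
            if host:
                domains.add(host)
    return sorted(domains)
```

This gives one deduplicated domain list to audit, rather than four overlapping reports.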
I will read into Link Detox and tell you if anything is a red flag to me, but again, your statement from the other question thread seems like a lot of money for automation and "too good to be true".
Please let me know if you have any more questions or comments - would love to help where I can and see you through! Best of luck!
-
Hi Patrick:
Thanks for your in-depth response!! The expedite tool in Link Detox is described here: http://www.linkdetox.com/boost.
But if Google will now process disavow files within a few months, as the Moz blog post you refer to states, I guess there is no point in using Boost.
Our site never received a manual penalty from Google but did drop in ranking after the first Penguin update in April 2012. Recovery since then has been sporadic and uneven despite a major investment in SEO.
I have pretty much followed the procedure you describe. The only deviation is that I compiled the links from Google Webmaster Tools plus the Link Detox database. I wonder if we are missing a significant number of links by not sourcing Ahrefs and Moz. If I can identify 80-90% of the bad links, I think that is sufficient. I don't expect to remove 100% of them.
Thanks again for your assistance!!
Alan
-
Hi there
Based on some previous work I have done, webmasters are substantially more responsive to manual outreach and can definitely tell the difference.
Always include:
- Their name, both in the subject line and the greeting (I like "Attn: (name) / Link Removal Request")
- Their site's domain name
- Links to pages with examples of your link
- Thanks for their time
- A signature with proper contact information
Always respond to emails - good, bad, or indifferent - people respond to a real human being. Thank them for removal, kindly respond to apprehension or irritability, and answer (within reason) questions they may have. Do not be hostile back. I would usually send three emails:
1. Stating my reason for reaching out and where my link is located.
2. If I didn't hear back, about four days later, I would follow up. Again letting them know where my link is located.
3. If I didn't hear back, about 3-5 days later, I would let them know that this would be my last email before disavowing their link. Usually, I didn't make it to the third email. Remember to document and keep records of your outreach in case you somehow get a manual action - you'll need them.
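That record-keeping can be as simple as appending each attempt to a CSV log. A minimal sketch, with a hypothetical schema (the field names are illustrative, not a standard):

```python
import csv
from datetime import date

# Hypothetical log schema for documenting removal outreach,
# useful as evidence if you ever face a manual action.
FIELDS = ["domain", "contact_email", "attempt", "date", "status"]

def log_attempt(path, domain, contact, attempt, status):
    """Append one outreach attempt to a CSV log, writing a header for a new file."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file: write the header row first
            writer.writeheader()
        writer.writerow({
            "domain": domain,
            "contact_email": contact,
            "attempt": attempt,
            "date": date.today().isoformat(),
            "status": status,
        })
```

One row per email sent gives you a dated trail of every request and response.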
Here is a great link removal resource:
Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
Always consider disavow files a tool and a friend - they do work. If you can't get links removed and you fear a manual action, they will be your next line of defense - especially if you are dealing with hundreds of bad links.
Take the time to manually reach out to webmasters if you can - it will pay off. I also want to suggest LinkRisk as another tool to look into for your link audits and outreach. It has been a big help for me.
Hope this helps! Good luck!