Alternative Link Detox tools?
-
My company is conducting a link detox for a client, and every tool we use gives us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have suggestions for tools that will give us an accurate count, and that will also email the webmasters on our behalf requesting removal of the links? We are trying to make this process as automated as possible to save time on our end.
-
I just wanted to add to this discussion to say that I created a tool that helps me build really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list down to one link from each domain, and marks the nofollows. It also flags links from domains on my blacklist of domains that I almost always disavow. The blacklist contains over 14,000 domains at this point and is growing. And it flags links from domains that I usually ignore, such as DMOZ scrapers and domain-stats pages, where we know the link was not made for SEO purposes.
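If you're curious what that aggregation step looks like under the hood, here's a rough Python sketch of that kind of workflow. To be clear, this is not the tool itself; the file names, column headers, and blacklist format below are placeholders you'd adjust to match your own exports:

```python
import csv
from urllib.parse import urlparse

# Assumed inputs: CSV exports from each link source, each with at least a
# "url" column and a "nofollow" column, plus a plain-text blacklist with one
# domain per line. All names here are placeholders, not the tool's real files.
EXPORTS = ["gwt.csv", "ose.csv", "majestic.csv", "ahrefs.csv"]
BLACKLIST_FILE = "blacklist.txt"

def domain(url):
    """Reduce a URL to its bare host, stripping any leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

with open(BLACKLIST_FILE) as f:
    blacklist = {line.strip().lower() for line in f if line.strip()}

seen = {}  # keep one representative link per linking domain
for path in EXPORTS:
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            d = domain(row["url"])
            if d and d not in seen:
                seen[d] = row

with open("audit.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["domain", "url", "nofollow", "blacklisted"])
    for d in sorted(seen):
        row = seen[d]
        writer.writerow([d, row["url"], row.get("nofollow", ""),
                         "yes" if d in blacklist else ""])
```

The mechanical part is easy; the real work is in what goes on the blacklist and in reviewing what's left.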
I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad and miss a whole bunch of really spammy ones. If you're trying to escape Penguin, you have to be far more accurate than that.
It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/
-
If you are looking specifically for link analysis tools, then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
Agreed - it's not much fun, but every reputable link auditor I know uses multiple sources. All of the tools (including our own at Moz) have different biases, and when you're trying to get as complete a list as possible, you need to use as many sources as you can.
I would highly recommend against going too automated - the short-term cost "savings" could be lost quickly if you start cutting potentially good links. It really depends on your current risk/reward profile. If you've already been hit hard with a penalty, then cutting deep and fast may be a good bet (and automation would be more effective). If you're being proactive to prevent future issues, then relying too heavily on automation could be very dangerous.
-
Like they said, export everything, combine it, remove the duplicates, and feed it into the tool of your choice: Link Risk, Link Detox, or even Rmoov if you want to contact the webmasters.
Be sure to still check the list yourself, since these tools are never 100% right; their calculations can classify good, natural links as bad URLs.
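If you want to script part of that sanity check, here's a minimal standard-library sketch that fetches a linking page and reports whether your link is still there and whether it's nofollowed. It assumes plain HTML pages and a simple substring match on your domain, so treat it as a starting point, not a verdict:

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class AnchorCollector(HTMLParser):
    """Collects (href, rel) pairs for every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            self.anchors.append((a.get("href") or "", a.get("rel") or ""))

def check_link(linking_page, your_domain):
    """Return 'missing', 'nofollow', or 'followed' for links to your_domain."""
    req = Request(linking_page, headers={"User-Agent": "link-audit-check"})
    html = urlopen(req, timeout=15).read().decode("utf-8", errors="replace")
    parser = AnchorCollector()
    parser.feed(html)
    status = "missing"
    for href, rel in parser.anchors:
        if your_domain in href:
            if "nofollow" in rel.lower():
                status = "nofollow"
            else:
                return "followed"  # one followed link is enough to count it
    return status

# Example: check_link("http://example.com/resources/", "yourclient.com")
```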
-
I agree with everything that Travis said… The reason you are seeing different total link counts is the index each tool uses! GWT gives you a limited amount of data, Open Site Explorer shows a bit more (its index fetches every link that has been shared on Twitter), and the largest link indexes I know of are Ahrefs and Majestic SEO.
My advice would be to get the data from all sources, remove the duplicates, and then run Link Detox. Keep a very close eye on what Link Detox flags as bad links, because no one other than Google knows exactly what a bad link is; everyone else is just using their own formula.
I am sure that if you feed the same link file into Link Risk, the results will differ from Link Detox's.
Just keep a close eye on things and decide for yourself whether you want a particular link removed.
Planning to remove links? There is a tool that can help you with that: www.rmoov.com. Give it a try and remove the links that are bad in your eyes!
Hope this helps!
-
The difference in the number of links you see across various sources comes down to the sources themselves. Each backlink service only crawls so much; even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO, and Ahrefs, then filter out duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can upload the data to Cemper's Link Detox.
Be very careful, Link Detox still throws some false positives. Though I expect it to get better every day. There's a machine learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
Link Detox is the closest thing I'm aware of that will help 'automate' the process in a safe-ish way. The subject of link removal/disavowal is so sensitive that I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog-writing work to competent people. Call me a control freak.
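That said, the one step I'm happy to script is the purely mechanical last mile: turning a hand-reviewed audit sheet into a disavow file. Here's a minimal sketch, assuming a CSV with "domain" and "verdict" columns that you filled in yourself (those column names are my own convention, not anything a tool produces):

```python
import csv

# Assumed input: your hand-reviewed audit CSV with "domain" and "verdict"
# columns, where you've marked bad domains as "disavow" yourself. The output
# follows Google's disavow file format: "#" comment lines plus one
# "domain:example.com" entry per domain you want disavowed wholesale.
with open("audit_reviewed.csv", newline="") as f, \
        open("disavow.txt", "w") as out:
    out.write("# Generated from a manually reviewed link audit\n")
    for row in csv.DictReader(f):
        if (row.get("verdict") or "").strip().lower() == "disavow":
            out.write("domain:%s\n" % row["domain"])
```

The judgment calls stay in the spreadsheet where they belong; the script just formats what you already decided.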