Alternative Link Detox tools?
-
My company is conducting a link detox for a client, and it seems like every tool we use gives us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have suggestions on which tools will give us an accurate count, and will also email the webmasters on our behalf requesting the links' removal? We are trying to make this process as automated as possible to save time on our end.
-
I just wanted to add to this discussion to say that I created a tool that helps me create really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list to one link from each domain, and marks the nofollows. It also tells you which links are from domains that are on my blacklist of domains that I almost always disavow. The blacklist contains over 14,000 domains at this point and is growing. And it tells you which links are from domains that I usually ignore, such as dmoz scrapers and domain stats pages, where we know the link is not one made for SEO purposes.
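The aggregate-then-classify workflow described above can be sketched in a few lines. This is only an illustration, not the actual tool: the blacklist and ignore-list domains here are made-up placeholders (the real lists are proprietary and run to 14,000+ domains), and the `audit_rows` helper is a name I've invented.

```python
from urllib.parse import urlparse

# Hypothetical stand-ins for the real (proprietary) lists.
BLACKLIST = {"spammy-directory.example", "paid-links.example"}
IGNORE = {"dmoz-scraper.example", "domain-stats.example"}

def audit_rows(links):
    """links: iterable of (url, is_nofollow) pairs aggregated from
    several backlink sources. Keeps one link per domain and tags it."""
    seen = set()
    rows = []
    for url, nofollow in links:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain in seen:
            continue            # keep one representative link per domain
        seen.add(domain)
        if domain in BLACKLIST:
            verdict = "disavow"
        elif domain in IGNORE:
            verdict = "ignore"  # e.g. dmoz scrapers, domain stats pages
        else:
            verdict = "review"  # a human still makes the final call
        rows.append({"url": url, "domain": domain,
                     "nofollow": nofollow, "verdict": verdict})
    return rows
```

The important design point is the last branch: anything not on a list lands in "review" for a human, rather than being auto-classified.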
I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad ones and miss a whole bunch of really spammy links. If you're trying to escape Penguin, you have to be way more accurate than that.
It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/
-
If you are looking specifically for link analysis tools then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
Agreed - it's not much fun, but every reputable link auditor I know uses multiple available sources. All of the tools (including our own at Moz) have different biases, and when you're trying to get as complete a list as possible, you need to use as many sources as you can.
I would highly recommend against going too automated - the cost "savings" short-term could be lost quickly if you start cutting potentially good links. It really depends on your current risk/reward profile. If you're already hit hard with a penalty, then cutting deep and fast may be a good bet (and automation would be more effective). If you're being proactive to prevent future issues, then relying too much on automation could be very dangerous.
-
Like they said, compile/export everything, combine it, remove duplicates, and feed it into the tool of your choice, like Link Risk, Link Detox, or even rmoov if you want to contact the webmasters.
Be sure to still check the list, since no tool is 100% right. Some good, natural links can end up classified as bad URLs by their calculations.
-
I agree with everything that Travis said… the reason you are seeing a different number of total links is the index each tool uses! GWT will give you a limited amount of data, whereas Open Site Explorer will show you a few more links (their index fetches every link that has been shared on Twitter), and the largest link indexes I know of are Ahrefs and Majestic SEO.
My advice would be to get the data from all sources, remove the duplicates, and then run Link Detox. Keep a very close eye on what Link Detox says are bad links, because no one other than Google knows exactly what a bad link is; everyone else is just using their own formula.
I am sure that if you upload the same link file to Link Risk, the results will differ from Link Detox's.
Just keep a close eye on the output and decide whether you want a particular link removed.
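For the links you do decide to disavow rather than chase removals for, the file Google's disavow tool accepts is plain text: one `domain:` entry or full URL per line, with `#` comment lines. A minimal sketch of building that file after manual review (the `disavow_file` helper name and example domains are mine, not from any tool mentioned here):

```python
def disavow_file(domains, urls=()):
    """Build the text of a Google disavow file: one 'domain:' line per
    bad domain, plus any individually disavowed URLs, deduplicated."""
    lines = ["# Disavow list generated after manual review"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"
```

A `domain:` line disavows every link from that domain, which is usually what you want once you've judged the whole domain spammy; individual URL lines are for one-off bad pages on otherwise fine sites.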
Planning to remove links? There is a tool that can help you with that www.rmoov.com just give it a try and remove the links that are bad in your eye!
Hope this helps!
-
The difference in the number of links you see across various sources comes down to the sources themselves. Each backlink service only crawls so much; even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO, and Ahrefs, then filter duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can use Cemper Link Detox to upload the data.
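The merge-and-dedupe step several answers describe can be sketched like this. The file names and column headers below are hypothetical placeholders; each tool's real CSV export uses its own headers, so check the actual files before mapping them.

```python
import csv

# Hypothetical export files and the column holding the linking URL in each.
URL_COLUMNS = {
    "gwt.csv": "Linking page",
    "majestic.csv": "SourceURL",
    "ahrefs.csv": "Referring Page URL",
}

def merged_urls(files=URL_COLUMNS):
    """Combine backlink URLs from several tool exports into one
    deduplicated, sorted list ready to upload to an audit tool."""
    urls = set()
    for path, column in files.items():
        with open(path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                url = (row.get(column) or "").strip()
                if url:
                    urls.add(url)
    return sorted(urls)
```

From here you would paste or upload the merged list into whichever analysis tool you've chosen.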
Be very careful, Link Detox still throws some false positives. Though I expect it to get better every day. There's a machine learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
Link Detox is the closest thing there is, that I'm aware of, that will help 'automate' the process in a safe-ish way. The subject of link removal/disavow is something so sensitive I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog writing work to competent people. Call me a control freak.