Alternative Link Detox tools?
-
My company is conducting a link detox for a client, and it seems like every tool we use gives a different answer on how many links we actually have; the numbers range anywhere from 4,000 to 200,000. Does anyone have suggestions for tools that will give us an accurate count and will also email webmasters on our behalf requesting the links' removal? We are trying to make this process as automated as possible to save time on our end.
-
I just wanted to add to this discussion to say that I created a tool that helps me build really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list to one link per domain, and marks the nofollows. It also tells you which links are from domains on my blacklist of domains that I almost always disavow; the blacklist contains over 14,000 domains at this point and is growing. And it tells you which links are from domains I usually ignore, such as DMOZ scrapers and domain-stats pages, where we know the link was not made for SEO purposes.
I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad and miss a whole bunch of really spammy links. If you're trying to escape Penguin, you have to be far more accurate than that.
It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/
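If you'd rather script a rough version of that aggregation step yourself, here's a minimal sketch in Python. The blacklist entries and the domain handling are illustrative only, not how the tool described above actually works:

```python
from urllib.parse import urlparse

# Hypothetical blacklist entries, for illustration only
BLACKLIST = {"spammy-directory.example", "scraper-stats.example"}

def root_domain(url):
    # Naive normalization: strips a leading "www." only. A real audit
    # should use a public-suffix list to handle domains like .co.uk.
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def aggregate(urls):
    # Keep the first link seen from each domain and flag blacklist hits.
    report = {}
    for url in urls:
        domain = root_domain(url)
        if domain not in report:
            report[domain] = {"url": url, "blacklisted": domain in BLACKLIST}
    return report

links = [
    "http://www.spammy-directory.example/links.html",
    "http://spammy-directory.example/page2.html",
    "https://goodblog.example/review/",
]
report = aggregate(links)
```

The point of reducing to one link per domain is that removal and disavow decisions are almost always made at the domain level anyway, so a 200,000-row export often collapses to a few thousand rows you can actually review.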
-
If you are looking specifically for link analysis tools then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
Agreed: it's not much fun, but every reputable link auditor I know uses multiple sources. All of the tools (including our own at Moz) have different biases, and when you're trying to get as complete a list as possible, you need to use as many sources as you can.
I would highly recommend against going too automated: the short-term cost "savings" could be lost quickly if you start cutting potentially good links. It really depends on your current risk/reward profile. If you've already been hit hard with a penalty, then cutting deep and fast may be a good bet (and automation would be more effective). If you're being proactive to prevent future issues, then relying too heavily on automation could be very dangerous.
-
As others have said: compile/export everything, combine the lists, remove duplicates, and import the result into the tool of your choice, such as Link Risk, Link Detox, or even Rmoov if you want to contact the webmasters.
Be sure to still check the list yourself, since no tool is 100% right; some good, natural links can end up classified as bad URLs.
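The combine-and-dedupe step is easy to script before you upload anything. Here's a rough Python sketch; the file names and CSV column names are placeholders for whatever your actual exports use:

```python
import csv

def normalize(url):
    # Trim whitespace and a trailing slash so the same page exported
    # by two different tools doesn't count twice.
    return url.strip().rstrip("/")

def merge_url_lists(*lists):
    # Combine exports from several tools into one de-duplicated list,
    # preserving first-seen order (dicts keep insertion order in 3.7+).
    merged = {}
    for urls in lists:
        for url in urls:
            merged.setdefault(normalize(url), None)
    return list(merged)

def read_export(path, url_column):
    # Every tool names its URL column differently, so pass it in,
    # e.g. read_export("ahrefs.csv", "Referring Page URL").
    with open(path, newline="", encoding="utf-8") as f:
        return [row[url_column] for row in csv.DictReader(f)]

merged = merge_url_lists(
    ["http://a.example/", "http://b.example"],
    ["http://a.example", "http://c.example"],
)
```

Normalizing before deduping matters more than it looks: trailing slashes and stray whitespace are the most common reasons the "same" link survives a naive dedupe.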
-
I agree with everything that Travis said. The reason you are seeing different total link counts is the index each tool uses. GWT gives you a limited amount of data, whereas Open Site Explorer will show you a few more links (its index fetches every link that has been shared on Twitter), and the largest link indexes I know of are Ahrefs and Majestic SEO.
My advice would be to get the data from all sources, remove the duplicates, and then run Link Detox. Keep a very close eye on what Link Detox flags as bad links, because no one other than Google knows exactly what a bad link is; everyone else is just using their own formula.
I am sure that if you upload the same link file to Link Risk, the results will differ from Link Detox's.
Just keep a close eye on things and decide for yourself whether you want a particular link removed.
Planning to remove links? There is a tool that can help you with that: www.rmoov.com. Give it a try and remove the links that look bad in your eyes!
Hope this helps!
-
The difference in the number of links you see across sources comes down to the sources themselves: each backlink service only crawls so much, and even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO, and Ahrefs, then filter out duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can upload the data to Cemper's Link Detox.
Be very careful, Link Detox still throws some false positives. Though I expect it to get better every day. There's a machine learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
Link Detox is the closest thing I'm aware of that will help 'automate' the process in a safe-ish way. Link removal/disavowal is so sensitive a subject that I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog-writing work even to competent people. Call me a control freak.
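Once you've manually reviewed the list, producing the disavow file itself is the easy part. Google's disavow format is plain text with one entry per line: `domain:example.com` to disavow a whole domain, a bare URL for a single page, and `#` for comment lines. A minimal sketch (the domains and URL here are made-up examples):

```python
def build_disavow(domains, urls=()):
    # Google's disavow file format: plain text, one entry per line;
    # "domain:example.com" disavows a whole domain, a bare URL disavows
    # a single page, and lines starting with "#" are comments.
    lines = ["# Generated after manual review; entries below are examples"]
    lines += ["domain:" + d for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

disavow_text = build_disavow(
    {"spammy-directory.example", "scraper-stats.example"},
    ["http://mixed-quality-site.example/paid-links.html"],
)
```

Keeping the generation step this dumb is deliberate: the judgment lives in the manually reviewed domain list, and the script just serializes it.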