Alternative Link Detox tools?
-
My company is conducting a link detox for a client, and it seems like every tool we use gives us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have suggestions for tools that will give us an accurate count and will also email the webmasters on our behalf requesting link removal? We are trying to make this process as automated as possible to save time on our end.
-
I just wanted to add to this discussion to say that I created a tool that helps me build really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list down to one link from each domain, and marks the nofollows. It also tells you which links are from domains on my blacklist of domains that I almost always disavow; the blacklist contains over 14,000 domains at this point and is growing. And it tells you which links are from domains that I usually ignore, such as DMOZ scrapers and domain-stats pages, where we know the link was not made for SEO purposes.
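If you want to roll your own rough version of that aggregation step, a minimal sketch in Python is below. To be clear, this is not my actual tool - the export file names, column headings, and blacklist.txt are placeholder assumptions, and real exports will need their columns mapped first:

```python
# A simplified sketch of the aggregation step described above -- not the real tool.
# File names, column headings, and blacklist.txt are placeholder assumptions.
import csv
from urllib.parse import urlparse

EXPORTS = ["ose_links.csv", "ahrefs_links.csv", "majestic_links.csv"]  # hypothetical exports

# Hypothetical blacklist file: one domain per line.
with open("blacklist.txt") as f:
    blacklist = {line.strip().lower() for line in f if line.strip()}

rows_by_domain = {}
for path in EXPORTS:
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            url = row.get("source_url") or row.get("URL")  # column names vary by tool
            if not url:
                continue
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            if domain in rows_by_domain:
                continue  # keep only one link from each domain
            rows_by_domain[domain] = {
                "domain": domain,
                "url": url,
                "nofollow": "nofollow" in (row.get("rel") or "").lower(),
                "blacklisted": domain in blacklist,
            }

with open("audit_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["domain", "url", "nofollow", "blacklisted"])
    writer.writeheader()
    writer.writerows(rows_by_domain.values())
print(f"{len(rows_by_domain)} domains written to audit_sheet.csv")
```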
I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad ones and miss a whole bunch of really spammy links. If you're trying to escape Penguin, you have to be far more accurate than that.
It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/
-
If you are looking specifically for link analysis tools, then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
Agreed - it's not much fun, but every reputable link auditor I know uses multiple available sources. All of the tools (including our own at Moz) have different biases, and when you're trying to get as complete a list as possible, you need to use as many sources as you can.
I would highly recommend against going too automated - the short-term cost "savings" could be lost quickly if you start cutting potentially good links. It really depends on your current risk/reward profile. If you're already hit hard with a penalty, then cutting deep and fast may be a good bet (and automation would be more effective). If you're being proactive to prevent future issues, then relying too much on automation could be very dangerous.
-
Like they said, compile/export everything, combine it, remove duplicates, and import the result into the tool of your choice - Link Risk, Link Detox, or even Rmoov if you want to contact the webmasters.
Be sure to still check the list yourself, since it's never 100% right. Some good, natural links can end up classified as bad URLs by their calculations.
-
I agree with everything that Travis said… the reason you are seeing a different number of total links is the index each tool uses! GWT will give you a limited amount of data, whereas Open Site Explorer will show you a few more links (their index fetches every link that has been shared on Twitter), and the largest link indexes I know of are Ahrefs and Majestic SEO.
My advice would be to get the data from all sources, remove the duplicates, and then run Link Detox. Keep a very close eye on what Link Detox says are bad links, because no one other than Google knows exactly what a bad link is, so everyone else is just using their own formula.
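For the combine-and-dedupe step, something as simple as the sketch below will do - this assumes you have saved each tool's export as a plain text file with one linking URL per line, and the file names are just placeholders:

```python
# Merge link exports from several sources and keep only the unique URLs.
# File names are placeholders; any one-URL-per-line export will work.
unique_urls = set()
for path in ["gwt_links.txt", "ose_links.txt", "ahrefs_links.txt", "majestic_links.txt"]:
    with open(path) as f:
        unique_urls.update(line.strip() for line in f if line.strip())

with open("combined_links.txt", "w") as f:
    f.write("\n".join(sorted(unique_urls)) + "\n")

print(f"{len(unique_urls)} unique links ready to upload")
```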
I am sure that if you upload the same link file to Link Risk, the results will be different from Link Detox's.
Just keep a close eye and decide if you want a particular link to be removed.
Planning to remove links? There is a tool that can help you with that: www.rmoov.com. Just give it a try and remove the links that look bad in your eyes!
Hope this helps!
-
The difference in the number of links you see across various sources comes down to the sources themselves. Each backlink service only crawls so much, and even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO and Ahrefs, then filter duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can upload the data to Cemper's Link Detox.
Be very careful: Link Detox still throws some false positives, though I expect it to get better every day - there's a machine-learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
Link Detox is the closest thing I'm aware of that will help 'automate' the process in a safe-ish way. The subject of link removal/disavowal is so sensitive that I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog writing work to competent people. Call me a control freak.
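For what it's worth, once the manual review is done, producing the disavow file itself is the easy part. Here's a minimal sketch, assuming you've saved the domains you decided to disavow in a hypothetical bad_domains.txt (one domain per line) - nothing here automates the judgment calls:

```python
# Turn a manually reviewed list of bad domains into a Google disavow file.
# bad_domains.txt is a hypothetical output of the manual review step.
with open("bad_domains.txt") as f:
    bad_domains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w") as f:
    f.write("# Domains disavowed after manual review\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")

print(f"disavow.txt written with {len(bad_domains)} domains")
```

The domain: prefix tells Google to disavow every link from that domain rather than a single URL, which is usually what you want for the kind of sitewide spam that ends up on these lists.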
Related Questions
-
Is a page with links to all posts okay?
Hi folks. Instead of an archive page template in my theme (I have my reasons), I am thinking of simply typing the post title as and when I publish a post, and linking to the post from there. Any SEO issues that you can think of? Thanks in advance!
Intermediate & Advanced SEO | Nobody1616542228134
-
Tool for user intent
Hello, Is there a tool that can tell me what the user intent of my keyword is and how I should present my page (the type of content users want to see, what questions they want answered)? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Do I have too many internal links, which is diluting link juice to less important pages?
Hello Mozzers, I was looking at my homepage and subsequent category landing pages on my eCommerce site and wondered whether I have too many internal links, which could in effect be diluting link juice away from the pages I need it to flow to. My homepage has 266 links, of which 114 (43%) are duplicate links, which seems a bit too much to me. One of my major competitors, a national company, has just launched a new site design, and they only show popular categories on their home page, although all categories are accessible from the menu navigation. They only have 123 links on their home page. I am wondering whether, if I stopped showing every category on my homepage (some of them don't really generate any sales) and only concentrated on the popular ones, like my competitor does, the link juice flowing down through the site would be more concentrated because there would be fewer links for it to flow through... Is that basically how it works? Are there any negatives with regard to duplicate links on either the home or category landing pages? We are showing the categories both as visual boxes to select and as selectable links on the left of the page. Just wondered how duplicate links would be treated? Any thoughts greatly appreciated, thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
Using disavow tool for 404s
Hey Community, Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you name it sites linking to our old URL structure (which used underscores and ended in .jsp). It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages so it's returning 404 errors. If I disavow these domains and/or links will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question. Feel free to ask any questions that may help you understand the issue more. Thanks for your help,
-Reed
Intermediate & Advanced SEO | IceIcebaby
-
How to detect a bad link and remove it?
As per Google Penguin, all the low-quality backlinks are going to affect the website's SERPs hugely, so we need to find all the bad backlinks and then remove them one by one. What I would like to know is: what tool do you use to find all the bad backlinks? And how do we know which is a bad backlink, or a bad website where our link should not be? And what service do you suggest for backlink removal? I contacted LinkDelete.com and they quoted me $97 for a month to remove all links in less than 3 weeks. Let me know what you suggest.
Intermediate & Advanced SEO | monali123
-
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic...I'm concerned about overall domain authority of the site since that certainly plays a role in how the site ranks overall in Google (especially pages with no links pointing to them...perfect example is Amazon...thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak
-
Cross linking between categories
Is it useful for SEO to cross-link between top-level categories? Let's say I have a home page and then 2 sub-categories, one about green widgets and one about red widgets. Should I create a link from the green widget to the red widget, or should I leave those as separate silos? I know that within a silo I need to cross-link (from green widget 1 to green widget 2, etc.), but what about from one main category to the other main category?
Intermediate & Advanced SEO | seoanalytics
-
Link Age as SEO factor?
Hi Guys, I have a client who ranks well within a competitive sector of the travel industry. They are planning a CMS move which will involve changing from .cfm to .aspx. We will be doing the standard redirects etc. However, Matt's statement here on 301 redirects got me thinking: http://www.youtube.com/watch?v=zW5UL3lzBOA&t=0m24s He says that basically you lose a bit of PageRank when you do a 301 redirect. Now, we will potentially be redirecting 1000s of links, and my thinking is 'a lot of a little adds up to a lot'. In other words, 1000s of redirects may have a big enough impact to lose some rankings in a very competitive and aggressive space. So I recommended that we contact the sites whose links have the highest value and ask them to manually change the links from .cfm to .aspx. This would mean there is no loss of value as with a 301 redirect. But now I have another dilemma which I'm unsure about. So the main question: Is link age a factor in rankings? If I update any links, this will make the link new to Google, so if link age is a factor, would this also lessen the value passed initially?
Intermediate & Advanced SEO | VividLime