Alternative Link Detox tools?
-
My company is conducting a link detox for a client, and it seems like every tool we use is giving us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have any suggestions as to which tools will give us an accurate count, and will also email the webmasters on your behalf requesting the links' removal? We are trying to make this process as automated as possible to save time on our end.
-
I just wanted to add to this discussion to say that I created a tool that helps me create really good spreadsheets for link auditing. It aggregates links from a number of sources, reduces the list down to one link from each domain, and marks the nofollows. It also tells you which links are from domains that are on my blacklist of domains that I almost always disavow; the blacklist contains over 14,000 domains at this point and is growing. And it tells you which links are from domains that I usually ignore, such as DMOZ scrapers and domain stats pages, where we know the link is not one made for SEO purposes.
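For anyone curious what that kind of aggregation looks like under the hood, here is a minimal Python sketch of the general idea (not the tool itself). The "source_url" and "nofollow" columns and the blacklist file are illustrative assumptions, not any tool's real export format.

```python
import csv
from urllib.parse import urlparse

# Hypothetical inputs: CSV exports from several backlink tools, each assumed
# to have "source_url" and "nofollow" columns, plus a plain-text blacklist
# with one domain per line. Column names are illustrative only.
EXPORTS = ["ahrefs.csv", "majestic.csv", "gwt.csv"]
BLACKLIST_FILE = "blacklist.txt"

def root_domain(url):
    """Crude domain extraction: take the host and strip a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

with open(BLACKLIST_FILE) as f:
    blacklist = {line.strip().lower() for line in f if line.strip()}

seen = {}  # keep one representative link per linking domain
for path in EXPORTS:
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            domain = root_domain(row["source_url"])
            if domain and domain not in seen:
                seen[domain] = {
                    "domain": domain,
                    "sample_link": row["source_url"],
                    "nofollow": row.get("nofollow", "").lower() in ("true", "1", "yes"),
                    "on_blacklist": domain in blacklist,
                }

# Write the deduplicated audit sheet for manual review.
with open("audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["domain", "sample_link", "nofollow", "on_blacklist"])
    writer.writeheader()
    writer.writerows(seen.values())
```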
I'm not a fan of tools that automate the decision-making process, because I've seen so many of them mark fantastic links as bad ones and miss a whole bunch of really spammy links. If you're trying to escape Penguin, you have to be way more accurate than that.
It's still in a beta phase right now as I am working on making it as useful as possible, but you can see the details here: http://www.hiswebmarketing.com/manual-link-audits/
-
If you are looking specifically for link analysis tools, then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
Agreed - it's not much fun, but every reputable link auditor I know uses multiple available sources. All of the tools (including our own at Moz) have different biases, and when you're trying to get as complete a list as possible, you need to use as many sources as you can.
I would highly recommend against going too automated - the cost "savings" short-term could be lost quickly if you start cutting potentially good links. It really depends on your current risk/reward profile. If you're already hit hard with a penalty, then cutting deep and fast may be a good bet (and automation would be more effective). If you're being proactive to prevent future issues, then relying too much on automation could be very dangerous.
-
Like they said, compile/export everything, combine the lists and remove duplicates, then import them into the tool of your choice, like Link Risk, Link Detox, or even Rmoov if you want to contact these webmasters.
Be sure to still check the list yourself, since these tools are never 100% right. Some good, natural links can end up classified as bad URLs by their calculations.
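As a rough illustration of that compile-and-dedupe step, here is a minimal Python sketch; the normalization rules and the example URL lists are assumptions for illustration, not any particular tool's export format.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Normalize a backlink URL so the same page exported by different
    tools collapses to one entry (scheme, case, trailing slash)."""
    parts = urlsplit(url.strip())
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, parts.query, ""))

def combine(*link_lists):
    """Merge exports from several tools and drop duplicate URLs."""
    return sorted({normalize(u) for links in link_lists for u in links})

# Example: lists pulled from hypothetical GWT / Ahrefs / Majestic exports.
merged = combine(
    ["http://www.example.com/page/", "https://example.com/page"],
    ["https://another-site.org/post?id=3"],
)
print(merged)  # the two example.com variants collapse to one entry
```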
-
I agree with everything that Travis said… the reason you are seeing a different number of total links is the index each tool uses! GWT will give you a limited amount of data, whereas Open Site Explorer will show you a few more links (their index fetches every link that has been shared on Twitter), and the largest link indexes I know of are Ahrefs and Majestic SEO.
My advice would be to get the data from all sources, remove the duplicates, and then run Link Detox. Keep a very close eye on what Link Detox says are bad links, because no one other than Google knows exactly what a bad link is, so everyone else is just using their own formula.
I am sure that if you upload the same link file to Link Risk, the results will be different from Link Detox's.
Just keep a close eye on the results and decide whether you want a particular link to be removed.
Planning to remove links? There is a tool that can help you with that: www.rmoov.com. Just give it a try and remove the links that are bad in your eyes!
Hope this helps!
-
The difference between the number of links you see across various sources comes down to the sources themselves. Some backlink services only crawl so much, and even Google can only crawl so much of the internet.
Your best bet is to use multiple sources. I would go with GWT, Majestic SEO, and Ahrefs, then filter out duplicates. You'll have a much better understanding of where the site stands. Once you have that, you can upload the data to Cemper's Link Detox.
Be very careful: Link Detox still throws some false positives, though I expect it to get better every day. There's a machine-learning element to it that's based on human feedback.
Finally, I would be very careful of fully automating anything like a disavow/removal process. Do you really want something so delicate taken out of your hands? It's still very necessary to manually check each link so you know that you're getting rid of the bad and keeping the good.
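If you keep the review manual, the last step of turning your decisions into a disavow file is easy to script. Here is a minimal Python sketch, assuming a hypothetical reviewed spreadsheet with "domain" and "decision" columns; the "#" comments and "domain:" lines follow Google's disavow file format.

```python
import csv

# A minimal sketch: read a hypothetical, manually reviewed audit sheet with
# "domain" and "decision" columns and write the domains marked "disavow"
# in Google's disavow file format ("#" comments, "domain:" lines).
def build_disavow(audit_csv, out_path="disavow.txt"):
    with open(audit_csv, newline="") as f:
        flagged = sorted(
            row["domain"].strip()
            for row in csv.DictReader(f)
            if row["decision"].strip().lower() == "disavow"
        )
    with open(out_path, "w") as out:
        out.write("# Generated from a manual link audit - every domain below was reviewed by hand\n")
        for domain in flagged:
            out.write(f"domain:{domain}\n")

build_disavow("audit.csv")
```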
Link Detox is the closest thing I'm aware of that will help 'automate' the process in a safe-ish way. The subject of link removal/disavowal is so sensitive that I wouldn't outsource it. Then again, I hate the idea of outsourcing overflow blog writing work even to competent people. Call me a control freak.