Has anyone used this? www.linkdetox.com/
-
Any opinions about it?
-
I have found it to be quite useful. It's good for client pitches and for reviewing clients' backlink profiles before the meeting. It's also a useful tool for examining the backlink profiles of sites you are thinking about approaching for a link request or blog post.
-
I've tried it, and it's reasonable. It allows for pretty detailed scoping of backlinks and buckets them into good, suspicious, and toxic.
You could do this with other backlink tools and no Link Detox, but if you are an agency or a professional SEO handling a considerable number of sites, it is helpful.
It also allows you to export links in disavow format for the ones you have already contacted for removal and had no response from.
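For reference, the exported file follows Google's plain-text disavow format, one entry per line; the domains below are just placeholders:

```text
# Removal requested 2013-06-01; no response received.
domain:spammy-directory.example.com
http://blog.example.net/some-paid-links-page.html
```

A `domain:` line disavows every link from that host, while a bare URL disavows only links from that specific page.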
Hope this helps!
Todd
-
ahhhh... link didn't work....
-
Llanero,
Here's a link to another thread on the topic here in Q&A. Looks like people like it.
Related Questions
-
Whether to use a new domain or an old ecommerce site domain that has been incomplete for a long time.
Hello, we are starting a second store in our niche. Which of the following should I choose?
A. We have a site from a year and a half ago that we put content on but never actually added products. The category and article content needs to be completely rewritten, and we will rewrite it to be much better and up to date. We're also planning on adding products and rewriting the manufacturer descriptions.
B. We could use a new domain that is closer to an exact match for our main keyword. We'd just buy one for $15.
I don't know whether A or B would be the faster way to get the site going. I'm concerned that leaving a site half done for a year could cause an issue, but I really don't know. If you've got experience with this, please advise. Thank you.
White Hat / Black Hat SEO | BobGW -
The differences between XXX.domain.com and domain.com/XXX?
Hi guys, I would like to know which has better SEO value. For example, if I put a link at xxx.domain.com versus domain.com/XXX, which one will give me better SEO value, or is it the same? Assume that domain.com has a huge PageRank itself. Why do people bother making XXX.domain.com instead? Hoping for some clarification, thanks!
White Hat / Black Hat SEO | andzon -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that I need a solution which 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others get content and a 200; a rough sketch of the idea is below. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
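A minimal sketch of what I have in mind, written as WSGI middleware in Python; the bot list, load threshold, and shedding formula are purely illustrative:

```python
import os
import random

# User-agent fragments of the bots we want to throttle (illustrative list).
BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot")

def normalized_load() -> float:
    """1-minute load average divided by CPU count (Unix only)."""
    return os.getloadavg()[0] / (os.cpu_count() or 1)

class BotThrottleMiddleware:
    """Answers a load-dependent portion of bot requests with 503,
    leaving user traffic untouched."""

    def __init__(self, app, threshold=0.8):
        self.app = app
        self.threshold = threshold  # normalized load above which we shed bots

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if any(sig in ua for sig in BOT_SIGNATURES):
            load = normalized_load()
            if load > self.threshold:
                # Shed a fraction of bot requests that grows with the overload.
                shed = min(1.0, (load - self.threshold) / self.threshold)
                if random.random() < shed:
                    start_response("503 Service Unavailable",
                                   [("Retry-After", "120"),
                                    ("Content-Type", "text/plain")])
                    return [b"Server busy; please retry later."]
        return self.app(environ, start_response)
```

Wrapping the existing app is one line: `application = BotThrottleMiddleware(application)`. A `Retry-After` header is included because well-behaved crawlers use it as a back-off hint.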
White Hat / Black Hat SEO | internetwerkNU -
How/why is this page allowed to get away with this?
I was doing some research on a competitor's backlinks in Open Site Explorer and noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited that page and found that this page, carrying a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page - 618 links, zero nofollow tags, PR7. On top of that, there's a link in the top right corner that says "Want to Join?", which shows the requirements to get your link on that page. One of these is to create a reciprocal link from your site back to theirs. I'm one of those white-hat SEOs who actually listens to Matt Cutts and the more recent stuff from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and that if you're gonna do it, use a nofollow tag. I've read that pages, or directories, such as these are being penalized by Google, and possibly the websites linking to the page could be penalized as well. I've read that websites exactly like this one have been getting deindexed in bunches over the past couple of years. My real question is: how is this page allowed to get away with this? And how is it rewarded with such high PageRank? There's zero content aside from 618 followed links. Is this just a case of "Google just hasn't gotten around to finding and penalizing this site yet," or am I just naive enough to actually listen to and believe everything that comes out of Matt Cutts' videos?
White Hat / Black Hat SEO | Millermore -
What is the difference between using .htaccess file and httpd.conf in implementing thousands of 301 redirections?
What is the best solution in terms of website loading time or server load? Thanks in advance!
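Not a definitive answer, but the core difference: .htaccess files are read and re-parsed on every single request, while httpd.conf is parsed once at server startup, so thousands of rules in .htaccess add work to every hit. If you have access to the server config, a RewriteMap usually beats listing thousands of individual rules in either file. A minimal sketch, assuming Apache with mod_rewrite and an illustrative map file path:

```apache
# httpd.conf / vhost context -- RewriteMap cannot be declared in .htaccess.
RewriteEngine On
# One lookup table instead of thousands of individual Redirect lines.
RewriteMap redirects "txt:/etc/apache2/redirects.map"
# Redirect only when the requested path has an entry in the map.
RewriteCond ${redirects:$1} ^.+$
RewriteRule ^/(.+)$ ${redirects:$1} [R=301,L]
```

The map file pairs old paths with destination URLs, one per line (e.g. `old-page.html https://www.example.com/new-page/`); for very large maps, a compiled `dbm:` map keeps lookups fast.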
White Hat / Black Hat SEO | esiow2013 -
Has anyone been able to recover a site that was slapped by Panda?
I have a client where the only issue I can determine is over-optimization of a couple of anchor terms, which the client no longer ranks for. I tried mixing it up with the brand name, brandname.com, and a diversity of links, but nothing seems to budge. Anyone have a similar problem?
White Hat / Black Hat SEO | foreignhaus -
Does anyone know of a good link building case study? A B2B focus would be a plus
Looking for a solid analysis of a white-hat campaign that showed tangible results (if one exists).
White Hat / Black Hat SEO | RiseSEO