Questions About Link Detox
-
Greetings:
In April 2014 an SEO firm ran a link removal campaign for us (they identified spammy links and uploaded a disavow file). The campaign was ineffective: our Moz domain rank has fallen from about 30 to 24 over the last year, and traffic is 20% lower. I purchased a basic package for Link Detox and ran a report today (see enclosed) to see whether toxic links could be contributing to our mediocre rankings.
As a novice, I have a few questions for you about the use of Link Detox:
- We scored a domain-wide detox risk of 1,723. The site has 7,113 links from referring root domains. 121 links were classified as high audit priority and 56 as medium audit priority. 221 links were previously disavowed, and we uploaded a spreadsheet containing those previously disavowed links. We had Link Detox include nofollow links in the analysis, as they recommend. Is our score really bad? If we remove the questionable links, should we see some benefit in rankings?
- Some of the links we disavowed last year are still pointing to our site. Is it worthwhile to include those links again in our new disavow file?
- Before filing a disavow, we will ask each webmaster to remove the offending links. Link Detox offers a package called Superhero for $469.00 that automates the process. Does this package effectively handle the entire process of writing and tracking removal requests? Do you know of any other good alternatives?
- The Link Detox Superhero package includes a feature called "Boost" that is supposed to expedite Google's processing of the disavow file. Link Detox staff told me that with Boost, Google will process the disavow within a week. Do you have any idea whether this claim is valid? It would be great if it were true.
- We have never received a manual penalty from Google. Will uploading a disavow file help us under the circumstances?
Thanks for your feedback, I really appreciate it!
Alan
-
Hi there,
A lot of this sounds extremely odd to me. I have never heard of anyone being able to expedite a disavow file, and you shouldn't automate link removal requests; too much can go wrong when people reach out to you. Instead, try this:
- Collect your backlinks from:
  - Google Webmaster Tools
  - Majestic
  - Ahrefs
- Assess your data:
  - Good links
    - These links help your business
    - These links are relevant to your site
    - These links are relevant to the content they link to
    - These links don't have spammy or low metrics
  - Bad links
    - These links aren't relevant to your site or content
    - These links are part of template directories (example)
    - These links don't supply traffic to your site
    - These links tend to have low metrics
    - These links come from sites that look untrustworthy
  - Neutral links
    - Nofollow links
    - Disavowed links
    - Links that are no longer active
- Prioritize your bad links: which ones are you removing?
- Reach out to those webmasters three times, with four days between each message
- Create your disavow file (yes, they work), listing:
  - Links that were removed, in case they come back or a webmaster misses one
  - Links you couldn't get removed
  - Links whose webmasters requested payment
  - Links previously in your disavow file
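For reference, a disavow file is a plain UTF-8 text file with one entry per line: `domain:` entries disavow every link from a site, bare URLs disavow a single page, and lines starting with `#` are comments (a good place to note your removal-request history). A minimal sketch, with hypothetical domains:

```text
# Requested removal twice, four days apart; no response
domain:spammy-directory.example

# Webmaster asked for payment to remove the link
domain:paid-links.example

# Single bad page; the rest of the site is fine
http://mostly-fine.example/old-widget-page.html
```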
I have never used Link Detox, but from what you described, I'm skeptical. I have used LinkRisk and liked it a lot; you can learn more about it here.
Try this manually one time through: you could learn a lot about how this all works and what to look for. I also included other links in my answer to your other question that will help you.
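Even done manually, the collect-and-assess steps boil down to merging the exports from your backlink tools, deduplicating, and checking which links your existing disavow file already covers. A minimal Python sketch of that bookkeeping, assuming one-URL-per-line exports (real GWT/Majestic/Ahrefs exports are CSVs with extra columns) and hypothetical domains:

```python
from urllib.parse import urlparse

def root_domain(url):
    # Crude root-domain extraction; ignores multi-part TLDs like .co.uk
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def merge_backlinks(exports):
    # exports: {tool name: [urls]} -> {url: set of tools that reported it}
    seen = {}
    for tool, urls in exports.items():
        for url in urls:
            seen.setdefault(url, set()).add(tool)
    return seen

def load_disavowed(lines):
    # Split an existing disavow file into domain entries and single-URL entries
    domains, urls = set(), set()
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):])
        else:
            urls.add(line)
    return domains, urls

def already_disavowed(backlinks, domains, urls):
    # Flag backlinks already covered by the existing disavow file
    return {u for u in backlinks if u in urls or root_domain(u) in domains}

# Hypothetical data for illustration
exports = {
    "gwt": ["http://spam.example/page1", "http://www.good.example/post"],
    "ahrefs": ["http://spam.example/page1", "http://other.example/x"],
}
merged = merge_backlinks(exports)
domains, urls = load_disavowed(["# disavowed last year", "domain:spam.example"])
flagged = already_disavowed(merged, domains, urls)
# flagged now holds only the spam.example URL; the rest still need review
```

This directly answers the "should we re-include last year's disavowed links" question too: anything still live and still bad should stay in the new file, since the file you upload replaces the previous one.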
Hope all of this helps! Good luck! Let me know if you have any more questions or comments; I'd love to help.