Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Is removing poor domain authority backlinks worth it?
-
Hey Moz,
I am working with a client on more advanced SEO tactics. This client has a reputable domain authority of 67 and 50,000+ backlinks.
We want to continue our SEO efforts and stay on top of any bad backlinks that may arise.
Would it be worth asking websites below a domain authority of 20 to remove our links, and then using the disavow tool for those that don't respond?
Is this a common practice for continued advanced SEO efforts? Also, what would your domain authority benchmark be? I used 20 just as an example.
Thanks so much for your help.
Cole
-
Awesome responses, guys. Anyone else have any other insight?
-
I updated my response while you were writing yours.
I don't doubt your insight. But The Googles doesn't sleep.
When you're running a local campaign with strictly above-board links, you should move as fast as possible.
-
That would be bad.
You should follow the rough 10-80-10 rule, whether you are building 10 links or 10,000 links. And you should always do it slowly.
I agree there are no specific percentages. You have to look at the big picture over a long period of time.
-
Let's say someone reads this and decides to get their first 10% in the crappy category. That would not be good for them. Further, there aren't any specific percentages that I'm aware of.
Yes, The Googles does have to pick the best of the worst. I don't doubt that.
Yes, sometimes you inherit a mess and it seems to work anyway. But manual reviews happen.
-
Big picture: What a good "problem" to have!
Without taking a close look at your specific URL...
...my first instinct is that the answer to your question is almost certainly a giant...
**No. DO THE HARD THING: NOTHING!** There is a real danger of overthinking this stuff and neglecting the fundamentals.
I faced the same issue with a DA 72 site for a leading SME in his field who had 450,000+ backlinks... some from major media outlets and universities, but most from "nobodies" in the field. This is good!
What you want is a classic inverted U-shaped curve in terms of DA:
- 10% crappy links
- 80% middling links
- 10% super-high-quality links
You mess with this at your peril! Beware: "bad" links are not necessarily bad in the grand scheme of the universe. Every credible and authoritative site should have some. They are part of a natural link profile.
Getting rid of the sub-20 DA links could hurt... badly.
Focusing excessively on tweaking or sculpting the middling 80% of your links is probably a mistake. You could shoot yourself in the foot.
Less is more.
It might be better to just keep doing what you're doing.
This is hard...and requires great discipline!
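As a rough sanity check on that shape, here's a minimal Python sketch (my own illustration, not a Moz tool) that buckets a link list by DA and reports the distribution. The cut-offs of 20 and 50 for "crappy" and "super high quality" are assumptions for illustration, not fixed rules.

```python
# Hypothetical sketch: see how a backlink profile maps onto the rough
# 10-80-10 shape described above. DA thresholds (20, 50) are assumptions.

def profile_distribution(links, low_cutoff=20, high_cutoff=50):
    """links: list of (domain, domain_authority) tuples from any export."""
    total = len(links)
    low = sum(1 for _, da in links if da < low_cutoff)
    high = sum(1 for _, da in links if da >= high_cutoff)
    mid = total - low - high
    return {
        "crappy_pct": round(100 * low / total, 1),
        "middling_pct": round(100 * mid / total, 1),
        "high_quality_pct": round(100 * high / total, 1),
    }

# Hypothetical example domains and DA values:
links = [("blogspam.example", 8), ("nichesite.example", 35),
         ("industrymag.example", 44), ("localnews.example", 31),
         ("university.example", 78)]
print(profile_distribution(links))
```

If the "crappy" bucket is already near 10%, that's another argument for doing nothing.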
-
-
Happy to be contrary. Another good thing about Link Detox is that the service has been trained - mostly for the good - by users manually reviewing the quality of their links. If easylinkseodirectory4u.com has been flagged enough, it's more likely to get caught by the machine.
Once you have uploaded your list and reviewed the links, you will get a pretty accurate risk rating, scaled from low to high. I don't think Link Detox has ever given me a false Toxic rating on an individual link either.
I'm not a client scalper, so if you would like to PM the domain name, I can take a look.
-
Excellent, quality response. Thanks so much.
I would love to hear from any disavow experts, and maybe even what they cost (of course, I don't want to break any Moz rules that may be applicable).
Cole
-
Setting a DA cut-off from the outset is a bit too arbitrary. What if it's a link from a site with low DA and a low PA now, but later the site becomes the next New York Times? You don't want to disavow the next New York Times, but that's what an arbitrary number would have you do.
Further, DA and PA can be gamed to a certain extent. I'm sure Rap Genius has a pretty solid DA, but they were penalized all the same. So it would appear that using DA as a cut-off would be less than ideal.
There's no real easy way to do a disavow. You have to think about characteristics, context and intent. If you have links that pass juice, but were obviously paid - that may be a candidate. If there's a vast preponderance of links from seemingly low quality directories with exact match anchor text - those would be candidates for closer scrutiny as well. Dead giveaways are usually 'sponsored' links that pass juice.
Low quality directories usually let everyone in. You will know them by their viagra and casino anchor text. They're usually a pretty safe disavow candidate.
Does the site have a lot of links from spam blog comments from sites that are obviously unrelated? Has there been some guest blogging on free for all blogs? Those links would require some review as well.
Definitely prioritize your exact match anchor text links for review.
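To triage anchors at scale, the prioritization above might be sketched as a simple classifier. The money-keyword and spam-term lists here are purely hypothetical placeholders; the real lists come from your own keyword targets and the junk you see in the export.

```python
# Hypothetical triage helper: surface links whose anchor text is an exact
# match for a money keyword, or contains classic spam terms, so they get
# manual review first. Keyword and spam lists are illustrative only.

MONEY_KEYWORDS = {"cheap widgets", "buy widgets online"}  # assumed targets
SPAM_TERMS = {"viagra", "casino", "payday"}

def review_priority(anchor):
    a = anchor.strip().lower()
    if any(term in a for term in SPAM_TERMS):
        return "likely disavow"   # classic spam anchor text
    if a in MONEY_KEYWORDS:
        return "review first"     # exact-match anchor: closer scrutiny
    return "low priority"         # branded / natural anchors

for anchor in ["Best Casino Bonus", "cheap widgets", "Acme Corp homepage"]:
    print(anchor, "->", review_priority(anchor))
```

Nothing here replaces the manual review; it just orders the queue.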
I would suggest you start with gathering link data from numerous sources:
- Google Webmaster Tools
- Bing Webmaster Tools
- Ahrefs
- Majestic SEO
- Etc.
Then filter the duplicates via spreadsheet voodoo. After that, drop it into a service like Link Detox. But be careful, it still throws false positives and false negatives. So again, there's no real way of getting out of a manual review. But Link Detox will speed up the process.
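The "spreadsheet voodoo" step can also be done in a few lines of Python. This is a minimal sketch under the assumption that each tool's export has been reduced to a plain list of URLs; real exports differ per tool, and the normalization here (lowercasing, stripping trailing slashes) is a deliberate simplification.

```python
# Minimal sketch: merge backlink URL lists from several tools and drop
# duplicates before review. Assumes each export is already a list of URLs.

def merge_backlink_lists(*lists):
    """Each list holds URLs from one tool (GWT, Bing, Ahrefs, Majestic...)."""
    seen = set()
    merged = []
    for urls in lists:
        for url in urls:
            key = url.strip().lower().rstrip("/")  # crude normalization
            if key and key not in seen:
                seen.add(key)
                merged.append(key)
    return merged

# Hypothetical exports:
gwt = ["http://a.example/page", "http://b.example/"]
ahrefs = ["http://A.example/page/", "http://c.example/x"]
print(merge_backlink_lists(gwt, ahrefs))
```

The deduped list is what you'd then drop into a service like Link Detox.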
Are there plenty of disavow services out there? Sure, but I've never used them. I'm far too paranoid. A disavow is a delicate and lengthy process.
Are there some great disavow pros/individuals out there? Definitely. I would be far more likely to trust them. In fact, a couple will likely chime in here. Though they may be a little bit outside the budget. I don't know.
One final, important point: A disavow is not a panacea. It takes as long as it takes. Though it is good that you appear to be proactive. You never know when the next Penguin filter will land. The site may be right with The Googles now, but it might not be later.
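For completeness, once the manual review is done, the vetted bad domains go into a plain-text disavow file in Google's documented format: one `domain:` line per domain, with `#` for comments. A small sketch (the domain names are taken from the joke above and otherwise hypothetical):

```python
# Sketch: build a disavow file in Google's documented plain-text format
# ("domain:" lines, "#" comments). Domain names here are hypothetical.

def build_disavow_file(domains, note="Reviewed manually before disavowing"):
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

bad = ["easylinkseodirectory4u.com", "casino-links.example"]
print(build_disavow_file(bad))
```

The resulting file is what gets uploaded to the disavow tool, and only after every domain on it has survived a human look.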