On the use of the Disavow tool / Have I done it correctly, or is my understanding wrong?
-
On a site I used GSA Search Engine Ranker. It got me some good links, but it also got me 4,900 links from a single domain. From what I understood from Ahrefs, many links from one domain are worth about as much as a single link from that domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, keeping one, to keep my site stable in the rankings and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged so far.
-
Highland's analysis is correct. The disavow tool should only be used carefully and with a measured approach, and usually only when dealing with penalties.
-
Wait, you just saw a bunch of links and disavowed them because... you saw a bunch of links? Did you have any penalties or rank drops? Anything that would lead you to believe these links are actively harming your rankings?
Disavow is a cleanup tool, not a preventative tool. There's a really good reason why they tell you NOT to use this tool lightly. If I could put this on a giant neon sign I would, but here's what Google says on their help page (emphasis mine):
This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
If I were you, I'd yank that disavow file out right now.
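For reference (and not as a recommendation to keep the file): Google's disavow file is a plain-text file, one URL or domain per line, with `#` marking comments. If all 4,900 links come from one domain, a single `domain:` line covers every link from it, so listing 4,899 individual URLs is unnecessary. A sketch, with `spammy-example.com` standing in as a placeholder for the actual domain:

```text
# Disavow file sketch (domain name is a placeholder)
# One "domain:" line disavows every link from that domain:
domain:spammy-example.com

# Individual URLs can also be listed, one per line:
https://spammy-example.com/some-page.html
```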
Related Questions
-
Drastic surge of link spam in Webmaster Tools' Link Profile
Hello all, I am trying to get some insights/advice on a recent and drastic increase in link spam within my Webmaster Tools link profile. Before I get into more detail, I would like to point out that I did find some relevant Moz community posts addressing this type of issue. However, my link spam situation may have to be approached from a different angle, as it concerns two sites at the same time and in somewhat the same way. Basically, starting in July 2017, from one day to the next, a multitude of domains (50+) began generating link spam (at least 200 links a month and counting), and to cut a long story short, I believe those sites have been hacked. Most of the domain names sound legit and load a homepage, but all the sub-pages linking to my site contain "adult" gibberish. In addition, each sub-page follows the same pattern: scraping content from my homepage, including the on-page links that generate the spammy backlinks to my sites, while inserting the adult gibberish in between (it's all just text and looks as if a bot is at work). So it's not as if my link is being inserted into specific pages, or used to spam me with the same anchor text over and over, and I am not sure what kind of link spam this really is (or what its purpose is). Some more background information: as mentioned above, this link spam (attack?) is affecting two of my sites, and it started pretty much simultaneously (the sites also focus on a competitive niche). The interesting detail is that one site suffered a manual penalty years ago, which has since been lifted (a disavow file exists, and no further link building campaigns were undertaken after the cleanup), while the other site has never seen any link building efforts. It is clean, yet the same type of spam is flooding that website's link profile too. In the webmaster forums the overall opinion is that Google ignores web spam. All well.
However, I am still concerned that the dozens of spammy links pointing to the website "with a history" may pose a risk (more spam arrives daily on both sites, though). At the same time, I wonder why the other "clean" site is facing the same issue. The clean site's rankings do not appear to be impacted, while the other website has seen some drops, but I am still observing the situation. So, should I be concerned for both sites, or even start an endless disavow campaign on the site with a history? PS: This Moz article appears to advise so: https://moz.com/blog/do-we-still-need-to-disavow-penguin "In most cases, sites that have a history of collecting unnatural links tend to continue to collect them. If this is the case for you, then it's best to disavow those on a regular basis (either monthly or quarterly) so that you can avoid getting another manual action." What is your opinion? Sorry for the long post, and many thanks in advance for any help/insight.
White Hat / Black Hat SEO | | Hermski0 -
Do I lose link juice if I have a https site and someone links to me using http instead?
We have recently launched an https site which is getting some organic links, some of which use https and some of which use http. Am I losing link juice on the ones linking with http even though I am redirecting, or does Google view them the same way? As most people still use http naturally, will it look strange to Google if I contact anyone who has given us a link and ask them to change it to https?
White Hat / Black Hat SEO | | Lisa-Devins0 -
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized over bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. Which portion, and for which bots, can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about the experts' opinions...
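As a sketch of the idea described above (not the poster's actual code), here is a minimal load-aware gate in Python. The bot signatures, the load threshold, and the `Retry-After` value are all assumptions for illustration; a real implementation would sit in the server or CMS front controller, feed in the actual load (e.g. from `os.getloadavg()` divided by core count), and might verify crawlers via reverse DNS rather than trusting the User-Agent string.

```python
# Sketch: serve 503s to bots only when the server is under load.
# BOT_SIGNATURES, the 0.8 threshold, and the Retry-After value are
# illustrative assumptions, not recommendations.

BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot", "semrushbot")

def is_bot(user_agent: str) -> bool:
    """Crude User-Agent substring check; UAs can be spoofed, so a
    production version should verify bots via reverse DNS lookup."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def response_for(user_agent: str, load_per_core: float,
                 threshold: float = 0.8):
    """Return (status_code, extra_headers) for an incoming request.

    Human visitors always get a 200. Bots get a 503 with Retry-After
    once load per CPU core crosses the threshold, which asks
    well-behaved crawlers to back off without implying the content
    is gone.
    """
    if is_bot(user_agent) and load_per_core > threshold:
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

On the SEO question itself: a 503 with `Retry-After` is the status Google documents for temporary unavailability, so occasional throttling mainly delays crawling. Google has also indicated that pages served 503s for extended periods can eventually drop out of the index, so a gate like this should shave load peaks, not run permanently.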
White Hat / Black Hat SEO | | internetwerkNU1 -
Will aggregating external content hurt my domain's SERP performance?
Hi, We operate a website that helps parents find babysitters. As a small add-on we currently run a small blog on childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially noindex them or point their rel="canonical" at the original author. Any thoughts, anyone? Thx! Daan
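The two options mentioned can be sketched as markup in the re-blogged article's `<head>` (the URL is a placeholder). Note that a cross-domain canonical is a hint Google may ignore, and Google recommends choosing one signal rather than combining noindex with a canonical, since the two send conflicting instructions:

```html
<!-- Option A: cross-domain canonical pointing at the original author -->
<link rel="canonical" href="https://original-blog.example.com/the-article/" />

<!-- Option B: keep the re-blogged page out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```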
White Hat / Black Hat SEO | | daan.loening0 -
How do I know what links are bad enough for the Google disavow tool?
I am currently working for a client whose backlink profile is questionable. The issue I am having is: does Google feel the same way about it as I do? We have no current warnings, but have had one in the past for "unnatural inbound links". We removed the links that we felt were being referred to and have not received any further warnings, nor have we noticed any significant drop in traffic or rankings at any point. My concern is that if I work towards getting the more ominous-looking links removed (directories, reciprocal links from irrelevant sites, etc.), either manually or with the disavow tool, how can I be sure that I am not removing links that are in fact helping our campaign? Are we likely to suffer from the next Penguin update if we choose to proceed without removing the aforementioned links? Or is Google only likely to target the serious black-hat links (link farms, etc.)? Any thoughts or experiences would be greatly appreciated.
White Hat / Black Hat SEO | | BallyhooLtd0 -
I think I've been hit by Penguin - Strategy Discussion
Hi, I have a network of 50 to 60 domain names which have duplicated content and whose domains are basically a geographical location + the industry I am in. All of these websites have links to my main site. Over the weekend I saw my traffic fall, and I attribute our drop in rankings to what people are calling Penguin 1.1. I want to keep my other domains, as we are slowly creating unique content for each of those sites. However, in the meantime, clearly I need to deal with the inbound linking and anchor text problem. Would adding a nofollow attribute to all links that point to my main site resolve my issue with Google's Penguin update? Thanks for the help.
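For reference, the nofollow change described would be a per-link markup edit on each network site (the URL and anchor text here are placeholders). At the time of this question, nofollow stopped PageRank and anchor-text signals from flowing through the link, though it would not undo signals Google has already processed:

```html
<!-- Before: a followed link passing anchor text to the main site -->
<a href="https://main-site.example.com/">geo keyword anchor</a>

<!-- After: nofollow asks Google not to count this link -->
<a href="https://main-site.example.com/" rel="nofollow">geo keyword anchor</a>
```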
White Hat / Black Hat SEO | | MangoMan160 -
Pages For Products That Don't Exist Yet?
Hi, I have a client that makes accessories for other companies' popular consumer products. Their own products on their website rank for other companies' product names, like (made-up example) "2011 Super Widget", combined with my client's product, "Charger". So "Super Widget 2011 Charger" might be the type of term my client would rank for. Everybody knows the 2012 Super Widget will be out in some months, and then my client's company will offer the 2012 Super Widget Charger. What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages would have greater authority/age and rank, instead of being a little late to the party. The pages would be "coming soon" pages, but still optimized for the main product search term. About the only negative I see is that they'll have a higher bounce rate/lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking. What do you think? Thanks!
White Hat / Black Hat SEO | | 945010 -
Beaten in SERPs by a site going 'all in' on 2 keywords in their anchor text profile
I would like to get people's thoughts on putting 80% of your anchor text links into just 2 keywords vs. a nice spread of branded and long-tail keywords, like I have. We recently fell off the first page for a key SERP, and the site in P10 has gone nuts on just those two keywords. I know we have a good site: on-page, conversion, low bounce rate, page views, etc. I'm pretty sure we get more traffic than they do. It seems that this obviously bloated anchor text profile has worked for them, though. What do you guys think/know?
White Hat / Black Hat SEO | | robertrRSwalters0