Removing Domains From Disavow File
-
We may have accidentally included the wrong domains in our disavow file and have since removed most domains, leaving only the most highly rated spammy links (using Moz's new Spam Score) in the file.
How long can it take for Google to recognise this change? Thanks, Mike
-
Great! Thank you for your help.
-
Hi Mike,
I recommend reading this Spam Score guide from Moz:
https://moz.com/help/guides/link-explorer/spam-score
Start reading on this part: "Another site's Spam Score - Again, this doesn't mean that these sites are spammy. This percentage represents a wide variety of potential signals ranging from content concerns to low authority metrics. Since this is just based on correlation with penalization, rather than causation, the solution isn't necessarily to disregard sites or disavow links with higher Spam Scores. Instead, we'd recommend using it as a guide for kick starting investigations. Be sure to check out a site's content and its relevance in linking back to you before disregarding or disavowing."
I personally never use the Disavow Links Tool. I manually delete links, or simply create new ones to reduce the percentage of "spammy links" or the percentage of links that share the same anchor text...
But if I had to name a Spam Score threshold at which I would use the Disavow Links Tool, it would probably be somewhere between 60% and 80%, depending on how spammy the site looks to me personally: closer to 60% if it looks very spammy, closer to 80% if it doesn't.
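As a rough illustration of that threshold approach, here is a minimal sketch that turns a link export into disavow entries. The column names and example rows are assumptions for illustration, not the actual format of a Moz export; adjust them to whatever your export actually contains.

```python
from urllib.parse import urlparse

def build_disavow_entries(rows, threshold=80):
    """Return sorted 'domain:' disavow lines for links whose Spam Score
    meets or exceeds the threshold."""
    domains = set()
    for row in rows:
        if int(row["Spam Score"]) >= threshold:
            # Disavow the whole referring domain, not just the one URL.
            domains.add(urlparse(row["URL"]).netloc)
    return sorted(f"domain:{d}" for d in domains)

# Hypothetical export rows, just to show the shape of the input.
rows = [
    {"URL": "https://spammy-images.example/page", "Spam Score": "92"},
    {"URL": "https://legit-blog.example/post", "Spam Score": "12"},
]
print(build_disavow_entries(rows))
```

As the Moz guide quoted above stresses, a score alone shouldn't decide anything: treat the output of a filter like this as a review queue, and check each site manually before it goes into the file.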
Hope that helps
-
Hi Pau Pl,
Thank you for the response. How often do you advise using the disavow file? For example, we use the new Moz tool that provides a spam rating from 1 to 100%, and we tend to disavow links from sites rated higher than 80% that have active links (99% of these are from hotlinking image sites). Thanks, Mike
-
Hi mlb7,
Matt Cutts explained this around 2015:
When you are disavowing links, you can know that a link in your disavow file is considered disavowed once you see that Google has cached the page where the link resides. But when it comes to reavowing, we have no way of knowing when Google is going to start counting that link again or whether it will be given the same weight.
Reavowing a link can “take a lot longer than disavowing it,” though no one knows how long that is. Google wants to be really certain that spammers are not going to try to figure out which links are helping or hurting them by doing disavow and reavow experiments.
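For context on the mechanics being discussed: the disavow file itself is just a UTF-8 plain-text file uploaded through Google Search Console, with one entry per line. A `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. The domains below are placeholders:

```
# Hotlinking image sites, Spam Score above our threshold
domain:spammy-images.example
# Disavow a single page rather than the whole domain
https://another-site.example/spam-page.html
```

Removing an entry from this file and re-uploading it is what "reavowing" refers to above, and as the quote notes, Google gives no signal for when a removed entry starts counting again.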
I recommend taking a look at this video from Matt Cutts: https://www.youtube.com/watch?time_continue=1&v=393nmCYFRtA
Sources:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
https://ahrefs.com/blog/google-disavow-links/
Hope that helps, good luck!