Removing Domains From Disavow File
-
We may have accidentally included the wrong domains in our disavow file and have since removed most of them, leaving only the very highly rated spammy links (using Moz's new Spam Score) in the file.
How long can it take for Google to recognise this change?
Thanks
Mike
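For reference, the disavow file that Google Search Console accepts is a plain text file with one entry per line: "domain:" entries to disavow every link from a domain, bare URLs to disavow individual pages, and lines starting with # as comments. A minimal example (the domains below are placeholders):

```
# Domains still considered spammy after manual review
domain:spammy-example-one.com
domain:spammy-example-two.net
# A single page can also be listed by URL
http://example.org/page-with-unwanted-link.html
```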
-
Great! Thank you for your help.
-
Hi Mike,
I recommend reading this guide to Spam Score from Moz:
https://moz.com/help/guides/link-explorer/spam-score
Start reading at this section: "Another site's Spam Score - Again, this doesn't mean that these sites are spammy. This percentage represents a wide variety of potential signals ranging from content concerns to low authority metrics. Since this is just based on correlation with penalization, rather than causation, the solution isn't necessarily to disregard sites or disavow links with higher Spam Scores. Instead, we'd recommend using it as a guide for kick starting investigations. Be sure to check out a site's content and its relevance in linking back to you before disregarding or disavowing."
I personally never use the Disavow Links Tool. I manually remove links, or simply build new ones, to reduce the percentage of "spammy links" or the percentage of links that share the same anchor text...
But if I had to name a Spam Score above which I would use the disavow links tool, it would probably be somewhere over 60-80%, depending on my own judgement of how spammy the site looks: over 60% if it looks very spammy, over 80% if it doesn't.
Hope that helps

-
Hi Pau Pl,
Thank you for the response. How often do you advise using the disavow file? For example, we use the new Moz tool that provides a spam rating from 1 to 100%, and we tend to disavow links from sites scoring higher than 80% that have active links (99% of these are from hotlinking image sites).
Thanks
Mike
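A minimal sketch of that workflow: filtering a CSV export of linking domains by Spam Score and writing the matching domains out as disavow entries. The file name and the column headers ("Root Domain" and "Spam Score") are assumptions; check them against your actual Link Explorer export before running it.

```
import csv

SPAM_THRESHOLD = 80  # only disavow domains scoring above this

# Assumed export format: one row per linking root domain,
# with "Root Domain" and "Spam Score" columns.
with open("linking_domains.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

to_disavow = sorted(
    row["Root Domain"].strip()
    for row in rows
    # Some exports include a "%" sign in the score, so strip it first.
    if float(row["Spam Score"].rstrip("%")) > SPAM_THRESHOLD
)

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write(f"# Domains with Spam Score above {SPAM_THRESHOLD}\n")
    for domain in to_disavow:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(to_disavow)} domains to disavow.txt")
```

Even with an automated filter like this, review the output by hand before uploading it; as the Moz guide quoted above points out, a high Spam Score alone is not proof that a link is harmful.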
-
Hi mlb7,
Matt Cutts explained this around 2015:
When you are disavowing links, you can know that a link in your disavow file is considered disavowed once you see that Google has cached the page where the link resides. But when it comes to reavowing, we have no way of knowing when Google is going to start counting that link again or whether it will be given the same weight.
Reavowing a link can “take a lot longer than disavowing it,” though no one knows how long that is. Google wants to be really certain that spammers are not going to try to figure out which links are helping or hurting them by doing disavow and reavow experiments.
I recommend taking a look at this video from Matt Cutts: https://www.youtube.com/watch?time_continue=1&v=393nmCYFRtA
Sources:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
https://ahrefs.com/blog/google-disavow-links/
Hope that helps, good luck!
Related Questions
-
Robots.txt file issues on Shopify server
We have repeated issues with one of our ecommerce sites not being crawled. We receive the following message: "Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster. Read our troubleshooting guide." Are you aware of an issue with robots.txt on the Shopify servers? It is happening at least twice a month, so it is quite an issue.
Moz Pro | | A_Q0 -
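One way to narrow down intermittent robots.txt failures like the one described above is to poll the file yourself and log the responses. A minimal sketch, assuming Python with the requests library; the shop URL is a placeholder.

```
import time
import requests

ROBOTS_URL = "https://your-shop.example.com/robots.txt"  # placeholder

# Check the file a few times, a minute apart, and report each result.
for attempt in range(5):
    try:
        resp = requests.get(ROBOTS_URL, timeout=10)
        print(f"attempt {attempt + 1}: HTTP {resp.status_code}, "
              f"{len(resp.text)} bytes")
    except requests.RequestException as exc:
        print(f"attempt {attempt + 1}: request failed: {exc}")
    time.sleep(60)
```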
How to remove 404 pages wordpress
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere? Do you know how to remove them? I think they are not files on the server like HTML files, since WordPress uses databases. I figure that getting rid of the 404 errors will improve SEO; is this correct? Thanks, David
Moz Pro | | DJDavid0 -
Relation between domain age and domain authority?
What is the relation between domain age and domain authority? Does an older registered domain help produce higher domain authority or not? I am still confused: http://www.green-lotus-trekking.com/ is quite an old domain, but its authority is only 33.
Moz Pro | | agsln1 -
Automatically Check List of Sites For Links To Specific Domain
Hi all, Can anyone recommend a tool that will allow me to put in a list of about 200 domains that are then checked for a link back to a specific domain? I know I can do various link searches and use the Google site: command on a site-by-site basis, but it would be much quicker if there was a tool that could take the list of domains I am expecting a link on and then find whether that link exists and, if so, on what page etc. Hope this makes sense, otherwise I have to spend a day doing it by hand - not fun! Thanks, charles.
Moz Pro | | MrFrisbee0 -
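No single tool is named in that question, but a rough scripted alternative is sketched below: fetch each domain's homepage and scan the HTML for a reference to the target site. This only inspects the homepage, so it will miss links that live on inner pages; the target domain and file names are placeholders.

```
import csv
import requests

TARGET = "yoursite.com"  # the domain you expect the links to point to

# One domain per line in domains.txt (placeholder file name).
with open("domains.txt", encoding="utf-8") as f:
    domains = [line.strip() for line in f if line.strip()]

results = []
for domain in domains:
    try:
        html = requests.get(f"http://{domain}", timeout=10).text
        found = TARGET.lower() in html.lower()
    except requests.RequestException:
        found = False
    results.append((domain, found))
    print(f"{domain}: {'link found' if found else 'no link found'}")

with open("link_check.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["domain", "homepage_links_to_target"])
    writer.writerows(results)
```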
Remove geographic modifiers from keyword list
I just pulled a search term report for all of 2013 from my PPC account. What I got was 673,000 rows of terms that have garnered at least 1 impression in 2013. This is exactly what I was looking for. My issue is that the vast majority of terms are geo-modified to include the city, the city and state, or the zip code. I am trying to remove the geographic information to get to a list of root words people are interested in based on their search query patterns. Does anyone know how to remove all city, state and zip codes quickly without having to do a find and replace for each geo-modifier in Excel? For example, if I could get a list of all city and state combinations in the US and a list of all zip codes, put that list on a separate tab, and then have a macro find and remove from the original tab any instances of anything from the second tab, that would probably do the trick. Then I could remove duplicates and have my list of root words.
Moz Pro | | dsinger0 -
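The approach described in that question (a reference list of geo terms, stripped from the search terms, then de-duplicated) can also be scripted outside Excel. A minimal sketch in Python; the file names and the assumption that the geo list has one term per line are illustrative only.

```
import re

# Load the geo modifiers (cities, states), one per line (placeholder file).
with open("geo_terms.txt", encoding="utf-8") as f:
    geo_terms = [line.strip().lower() for line in f if line.strip()]

# One combined regex: any geo term as a whole word, plus any 5-digit zip code.
geo_pattern = re.compile(
    r"\b(?:" + "|".join(re.escape(t) for t in geo_terms) + r"|\d{5})\b"
)

roots = set()
with open("search_terms.txt", encoding="utf-8") as f:
    for line in f:
        cleaned = geo_pattern.sub("", line.lower())
        cleaned = re.sub(r"\s+", " ", cleaned).strip()
        if cleaned:
            roots.add(cleaned)

with open("root_terms.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(roots)))

print(f"{len(roots)} unique root terms written to root_terms.txt")
```

Building one combined regex keeps the pass over the 673,000 rows to a single scan per line, rather than one find-and-replace per geo modifier.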
My site's domain authority is 1. Why is that?
Hi Guys, My website's domain authority is 1 no matter whether I try www or non-www. Why is that? Can you guys please help? Thanks a lot in advance. http://www.opensiteexplorer.org/links?site=autoproject.com.au http://www.opensiteexplorer.org/links?site=www.autoproject.com.au Jazz
Moz Pro | | JazzJack0 -
Batch lookup domain authority on a list of URLs?
I found this site that describes how to use Excel to batch lookup URLs using the SEOmoz API. The only problem is the SEOmoz API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?
Moz Pro | | SirSud1 -
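The timeouts described in that question are typically just the API's rate limit. Rather than fighting it inside Excel, a script can pause between calls and write the results to a CSV. The sketch below leaves the actual API call as a stub, since the endpoint, authentication and response format should be taken from Moz's own API documentation; the ten-second pause reflects the free tier's historical limit of roughly one request every ten seconds.

```
import csv
import time

def fetch_domain_authority(url):
    """Stub: call the Moz API for `url` and return its Domain Authority.

    Replace with a real request built from Moz's API documentation
    (endpoint, access ID, secret key, and signed authentication).
    """
    raise NotImplementedError

# One URL per line in urls.txt (placeholder file name).
with open("urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("domain_authority.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "domain_authority"])
    for url in urls:
        try:
            da = fetch_domain_authority(url)
        except Exception as exc:
            da = f"error: {exc}"
        writer.writerow([url, da])
        time.sleep(10)  # stay under the rate limit instead of timing out
```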
What Exactly Does "Linking Root Domains" mean??
What Exactly Does "Linking Root Domains" mean?? And how does it affect your ranking for certain Keywords?? Thanks
Moz Pro | | Caseman57