Disavow File
-
After uploading a Google disavow file how long does it take to be processed?
Before any trolls get going: we haven't been doing anything dodgy. It looks like someone has been trying some negative SEO on us.
-
Mazen is right. It typically takes a few weeks. Make sure you annotate the upload date in your analytics account and monitor from there. Good luck and be patient.
-
Based on Google's Help Center: https://support.google.com/webmasters/answer/2648487?hl=en - it can take a few weeks for them to process the information in the file.
Based on the few times we've used the tool, I can say that the effect rolls out across 2-4 weeks. But that's based on a small number of cases and it might be very different for other sites.
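For reference, the disavow file itself is just a plain-text (.txt) list, one entry per line. A minimal sketch (the domains below are placeholders, not links from this thread):

```text
# Lines starting with "#" are comments and are ignored.
# Disavow a single URL:
http://spam.example.com/paid-links/page1.html
# Disavow every link from an entire domain:
domain:shadyseo.example.com
```

Google's documentation recommends the domain: form when a whole site is linking unnaturally, since catching every individual URL is rarely practical.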
Hope that helps!
Related Questions
-
Should you disavow backlinks even if your site spam score is 1%?
With a site spam score of 1% as indicated by Moz, is it worth the effort or necessary to disavow backlinks in Google? Even at just 1%, could those spammy links still hurt a site's Google search rankings, even in the slightest of ways? As it relates to disavowing backlinks, everything I read about is only related to sites with high spam scores. But what about sites with low spam scores? I'm interested in best practices for dealing with spammy links, regardless of one's site spam score. Thank you
Intermediate & Advanced SEO | | AndrewOrr100 -
Can someone help with how to fix a spam problem? Please see the attached file
Hi, I have a spam link issue on my website. I'm attaching a report; please help me fix this so my domain can do better. Thanks. aEP1vVy
Intermediate & Advanced SEO | | grbassi0 -
72KB CSS code directly in the page header (not in external CSS file). Done for faster "above the fold" loading. Any problem with this?
To optimize for Google's page speed, our developer has moved the 72KB of CSS code directly into the page header (not in an external CSS file). This reduced the above-the-fold loading time. But could this affect indexing of the page, or have any other negative side effects on rankings? I made a quick test, and the Google cache seems to have our full pages cached, but could it negatively affect our rankings, or cause Google to index fewer of our pages? (We already have a problem with Google ignoring about 30% of the pages in our sitemap.)
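A common middle ground, sketched below under the assumption that the critical above-the-fold rules can be separated out (file names here are placeholders, not the poster's actual setup), is to inline only the critical CSS and load the full stylesheet without blocking render:

```html
<head>
  <style>
    /* inline only the critical above-the-fold rules, not all 72KB */
    body { margin: 0; font-family: sans-serif; }
  </style>
  <!-- load the remaining CSS without blocking first paint -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

Inlined CSS is not a problem for indexing in itself, but it adds weight to every HTML response and cannot be cached separately the way an external file can.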
Intermediate & Advanced SEO | | lcourse0 -
How long should I keep the 301 redirect file?
We've set up a new site and many pages no longer exist (clean-up done), but for many of them we have new pages with new URLs. We've monitored the 404s and have now redirected many URLs with 301s (Apache file). How long should we keep this in place? Checking each link manually to see if the new URL has taken the old URL's place in Google is too much work. Thanks!
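For anyone setting this up, the Apache file described above typically looks something like the sketch below (the paths are placeholders, not the poster's actual URLs):

```apache
# Permanent redirects from removed pages to their replacements
Redirect 301 /old-page.html /new-page/
# Pattern-based mapping for a whole renamed section
RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
```

As a rule of thumb, keep the redirects in place as long as the old URLs still receive traffic or have external links pointing at them; a year is a common minimum.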
Intermediate & Advanced SEO | | KBC0 -
Should I replace underscores in page file names with hyphens?
Many of the thousands of pages at our Web store that were established several years ago, but are still relevant today, have underscores separating words in their file names. For example: http://www.audiobooksonline.com/whats_new_compact_disc_audiobooks_audio_books.html Should I replace the underscores with hyphens, like this: http://www.audiobooksonline.com/whats-new-compact-disc-audiobooks-audio-books.html or should I duplicate the pages using hyphens and have the older underscore pages 301-redirected to the new hyphenated pages?
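One way to avoid duplicating pages is a single mod_rewrite rule that 301-redirects underscore URLs to their hyphenated equivalents. A sketch, assuming mod_rewrite is enabled on the server:

```apache
RewriteEngine On
# Replace one underscore per pass; Apache re-runs the ruleset
# until no underscores remain in the requested path.
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]
```

Note that a URL with several underscores goes through one redirect hop per underscore under this rule; with thousands of pages, an explicit one-to-one redirect map avoids those chained hops but is more work to generate.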
Intermediate & Advanced SEO | | lbohen0 -
Google Disavow Tool - Waste of Time
My humble opinion is that Google's disavow tool is an utter waste of your time! My site, http://goo.gl/pdsHs, was penalized over a year ago after the SEO we hired used black-hat techniques to increase rankings. Ironically, while we had visibility, Google itself had become a customer. (I guess the site was high quality, trustworthy, and user-friendly enough for Google employees to purchase from.) Soon enough the message about detecting unnatural links showed up in Webmaster Tools and, as expected, our rankings sank out of view. For a year we contacted webmasters, asking them to remove links pointing back to us (90% didn't respond; the other 10% complied). Work on our site continued, adding high-quality, highly relevant, unique content.
Intermediate & Advanced SEO | | Prime85
Rankings never recovered, and neither did our traffic or business… Earlier this month, we learned about Google's "link disavow tool" and were excited! We had hoped that by following the cleanup instructions and using the tool, we would get a chance at recovery!
We watched Matt Cutts' video, read the various forums/blogs/topics written about it, and then felt comfortable enough to use it. We went through our backlink profile, determined which links were spammy, seemed the result of black-hat practices, or had been added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed with another reconsideration request. The result came a couple of weeks later: the same cookie-cutter email in WMT suggesting that there are "unnatural links" to the site. Hope turned to disappointment and frustration. It looks like the big-box companies will continue to populate the top 100 results of ANY search, and the rest will help Google's shareholders… If your site has gotten into the algorithm's crosshairs, you have a better chance of recovering by changing your URL than by messing around with this useless tool.
Robots.txt file - How to block thousands of pages when you don't have a folder path
Hello.
Just wondering if anyone has come across this and can tell me if it worked or not.

Goal: To block review pages.

Challenge: The URLs aren't constructed using folders; they look like this:
www.website.com/default.aspx?z=review&PG1234
www.website.com/default.aspx?z=review&PG1235
www.website.com/default.aspx?z=review&PG1236

So the first part of the URL is the same (i.e. /default.aspx?z=review) and the unique part comes immediately after - so not as a folder. Looking at Google's recommendations, they show examples for blocking folder directories and individual pages only.

Question: If I add the following to the robots.txt file, will it block all review pages?

User-agent: *
Disallow: /default.aspx?z=review

Much thanks,
Davinia
Intermediate & Advanced SEO | | Unity
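A prefix Disallow like the one above will work, since robots.txt rules match from the start of the URL path. Google also supports * and $ wildcards, which helps if the review parameter isn't always first in the query string. A sketch (www.website.com stands in for the real site):

```text
User-agent: *
# Blocks every URL whose path starts with this prefix:
Disallow: /default.aspx?z=review
# Wildcard form, in case other parameters can precede z=review:
Disallow: /*z=review
```

It's worth verifying the pattern with the robots.txt testing tool in Google's webmaster tools before relying on it.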
.htaccess files
I am working with a client's website which has multiple .htaccess files (.htaccess, .htaccess.holiding, and .htaccess.live, all in the same directory). My question is: how does a server process these files - all three? Currently, in one of the files (.htaccess, but not the others), the domain has a 301 redirect from the home page to the mobile site (which is a problem). Has anyone come across this before with regard to SEO problems?
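By default, Apache only processes the file named by its AccessFileName directive, which is .htaccess unless the server config says otherwise; sibling files like .htaccess.live are ignored unless something explicitly points at them. A sketch of the relevant server-config line:

```apache
# Default in httpd.conf: only this exact filename is read per directory
AccessFileName .htaccess
```

So the 301 you're seeing almost certainly comes from .htaccess alone, and the other two files are most likely deployment or backup copies with no effect on the live site.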
Intermediate & Advanced SEO | | OnlineAssetPartners0