Google Disavow Tool - Waste of Time
-
My humble opinion is that Google's disavow tool… is an utter waste of your time!
My site, http://goo.gl/pdsHs was penalized over a year ago after the SEO we hired used black hat techniques to increase ranking.
Ironically, while we still had visibility, Google itself had become a customer. (I guess the site was high quality, trustworthy, and user-friendly enough for Google employees to purchase from.)
Soon enough, the message about detecting unnatural links showed up in Webmaster Tools, and as expected, our rankings sank out of view.
For a year we contacted webmasters, asking them to remove links pointing back to us.
90% didn't respond; the other 10% complied.
Work on our site continued, adding high quality, highly relevant unique content.
Rankings never recovered, and neither did our traffic or business. Earlier this month, we learned about Google's "link disavow tool" and were excited!
We had hoped that following the cleanup instruction, using the “link disavow tool”, we would get a chance at recovery!
We watched Matt Cutts' video, read the various forums/blogs/posts written about it, and then felt comfortable enough to use it. We went through our backlink profile, identified the links that were spammy, appeared to result from black-hat practices, or had been added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed up with another reconsideration request.
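For anyone repeating this process: the disavow file is a plain UTF-8 text file with one entry per line, where a `#` starts a comment and a `domain:` prefix disavows every link from a site rather than a single URL. The domains below are placeholders:

```text
# Contacted the owner of spamdomain1.com on 2012-11-01; no reply
domain:spamdomain1.com

# Individual pages we could not get removed
http://www.spamdomain2.com/page-with-link.html
```

Using `domain:` is usually safer than listing individual URLs, since sitewide templates (blogrolls, footers) generate links from thousands of pages on the same host.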
The result came a couple of weeks later: the same cookie-cutter email in WMT suggesting that there are "unnatural links" to the site.
Hope turned to disappointment and frustration. It looks like the big-box companies will continue to populate the top 100 results of ANY search; the rest will help Google's shareholders…
If your site has gotten caught in the algorithm's crosshairs, you have a better chance of recovering by changing your URL than by messing around with this useless tool.
-
I have to apologize to Google for my earlier response. After submitting a disavow request, I did see my largest site recover. I should say that I also submitted another, low-quality website and got a quick response that it was still penalized. I only disavowed links and did not have any removed by third-party sites. So it looks like you can get a penalty removed with disavow and hard work, but don't think disavow alone will be good enough for a manual action.
-
Where is this tool located in WMT? I can't find it anywhere
-
If all you said is true, that is *$#@# UP! I have also had all I can take from Google's WMT editors. I'm all for better search results, but this has gone too far. You make one mistake and Google throws you out and gives you a pile of useless videos to watch. I really wish they would get their act together on this. Everyone cheats; Google, get over it and give us a way back in the game!
-
One suggestion, if you have not already done so, is to have a closer look at directories. I have found that most webmasters I have worked with fight hard to keep directory links. If you look at it from the perspective that almost all self-made links are unnatural, then it makes sense that directory links could be in this pile.
-
We sure disavowed a lot of links… and got the same response to our reconsideration request. Truly frustrating, as Google continues to rank sites that lack relevance, quality, and good user experience over my site, which provides all of that.
-
It sounds like you have been through a very frustrating time! Is it possible that you did not disavow enough links? I have found that many webmasters do not objectively look at their links and try to keep a lot that are actually unnatural.
-
We did submit a reconsideration request (many over the past year-plus) after using the disavow tool, stating that we used the new tool along with contacting all the webmasters (we gave a list in previous reconsideration requests).
My market is competitive, but I was on page 1, and I have a truly better and easier site than the multi-billion-dollar companies (Walmart, Target, Amazon) that don't focus on my products.
-
Even though you disavowed the links, you still need to send a reconsideration request telling Google what you've been through and how you've contacted all these sites and only 10% responded.
Also, it seems to me that you've invested a ton of time fixing the negative instead of focusing on future wins. Invest your time in creating good content on your site that people will link to.
Also, how competitive is your market?
-
This tool will have seen huge amounts of usage since it was first announced (there are many sites in your position). I think it may be a little too early to give up on it, as it has only been a few weeks since you submitted your file.
I suspect Google will implement the results of the tool for high-authority sites first and then roll it out to lower-authority sites. As your site is of lower authority, it may be a case of playing the waiting game to see if the results come through. Don't give up just yet! Focus on gaining high-quality links through valuable content, and Google will eventually pay attention.
Related Questions
-
Problem with Google finding our website
We have an issue with Google finding our website: (URL removed). When we google "(keyword removed)" in google.com.au, our website doesn't come up anywhere, despite having a suitable title tag and on-site copy for SEO.
We found this strange and investigated further. We googled the website URL itself in google.com.au to see whether it was being properly found. Our site appeared at the top, but with this description: "A description for this result is not available because of this site's robots.txt – learn more." We can also see that the incorrect title tag is appearing.
From this, we assumed there must be an issue with the robots.txt file, so we put up a new robots.txt file: (URL removed). This hasn't solved the problem, though, and we still have the same issue.
We are thinking there may be another robots.txt file we can't find that is causing issues, or something else we're not sure of. We want to get to the bottom of it so the site can be properly found. Any help would be most appreciated!
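For context, the "description is not available because of this site's robots.txt" message usually means the file (which must be named exactly robots.txt and sit at the site root) contains a blanket Disallow. Two hypothetical files for contrast:

```text
# Blocks all crawlers from the whole site -> no snippet in the SERP
User-agent: *
Disallow: /

# Allows crawling of everything (an empty Disallow permits all paths)
User-agent: *
Disallow:
```

Note the difference is a single `/`: `Disallow: /` blocks everything, while `Disallow:` with no value blocks nothing.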
-
Google News and Meta Title
Hi,
I just read this article: https://www.seroundtable.com/google-news-titles-h1-19876.html
Google wants the article title and the H1 to be the same. No problem. But what about the brand? For example:
POST TITLE BLA BLU | My Brand
"POST TITLE BLA BLU" is the H1 and the title of the article, and "| My Brand" is my brand. Can I keep "| My Brand," or should I remove it? And what about posts with long titles, for example "POST TITLE BLA BLU POST TITLE BLA BLU | My Brand"?
What do you suggest? I know Google doesn't show all the text and we'll see "…", but is it still important to include the brand name in the title, or just the post title (without the brand)? Thanks
-
Meta-description not used at all times
Hi all,
We are marketing an e-commerce site and seem to have a weird issue: for some reason, the clearly specified meta description is not being used in the SERPs. I had a look at the source, but all the tags seem to be there. The site can be found here: www.bangerhead.se
A sample search in Google that uses the wrong info in the SERP:
https://www.google.com/webhp?sourceid=chrome-instant&rlz=1C5CHFA_enSE548SE548&ion=1&espv=2&ie=UTF-8#safe=off&q=bangerhead
Any ideas why this is? Grateful for any input. Have a nice day, Fredrik
-
Multiple Google Webmaster Tools Configurations
Hello everyone, I just inherited a website and 2 different users created GWT accounts on the same site and have configured different settings. Do you know how Google behaves when this happens? Thanks
-
Google Webmaster Remove URL Tool
Hi All,
To keep this example simple: you have a home page, and the home page links to 4 pages (P1, P2, P3, P4).
Home page
→ P1, P2, P3, P4
You now use the Google Webmaster removal tool to remove the P4 web page and its cached instance. 24 hours later, you check and see P4 has completely disappeared. You then remove the link from the home page pointing to P4.
My question: does Google now see only pages P1, P2 & P3, and therefore allocate link juice at a rate of 33.33% each?
Regards, Mark
-
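The equal-split assumption in the question above is the classic simplified PageRank view, and can be sketched as a toy function (a sketch of the textbook model only, not a claim about how Google actually allocates link equity today):

```python
def equity_share(outbound_links: int) -> float:
    """Toy model: a page's link equity divides equally among its outbound links."""
    if outbound_links <= 0:
        return 0.0
    return 1.0 / outbound_links

# With links to P1..P4, each page gets 25%; after dropping P4,
# each of P1..P3 gets roughly 33.33%, as the question assumes.
print(f"{equity_share(4):.2%}")  # 25.00%
print(f"{equity_share(3):.2%}")  # 33.33%
```

In this model, removing a link does redistribute the home page's equity among the remaining three, though in practice Google's handling of removed pages and links is more opaque than this.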
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance:
http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html
and
http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com)
My Feedburner feed is set to "noindex," and it's always been that way. The canonical tag on the web page is set to:
<link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' />
The robots tag is set to:
<meta name="robots" content="index,follow,noodp" />
I found out that there are scraper sites linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
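As an aside on why rel="canonical" is the intended fix here: the two indexed URLs differ only by utm_* tracking parameters, which is exactly the kind of duplication canonicalization collapses. A sketch in Python (illustrative only; this is not WordPress code, and not how Google implements it):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_tracking(url: str) -> str:
    """Drop utm_* tracking parameters so duplicate URLs collapse to one canonical form."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if not k.startswith("utm_")]
    return urlunparse(parts._replace(query=urlencode(kept)))

article = "http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html"
feed_copy = article + "?utm_source=feedburner&utm_medium=feed"

# Both variants reduce to the same canonical URL
assert strip_tracking(feed_copy) == strip_tracking(article) == article
```

Since both variants share one canonical, the duplicate should eventually drop out of the index without a noindex, which would risk deindexing the canonical page itself.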
-
Disavow tool.
Hello,
One of my sites has a strange link profile: it has 40,000 inbound links, but 30,000 of them are from the site http://ourlipsaresealed.skynetblogs.be/ with the anchor text "haarstijl (2)", which is Dutch for hairstyles. I haven't paid for or even asked for these links, and I don't think it's negative SEO; I think they just set up a template with hundreds of links they thought were useful to their visitors, and they produce several pages a day.
So the question is: do I use the new Google disavow tool? I've held off so far because (a) they link to a competitor who hasn't been anywhere near as affected as we have (although they seem to have dropped somewhat for some reason), and that competitor has a much better link profile overall than mine; and (b) in the video, Matt Cutts says over and over that this tool is for people who have done some dodgy link building in the past, but I haven't.
Thanks, Ian
-
Is Google Webmaster Tools Accurate?
Is Google Webmaster Tools data completely inaccurate, or am I just missing something? I noticed a recent surge in 404 errors, detected 3 days ago (3/6/11), from pages that have not existed since November 2011. They are links to tag and author archives from pages initially indexed in August 2011. We switched to a new site in December 2011 and created 301 redirects from categories that no longer exist to new categories.
I am a little perplexed, since the Google sitemap test shows no 404 errors, and neither does the SEOmoz crawl test, yet under GWT site diagnostics these errors, all 125 of them, just showed up. Any thoughts/insights? We've worked hard to ensure a smooth site migration and now we are concerned.
-Jason