Disavow everything or manually remove bad links?
-
Our site is likely suffering an algorithmic penalty from a high concentration of non-branded anchor text, which I am currently, painstakingly, cleaning up. Incremental clean-ups don't seem to be doing much. Google recommends I 'take a machete to them' and basically remove or disavow as much as possible, which I am now seriously considering as an option.
What do you guys recommend: should we scorch the earth (disavow all links with that anchor text) or keep it on life support (slowly and manually identify each bad link)?
-
Disavow all the suspicious links, but also remove as many as possible. Google wants to see that you actually worked on cleaning up your link profile rather than just disavowing the links.
If you're busy, then outsourcing the link clean-up can be an option. Be prepared to pay $5-20 per link removal, because some site admins charge money for the work.
-
Great, thanks Gyorgy for the advice.
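If you do go the disavow route, the file Google expects is plain text: one URL (or `domain:` entry) per line, with `#` comments allowed. A minimal sketch of generating one from a backlink export, assuming a CSV with `url` and `anchor` columns and hypothetical anchor phrases (adjust both to your actual export):

```python
import csv

# Hypothetical non-branded anchor texts flagged for disavowal
BAD_ANCHORS = {"cheap widgets", "buy widgets online"}

def build_disavow(backlinks_csv, out_path):
    """Write a Google disavow file listing every backlink whose anchor matches a bad phrase."""
    with open(backlinks_csv, newline="") as f, open(out_path, "w") as out:
        out.write("# Links disavowed after manual review\n")
        for row in csv.DictReader(f):
            if row["anchor"].strip().lower() in BAD_ANCHORS:
                out.write(row["url"] + "\n")
```

Review the output by hand before uploading; a disavow file is a blunt instrument, which is the whole point of the machete approach.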
Related Questions
-
301 and 302 for same link
Just trying to find out if this may be the root of a slight traffic dip and also if we should be redirecting differently. We relaunched and initially did 301 redirects to the new site. Then, we decided to change from http to https. Our HTTP status now looks like this when using the MozBar:
HTTP/1.1 301 Moved Permanently – http://site.com/oldurl
HTTP/1.1 302 Found – https://site.com/oldurl
HTTP/1.1 200 OK – https://site.com/new
Should we be changing that 302 to a 301? Are we losing link equity due to this? Thanks.
Technical SEO | MichaelEka
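The chain above can be checked mechanically: any non-301 redirect hop is the weak link. A toy sketch, with the hop data hard-coded to mirror the statuses quoted in the question:

```python
def permanent_chain(hops):
    """True only if every redirect hop is a 301 (the final hop should be the 200)."""
    *redirects, final = hops
    return final[0] == 200 and all(status == 301 for status, _ in redirects)

# The chain described in the question: 301 -> 302 -> 200
chain = [
    (301, "http://site.com/oldurl"),
    (302, "https://site.com/oldurl"),
    (200, "https://site.com/new"),
]
```

Here `permanent_chain(chain)` is False because of the middle 302; changing that hop to a 301 (or better, redirecting the old URL straight to the final https URL in one hop) makes the whole chain permanent.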
How many links should you have on a taxonomy term?
According to SEOmoz, my taxonomy terms contain more than 100 links (links to articles, in my case) and it tells me that I should reduce that. I have seen a video by Matt Cutts, the Google software engineer, in which he said that Google's engine has improved dramatically and 100 is not a hard limit anymore. What do you guys think is the best practice here? To clarify the subject even more: I want to understand this from a link juice perspective. Does it affect how link juice is distributed? Let's say I have 5 taxonomy terms, each with 200 articles, and all 5 terms are listed on the home page of a PR7 website. In this case some of the PR will be passed to these 5 taxonomy terms. However, if I increase the taxonomy terms to 10, then each one links to only 100 articles, but the PR will be distributed across more terms. This means each taxonomy term will have even less PR value. Am I wrong? Any ideas?
Technical SEO | mertsevinc
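Under the crude "link juice" model the question uses (homepage value split evenly across term links, then each term's value split evenly across its article links), the arithmetic is worth spelling out. The numbers below are purely illustrative; real PageRank is far more complicated:

```python
def juice_per_article(homepage_juice, n_terms, articles_per_term):
    """Split the homepage's outgoing value evenly across term pages,
    then split each term page's value evenly across its article links."""
    per_term = homepage_juice / n_terms
    return per_term / articles_per_term

five_terms = juice_per_article(1.0, 5, 200)   # 5 terms x 200 article links
ten_terms = juice_per_article(1.0, 10, 100)   # 10 terms x 100 article links
```

In this simplified model the two layouts deliver the same value per article (0.2/200 and 0.1/100 are both 0.001); what changes is that each individual term page holds less value in the 10-term layout, exactly as the question suspects.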
Removing links from another site
Hello, some site that I have never been able to access, as it is always down, has over 3,000 links to my website. They disappeared the other week and our search queries dramatically improved, but now they are back again in Google Webmaster and we have dropped again. I have contacted the site owner and got no response, and I have also put in a removal form (though I am not sure this is the right fit for that) and asked Google to remove them, as they have been duplicating our content as well. It was in my pending section but has now disappeared. These links are really damaging our search, and the site isn't even there. Do I have to list all 3,000 links in the link removal to Google, or is there another way I can go about telling them about the issue? Appreciate any help on this.
Technical SEO | luwhosjack
Having trouble removing homepage from Google
For various reasons my client wants their homepage removed from Google: not just the content of the page taken down, but the page not indexed at all (yep, strange request, but we are mere service providers). Today I requested in Webmaster Tools that default.asp be removed. WMT says done, but the site's homepage is still listed. The page also has a noindex tag on it, but 24 hours and 18k Googlebot hits later it still remains. Anyone got any other suggestions to deindex just the homepage ASAP, please?
Technical SEO | Grumpy_Carl
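For reference, deindexing a single page rests on a robots meta tag that Googlebot must be able to crawl; blocking the page in robots.txt would hide the tag and keep the stale listing alive. A minimal sketch (file name taken from the question):

```html
<!-- In the <head> of default.asp. The page must NOT be blocked in robots.txt,
     or Googlebot never sees this tag and the URL can stay indexed. -->
<meta name="robots" content="noindex">
```

Given the heavy Googlebot crawl activity described above, the tag should take effect once the page is recrawled and reprocessed; the URL removal tool only hides the result temporarily.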
How not to lose link juice when linking to thousands of PDF guides?
Hi All, I run an e-commerce website with thousands of products. On each product page there is a link to a PDF guide for that product. Currently we link to it with a "nofollow" <a href=""> tag. Should we change it to window.open in order not to lose link juice? Thanks
Technical SEO | BeytzNet
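For context, the two options being compared look roughly like this (the PDF path is hypothetical). Note the trade-off: nofollow keeps a real, crawlable link that works for every user, while a window.open handler removes the crawlable link entirely and breaks for users without JavaScript:

```html
<!-- Current approach: a normal link, flagged not to pass equity -->
<a href="/guides/widget-pro.pdf" rel="nofollow">Product guide (PDF)</a>

<!-- Proposed alternative: no crawlable href at all, JS-only -->
<a href="#" onclick="window.open('/guides/widget-pro.pdf'); return false;">Product guide (PDF)</a>
```

A third option worth knowing about is leaving the links followed and instead keeping the PDFs themselves out of the index (for example via an X-Robots-Tag noindex header on PDF responses), so internal linking stays natural.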
What loss of value would this link experience?
What loss of value would this link experience if the link is actually a link from the referring site that is 301'd to your site, like this example:
www.domain.com/29834?=www.yourdomain.com
My thought is that, simply because you're going through a redirect (in this case a 301), you will lose slight value there. But I'd love to hear your thoughts and reasoning on any other effects, direct or indirect, you think it may have.
Technical SEO | Webfor
Internal Linking
Where is the best information on internal linking? I'm so confused, and everything I read says something different. Ahhhh. Thanks
Technical SEO | meardna77
Which is better? Remove folder or fix the links and wait?
Dear Mozers, recently I added a new language to one of my sites in a new folder (www.site.com/es/), but for some reason many links got broken or simply pointed to the wrong page. This caused poor indexing and it also produced a lot of duplicate content. I know how to fix it, but my question is this: is it better to remove the second-language folder, fix it, and then put it back up after a few months, or to just fix it now as it is and wait for Google to come back and index the new links?
Technical SEO | Silviu