Disavow Issues
-
Hi
We have a client who was hit by Penguin about 18 months ago.
We disavowed all the bad links about 10 months ago; however, this has not resulted in an uplift in traffic or rankings.
The client is asking me whether it would be better to dump the domain and move the website to a fresh domain.
Can you provide thoughts / experience on this please?
Thanks.
-
Just wanted to clarify (for the sake of others who may read this post) that the question was in regards to Penguin and I think in your situation, you're dealing with manual penalties. With Penguin, there is no reconsideration request. You've got to clean up the best you can and then hope that things improve when Google refreshes the Penguin algorithm.
It's still up for debate whether removing links (as opposed to disavowing) is important for Penguin. My current advice is that if a link is easy to remove, do it; otherwise, I disavow. While you're right that it is important to show Google your link removal efforts for a manual penalty, no one is going to look at your work for an algorithmic issue.
I asked John Mueller in a hangout once whether disavowing was as good as removing for Penguin and he said, "essentially yes". However, because potential problems can come up with the disavow tool (such as improper formatting, or Google taking a long time to recrawl the disavowed links), if you can remove a link, that's not a bad thing to do.
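For reference on the formatting point: a disavow file is a plain UTF-8 text file with one URL or one domain: entry per line, and # for comment lines. A minimal example, with hypothetical domains:

```
# Contacted the owner on 01/03/2014 and 15/03/2014, no response
domain:spammy-directory.example.com

# A single bad URL on an otherwise fine site
http://blog.example.net/paid-links-page.html
```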
-
Hi Paul,
I realise it's been a couple of weeks since this was submitted, but I wanted to follow up. At my former agency, we went through a few reconsideration procedures for new clients. We managed to be successful with all of them, but some took quite a long time (August - February being the longest).
We have found that disavowing alone is not nearly enough to make a difference - it is far preferable for the links to be removed. Unlike Claudio below, we saw a success rate far higher than 5%, but it all depends on where the links come from. Sometimes it's hard to even find a live email address for a webmaster, and some people want payment to remove links (worth paying if the fee is not too high). We crafted email templates for the initial removal request and _always_ followed up within two weeks with a second, specifically crafted template if we did not get a response.
It's true that if you cannot remove links, it is still worthwhile demonstrating to Google that you attempted to do so, with email screenshots or at least a list of the sites you contacted. They want to see effort. They want to see that you removed, or attempted to remove, the vast majority of the bad links. It's time-consuming and tedious, but it's worth it if you get the penalty removed.
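To make that evidence easy to produce, it helps to log every outreach attempt as you go. A minimal Python sketch of such a tracker, assuming a hypothetical outreach_log.csv with columns domain, contact_email, first_contacted (YYYY-MM-DD), and removed (yes/no); it flags contacts that are due the two-week follow-up:

```python
import csv
from datetime import datetime, timedelta

FOLLOW_UP_AFTER = timedelta(days=14)

def due_for_follow_up(path="outreach_log.csv"):
    """Return rows where the link is still live and 14+ days have
    passed since the first removal request was sent."""
    due = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["removed"].strip().lower() == "yes":
                continue  # link already taken down, nothing to do
            first = datetime.strptime(row["first_contacted"], "%Y-%m-%d")
            if datetime.now() - first >= FOLLOW_UP_AFTER:
                due.append(row)
    return due

if __name__ == "__main__":
    for row in due_for_follow_up():
        print(f"Follow up with {row['contact_email']} about {row['domain']}")
```

The same CSV doubles as the record of effort you can reference in a reconsideration request.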
As I said, the longest process we went through was over six months, but the site in question had a TERRIBLE backlink profile that was the result of years of abuse by bad link builders. We're talking removing thousands of links. However, it came through - the penalty was removed and the client's rankings are on the rise.
I hope this helps. The short version is: remove remove remove. You won't maintain a penalty if there are no more bad links holding the site back, and those links aren't helping it rank anyway.
If you'd like some advice on how to decide which links to remove and which to keep, please let me know. In the meantime, check out this post from my former colleague Brandon at Ayima. It's a good resource for link analysis.
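For a rough first pass at that decision, one common heuristic is to flag domains whose anchor text is dominated by exact-match commercial keywords - a typical footprint of built links. A minimal sketch, assuming a hypothetical backlinks.csv export with source_domain and anchor_text columns and your own keyword list; anything it flags still needs a human review:

```python
import csv
from collections import defaultdict

# Hypothetical commercial keywords the site was over-optimised for
MONEY_KEYWORDS = {"cheap widgets", "buy widgets online"}

def flag_suspicious_domains(path="backlinks.csv", threshold=0.5):
    """Flag domains where at least `threshold` of the anchor texts are
    exact-match commercial keywords."""
    anchors = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            anchors[row["source_domain"]].append(row["anchor_text"].strip().lower())
    return [domain for domain, texts in anchors.items()
            if sum(t in MONEY_KEYWORDS for t in texts) / len(texts) >= threshold]
```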
Cheers,
Jane
-
Does the site have a good base of truly natural links? There have been very few reported cases of Penguin recovery, but the ones that I have seen recover are ones that had some excellent links left once the bad ones were cleaned up.
-
Did you have a manual penalty? Did you get it revoked? Or did you assume you had a Penguin issue and were proactive about it to avoid a manual penalty?
-
Recovery from a link penalty (manual or algorithmic) - procedure:
1. Collect inbound links from Google Webmaster Tools + Moz Link Explorer + Majestic.
2. Combine all domains in an Excel worksheet.
3. Contact site owners asking for link removal (usually about a 5% success rate, but the effort counts with Google).
4. Wait several weeks for the removal of the links.
5. Compile a disavow file and upload it to Google at https://www.google.com/webmasters/tools/disavow-links-main?pli=1 (see the sketch after this list).
6. Wait 3 to 6 weeks, then start a link building campaign, beginning with a few links per week and increasing if you can (only natural links coming from authority sites related to your niche).
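Steps 1, 2, and 5 can be partly scripted. A minimal Python sketch under assumed conditions - each tool's export saved as a CSV with a url column (the filenames are hypothetical), and the resulting domain list reviewed by hand before anything is disavowed:

```python
import csv
from urllib.parse import urlparse

def collect_domains(csv_paths):
    """Merge link exports (GWT, Moz, Majestic) into one set of root domains."""
    domains = set()
    for path in csv_paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                host = urlparse(row["url"]).netloc.lower()
                domains.add(host[4:] if host.startswith("www.") else host)
    return domains

def write_disavow(bad_domains, path="disavow.txt"):
    """Write one domain: directive per line, the format the tool expects."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Domains contacted for removal; no response received\n")
        for domain in sorted(bad_domains):
            f.write(f"domain:{domain}\n")

if __name__ == "__main__":
    candidates = collect_domains(["gwt.csv", "moz.csv", "majestic.csv"])
    # Review `candidates` manually first - only genuinely bad domains
    # belong in the disavow file.
    write_disavow(candidates)
```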
Recovery from content problems:
1. Look for repetitive titles and descriptions; use Google Webmaster Tools and Moz.
2. Look for pages with similar or identical content and fix them.
3. Look for pages with fewer than 200 words of content and either add content or simply remove them (404) - see the sketch after this list.
4. Add new, fresh, and original content.
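Step 3 lends itself to an automated first pass. A minimal sketch using only the Python standard library, assuming a hypothetical urls.txt with one URL per line and accepting a rough visible-word count:

```python
import re
import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks, self._skip = [], False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    parser = TextExtractor()
    parser.feed(html)
    return len(re.findall(r"\w+", " ".join(parser.chunks)))

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        count = word_count(url)
        if count < 200:
            print(f"Thin page ({count} words): {url}")
```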
Google will consider your effort and your rankings will increase step by step.
I hope it helps.
Claudio
Related Questions
-
Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
We're a SaaS company and have a pretty extensive help center resource on a subdomain (help.domain.com). This has been set up and managed over a few years by someone with no knowledge of SEO, meaning technical things like 404 links, bad redirects and http/https mixes have not been paid attention to. Every page on this subdomain is set to NOT be indexed in search engines, but we do sometimes link to help pages from indexable posts on the main domain. After spending time fixing problems on our main website, our site audits now flag almost solely errors and issues on these non-indexable help center pages every week. So my question is: is it worth my time fixing technical issues on a help center subdomain that has all its pages non-indexable in search engines? I don't manage this section of the site, and so getting fixes done is a laborious process that requires going through someone else - something I'd rather only do if necessary.
Technical SEO | mglover1988
-
Robots.txt in subfolders and hreflang issues
A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have the following hreflang tags across all pages: We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which links to the corresponding sitemap files, but when viewing the robots.txt tester in Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to this URL (https://www.clientname.com/robots.txt) you'll get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions on how we can remove UK listings from Google US and vice versa?
Technical SEO | lauralou82
-
Redirect and ranking issue
Hi there - was wondering whether someone might be able to help. For a period of a day and a half, all the traffic to our website's blog articles was mistakenly being redirected to our homepage. A number of these articles ranked in the top 5 in Google worldwide for their targeted keywords, so this was a considerable amount of organic traffic that was instantly being redirected. It was a strange site glitch and our web team rectified the error, but now all these articles have disappeared from Google rankings (not visible anywhere in the first five pages). I'm presuming this must be linked to the redirect issue - we've been advised to wait and see whether Google restores these rankings, but I'm still concerned as to whether this represents a more serious problem. We have re-indexed the pages we are most concerned about, but I'm not sure whether there is anything else obvious we should think to do. If anyone has any thoughts, I'd be happy to hear them!
Technical SEO | rwat
-
Issues with Duplicates and AJAX-Loader
Hi, On one website, the "real" content is loaded via AJAX when the visitor clicks on a tile (I'll call a page with such tiles a tile-page here). A parameter is added to the URL at that point and the content of that tile is displayed. That content is available via a URL of its own ... which is actually never called. What I want to achieve is a canonicalised tile-page that gets all of the tiles' content and is indexed by Google - if possible with Google also recognising that the individual tile URLs are only fallback solutions and that the tile-page should be displayed instead. The current tile-page leads to duplicate meta tags, titles, etc., and minimal differences between what Google considers separate pages (i.e. the same page with different tiles' contents). Does anybody have an idea of what one can do here?
Technical SEO | netzkern_AG
-
Disavow links and domain of SPAM links
Hi, I have a big problem. For the past month, my company website has been scraped by hackers. This is how they do it: 1. Hack unmonitored sites and/or sites still running old versions of WordPress or other out-of-the-box CMSs. 2. Create spam pages with links to my pages, plus plant trojan horses and scripts to automatically grab resources from my server. Some sites were directly uploaded with pages from my site. 3. Create pages with titles, keywords, and descriptions consisting of my company brand name. 4. Use the http-referrer to redirect Google search results to competitor sites. What I have done so far: 1. Blocked the identified sites' IPs in my WAF. This prevented those hacked sites from grabbing resources from my site via scripts. 2. Reached out to webmasters and hosting companies to remove the affected sites. Currently this is not very effective, as many of the sites have no webmaster. Only a few hosting companies respond promptly; some don't even reply after a week. The problem now is: by the time I realised what was happening, there were already hundreds if not thousands of sites being used by the hacker, and literally tens of thousands of hacked or scripted pages carrying my company brand title, keywords, and description have already been crawled and indexed by Google. Routinely, every day, I am removing and disavowing, but there are just so many of them indexed by Google now. Questions: 1. What is the best way forward for me to resolve this? 2. Disavow links and domains: does disavowing a domain mean all the links from that domain are disavowed? 3. Can anyone recommend an SEO company which has dealt with such an issue before and successfully rectified it? Note: SEAGM is a company branded keyword.
Technical SEO | ahming777
-
How to add specific Tumblr blogs into a disavow file?
Hi guys, I am about to send a reconsideration letter and am still finalizing my disavow file. The format of the disavow file is domain:badlink.com (stripping the URL down to the root domain), but what about those toxic links that are located on Tumblr, such as badlink.tumblr.com? The issue is that we also have good Tumblr links, so I don't want to just add tumblr.com. Do you guys think I will have issues submitting badlink.tumblr.com and not tumblr.com? Thank you!
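For what it's worth, domain: directives do accept subdomains, so an entry scoped to the bad blog leaves links from other tumblr.com blogs untouched:

```
# Disavows only this Tumblr blog, not tumblr.com as a whole
domain:badlink.tumblr.com
```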
Technical SEO | Ideas-Money-Art
-
Robots.txt issue - site resubmission needed?
We recently had an issue when a load of new files were transferred from our dev server to the live site, which unfortunately included the dev site's robots.txt file which had a disallow:/ instruction. Bad! Luckily I spotted it quickly and the file has been replaced. The extent of the damage seems to be that some descriptions aren't displaying and we're getting a message about robots.txt in the SERPs for a few keywords. I've done a site: search and generally it seems to be OK for 99% of our pages. Our positions don't seem to be affected right now but obviously it's not great for the CTRs on those keywords affected. My question is whether there is anything I can do to bring the updated robots.txt file to Google's attention? Or should we just wait and sit it out? Thanks in advance for your answers!
Technical SEO | GBC
-
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and they don't, as I expected. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The info is that if they keep getting 404 returns, the URLs will automatically get removed. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months?!! Thanks.
Technical SEO | RiceMedia