Disavow Issues
-
Hi
We have a client who was hit by Penguin about 18 months ago.
We disavowed all the bad links about 10 months ago; however, this has not resulted in an uplift in traffic or rankings.
The client is asking me whether it would be better to dump the domain and move the website to a fresh domain.
Can you provide thoughts / experience on this please?
Thanks.
-
Just wanted to clarify (for the sake of others who may read this post) that the question was in regards to Penguin and I think in your situation, you're dealing with manual penalties. With Penguin, there is no reconsideration request. You've got to clean up the best you can and then hope that things improve when Google refreshes the Penguin algorithm.
It's still up for debate whether removing links (as opposed to disavowing) is important for Penguin. My current advice is that if a link is easy to remove then do it. But, otherwise I disavow. While you're right that it is important to show Google your efforts in regards to link removal for a manual penalty, no one is going to look at your work for an algorithmic issue.
I asked John Mueller in a hangout once whether disavowing was as good as removing for Penguin and he said, "essentially yes". However, because potential problems can come up with the disavow tool (such as improper formatting, or Google taking a long time to recrawl the links before the disavow takes effect), if you can remove a link, that's not a bad thing to do.
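On the formatting point: the disavow file Google expects is a plain UTF-8 text file with one entry per line, where `#` starts a comment, a `domain:` prefix disavows an entire domain, and a bare URL disavows just that page. A minimal sketch (the domains below are placeholders):

```text
# Outreach to these webmasters failed; disavowing the whole domain
domain:spamdirectory.example
domain:blogfarm.example

# Individual URLs can also be listed on their own lines
http://widget-links.example/old-post.html
```

Getting this format wrong (e.g. extra columns pasted from a spreadsheet) is one of the easiest ways to have the file silently rejected.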
-
Hi Paul,
I realise it's been a couple of weeks since this was submitted, but I wanted to follow up. At my former agency, we went through a few reconsideration procedures for new clients. We managed to be successful with all of them, but some took quite a long time (August - February being the longest).
We have found that disavowing alone is not nearly enough to make a difference - it is far preferable for the links to be removed. Unlike Claudio below, we have had a success rate far higher than 5%, but it all depends on where the links come from. Sometimes it's hard to even find a live email address for the webmaster, and some people want payment to remove links (worth doing if the payment is not too high). We crafted email templates for removal requests and _always_ followed up with a second, specifically crafted template within two weeks if we did not get a response.
It's true that if you cannot remove links, it is still worthwhile demonstrating to Google that you attempted to do so, with email screenshots or at least a list of the sites you contacted. They want to see effort. They want to see that you removed, or attempted to remove, the vast majority of the bad links. It's time consuming and tedious, but it's worth it if you get the penalty removed.
As I said, the longest process we went through was over six months, but the site in question had a TERRIBLE backlink profile that was the result of years of abuse by bad link builders. We're talking removing thousands of links. However, it came through - the penalty was removed and the client's rankings are on the rise.
I hope this helps. The short version is: remove, remove, remove. A penalty won't stick if there are no more bad links holding the site back, and those links weren't helping it rank anyway.
If you'd like some advice on how to decide which links to remove and which to keep, please let me know. In the meantime, check out this post from my former colleague Brandon at Ayima. It's a good resource for link analysis.
Cheers,
Jane
-
Does the site have a good base of truly natural links? There have been very few reported cases of Penguin recovery. But, the ones that I have seen recover are ones that have had some excellent links left once the bad ones were cleaned up.
-
Did you have a manual penalty? Did you get it revoked? Or did you assume you had a Penguin issue and were proactive about it to avoid a manual penalty?
-
Recovery procedure for a link penalty (manual or algorithmic):
1. Collect inbound links from Google Webmaster Tools + Moz Link Explorer + Majestic.
2. Consolidate all the domains in an Excel worksheet.
3. Contact site owners asking for link removal (usually about a 5% success rate, but the effort counts with Google).
4. Wait several weeks for the removal of the links.
5. Compile a disavow file and upload it to Google: https://www.google.com/webmasters/tools/disavow-links-main?pli=1
6. Wait 3 to 6 weeks, then start a link building campaign: begin with a few links per week and increase if you can (only natural links coming from authoritative sites related to your niche).
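Steps 1, 2, and 5 above can be roughed out in a short script. This is only a hypothetical sketch (the tool names and sample URLs are placeholders), assuming you have already flattened each tool's export into a list of linking URLs:

```python
from urllib.parse import urlparse

# Placeholder exports from each tool, flattened to lists of linking URLs
gwt_links = ["http://spamdir1.example/page1", "http://blogfarm.example/post"]
moz_links = ["http://blogfarm.example/other-post", "http://widget-spam.example/"]
majestic_links = ["http://spamdir1.example/page2"]

def to_domains(urls):
    """Reduce a list of URLs to the set of linking domains."""
    return {urlparse(u).netloc for u in urls}

# Merge and deduplicate across all three sources (step 2)
bad_domains = sorted(
    to_domains(gwt_links) | to_domains(moz_links) | to_domains(majestic_links)
)

# Emit a disavow file disavowing whole domains (step 5)
lines = ["# Disavow file generated from merged link exports"]
lines += [f"domain:{d}" for d in bad_domains]
print("\n".join(lines))
```

In practice you would first strip out the domains where removal outreach succeeded (step 3) before writing the file.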
Recovery from content problems:
1. Look for repetitive titles and descriptions using Google Webmaster Tools and Moz.
2. Look for pages with similar or identical content and fix it.
3. Look for pages with fewer than 200 words of content and either add content or simply remove them (return a 404).
4. Add fresh, original content.
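Steps 1 and 3 of the content checks can be run against any crawl export. A minimal sketch, assuming hypothetical (url, title, word count) rows from your crawler:

```python
from collections import Counter

# Placeholder crawl export: (url, title, word_count) per page
pages = [
    ("/red-widgets", "Widgets | Example Shop", 450),
    ("/blue-widgets", "Widgets | Example Shop", 90),
    ("/about", "About Us | Example Shop", 600),
]

# Step 1: flag pages sharing a title with at least one other page
title_counts = Counter(title for _, title, _ in pages)
duplicate_titles = [url for url, title, _ in pages if title_counts[title] > 1]

# Step 3: flag thin pages under the 200-word threshold
thin_pages = [url for url, _, words in pages if words < 200]

print("Duplicate titles:", duplicate_titles)
print("Thin pages (<200 words):", thin_pages)
```

The same pattern extends to meta descriptions; the 200-word cutoff is just the rule of thumb from step 3, not a hard limit.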
Google will take your effort into account, and your rankings should improve step by step.
I hope this helps.
Claudio