Disavow Issues
-
Hi
We have a client who was hit by Penguin about 18 months ago.
We disavowed all the bad links about 10 months ago; however, this has not resulted in any uplift in traffic or rankings.
The client is asking me whether it would be better to dump the domain and move the website to a fresh domain.
Can you share your thoughts or experience on this, please?
Thanks.
-
Just wanted to clarify (for the sake of others who may read this post) that the question was in regard to Penguin; in your situation, I think you're dealing with manual penalties. With Penguin, there is no reconsideration request. You've got to clean up as best you can and then hope that things improve when Google refreshes the Penguin algorithm.
It's still up for debate whether removing links (as opposed to disavowing them) matters for Penguin. My current advice is that if a link is easy to remove, remove it; otherwise, I disavow. While you're right that it is important to show Google your link removal efforts for a manual penalty, no one is going to look at your work for an algorithmic issue.
I asked John Mueller in a hangout once whether disavowing was as good as removing for Penguin, and he said, "essentially yes". However, because problems can come up with the disavow tool (such as improper formatting, or Google taking a long time to recrawl the links and apply the disavow), removing a link where you can is not a bad thing to do.
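To illustrate the formatting point: the disavow file is a plain text file with one directive per line, either a full URL or a domain: entry, and # marks a comment line. A minimal example (the domains below are placeholders):

```
# Contacted the owner of spamdomain1.com asking for link removal; no response
domain:spamdomain1.com

# Disavow a single page rather than the whole domain
http://spamdomain2.com/page-with-bad-links.html
```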
-
Hi Paul,
I realise it's been a couple of weeks since this was submitted, but I wanted to follow up. At my former agency, we went through a few reconsideration processes for new clients. All of them were ultimately successful, but some took quite a long time (the longest ran from August to February).
We have found that disavowing alone is not nearly enough to make a difference; it is far preferable for the links to be removed. Unlike Claudio below, we have had a success rate far higher than 5%, but it all depends on where the links come from. Sometimes it's hard even to find a live email address for a webmaster, and some people want payment to remove links (worth paying if the price is not too high). We crafted email templates, and if we did not get a response to our first removal request, we always followed up within two weeks using a second, specifically crafted template.
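If you want to systematise that two-week follow-up rule, a minimal sketch of a tracker might look like this (the CSV file name and column layout are just assumptions for illustration, not something we actually used):

```python
import csv
from datetime import datetime, timedelta

FOLLOW_UP_AFTER = timedelta(days=14)

def needs_follow_up(outreach_csv="outreach_log.csv"):
    """Flag link removal requests with no response after two weeks.

    Assumed columns: domain, contact_email, first_emailed (YYYY-MM-DD),
    responded (yes/no), followed_up (yes/no).
    """
    due = []
    with open(outreach_csv, newline="") as f:
        for row in csv.DictReader(f):
            first_emailed = datetime.strptime(row["first_emailed"], "%Y-%m-%d")
            if (row["responded"].lower() == "no"
                    and row["followed_up"].lower() == "no"
                    and datetime.now() - first_emailed >= FOLLOW_UP_AFTER):
                due.append((row["domain"], row["contact_email"]))
    return due

if __name__ == "__main__":
    for domain, email in needs_follow_up():
        print(f"Send follow-up template to {email} for {domain}")
```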
It's true that if you cannot remove links, it is still worthwhile demonstrating to Google that you attempted to do so, with email screenshots or at least a list of the sites you contacted. They want to see effort. They want to see that you removed, or attempted to remove, the vast majority of the bad links. It's time-consuming and tedious, but it's worth it if you get the penalty removed.
As I said, the longest process we went through was over six months, but the site in question had a TERRIBLE backlink profile that was the result of years of abuse by bad link builders. We're talking removing thousands of links. However, it came through - the penalty was removed and the client's rankings are on the rise.
I hope this helps. The short version is: remove, remove, remove. A penalty won't stick if there are no more bad links holding the site back, and those links weren't helping it rank anyway.
If you'd like some advice on how to decide which links to remove and which to keep, please let me know. In the meantime, check out this post from my former colleague Brandon at Ayima. It's a good resource for link analysis.
Cheers,
Jane
-
Does the site have a good base of truly natural links? There have been very few reported cases of Penguin recovery, but the ones that I have seen recover are sites that still had some excellent links left once the bad ones were cleaned up.
-
Did you have a manual penalty? Did you get it revoked? Or did you assume you had a Penguin issue and were proactive about it to avoid a manual penalty?
-
Recovery procedure for a link penalty (manual or algorithmic):
1. Collect inbound links from Google Webmaster Tools, Moz Link Explorer, and Majestic.
2. Consolidate all the domains in an Excel worksheet.
3. Contact site owners asking for link removal (usually about a 5% success rate, but the effort counts with Google).
4. Wait several weeks for the removal of the links.
5. Build a disavow file and upload it to Google at https://www.google.com/webmasters/tools/disavow-links-main?pli=1 (a script like the sketch after this list can generate the file from your link exports).
6. Wait three to six weeks, then start a link building campaign: begin with a few links per week and increase the pace if you can (only natural links coming from authority sites related to your niche).
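As a hypothetical illustration of steps 1, 2, and 5, the sketch below merges link exports from several tools, dedupes them down to unique domains, and writes a domain-level disavow file. The export file names and URL column names are assumptions; real exports differ from tool to tool, and you should review the list manually before disavowing anything:

```python
import csv
from urllib.parse import urlparse

# Hypothetical export files from each tool; real column names vary,
# so adjust the URL column per file.
EXPORTS = [
    ("gwt_links.csv", "URL"),             # Google Webmaster Tools export
    ("moz_links.csv", "Source URL"),      # Moz export
    ("majestic_links.csv", "SourceURL"),  # Majestic export
]

def collect_linking_domains():
    """Merge all link exports and reduce them to a unique set of domains."""
    domains = set()
    for filename, url_column in EXPORTS:
        with open(filename, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                host = urlparse(row[url_column]).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]
                if host:
                    domains.add(host)
    return domains

def write_disavow(bad_domains, path="disavow.txt"):
    """Write a domain-level disavow file in Google's expected format."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Domains contacted for removal; no response received\n")
        for domain in sorted(bad_domains):
            f.write(f"domain:{domain}\n")

if __name__ == "__main__":
    all_domains = collect_linking_domains()
    # In practice, review this list by hand and only keep the
    # domains you judge to be spammy before uploading.
    write_disavow(all_domains)
```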
Recovery procedure for content problems:
1. Look for repetitive titles and descriptions using Google Webmaster Tools and Moz.
2. Look for pages with similar or identical content and fix them.
3. Look for pages with fewer than 200 words of content and either add content or simply remove them (404); see the sketch after this list for one way to find them.
4. Add fresh, original content.
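As a hypothetical illustration of step 3, this sketch fetches a list of URLs and flags pages whose visible text falls under the 200-word threshold. It assumes the requests and beautifulsoup4 packages and a plain urls.txt file listing the URLs to check:

```python
import requests
from bs4 import BeautifulSoup

WORD_THRESHOLD = 200

def visible_word_count(html):
    """Count words in the page's visible text, ignoring scripts and styles."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

def find_thin_pages(url_file="urls.txt"):
    """Return (url, word_count) pairs for pages below the threshold."""
    thin = []
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        words = visible_word_count(response.text)
        if words < WORD_THRESHOLD:
            thin.append((url, words))
    return thin

if __name__ == "__main__":
    for url, words in find_thin_pages():
        print(f"{words:>4} words  {url}")
```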
Google will take your effort into account, and your rankings should improve step by step.
I hope this helps.
Claudio