Help, really struggling with fixing mistakes post-Penguin
-
We had previously implemented a strategy of paying for lots of links and focusing on 3 or 4 keywords as our anchors, which used to REALLY work (I know, I know, bad black hat strategy - I have since learned my lesson). These keywords and others have since plummeted up to 100 spots since Panda 3.3 and Penguin. So I'm trying to go in and fix all our mistakes, because our domain is too valuable to us to just start over from scratch.
Yesterday I literally printed a 75-page document of all of our links according to Open Site Explorer. I have been going in and manually changing anchor text wherever I can, and taking down the most egregious links where possible. This has involved calling and emailing webmasters, digging up old accounts and passwords, and otherwise just trying to diversify our anchor text and remove bad links. I've also gone into our site and edited some internal links (also too heavily weighted toward certain keywords) and removed other links entirely.
My rankings have gone DOWN more today. A lot. WTF does Google want? Is there something I'm doing wrong? Should we be deleting links from all private networks entirely, or just trying to vary the anchor text? Any advice greatly appreciated. Thanks!
-
I would go through your list and remove the links, and not try to vary anchor text at this point. I've been hit as well and moved to a domain I have held for years, but I am slowly removing bad links that are on networks or painfully outside my niche. I would suggest naturally building links slowly with partial-match anchor text, with the majority of the links having your brand as the anchor text.
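If you have the Open Site Explorer data as a CSV export rather than a printout, a short script can quantify how skewed the anchor text distribution is before and after your cleanup. A minimal Python sketch, assuming the export has an "Anchor Text" column (the exact column name and filename in your export may differ):

import csv
from collections import Counter

# Tally every anchor text in the link export
anchors = Counter()
with open("ose_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["Anchor Text"].strip().lower()] += 1

# Print the 20 most common anchors with their share of the profile
total = sum(anchors.values())
for anchor, count in anchors.most_common(20):
    print(f"{anchor!r}: {count} links ({count / total:.1%})")

If three or four money keywords dominate the output, those are the links worth chasing first.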
-
Hi LilyRay,
Regarding your Penguin penalization, I would treat it like any other pre-Penguin link-based penalty. I have worked with many sites that have been penalized for manipulative linking, and the process to get the penalty lifted is always the same:
- REMOVE as many of the manipulative links as you can. It's the link itself that Google has classified as manipulative; the anchor text was just the identifier that helped them find it. Changing the anchor text of a manipulative link and leaving it up will keep the penalty associated with that link in place.
- DOCUMENT all of the steps that you're taking to eliminate manipulative links. Make a neat, bulleted list with the link(s), network(s), actions taken by you, and the results (a sample entry follows this list). In some cases you won't be able to remove a link; that's understandable, as they're not in your control.
- While you're at it, clean up ANYTHING else on your site that could be perceived as on-page spam. You're trying to prove to Google that you are a good citizen of the web, so make your site as sparkly as you can.
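To make that concrete, a hypothetical entry in the documentation (every detail below is invented for illustration) might look like:

- Link: http://widget-blog.example.com/top-widgets/ (anchor: "cheap blue widgets")
- Network: paid blog network XYZ; account closed
- Action taken: emailed the webmaster twice requesting removal
- Result: link removed as of 5/21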
Once you've completed these steps, submit all of your documented work as part of your reconsideration request, to show Google that you're operating in good faith. Under normal circumstances, wait times for reconsideration requests can be anywhere from a week to a month. With the flood of reconsideration requests Google is getting right now, I'd expect a longer wait.
I'm sure this process sounds painful, and it is, but it's the only approach to recovering from a penalty that I've seen work.
-
It was partially out of my control - pressure from higher-ups for instantaneous results. I've always supported and wanted to stick to white hat SEO.
-
And promise yourself never to go for the quick and easy again.
-
Since the rollout of Penguin and Panda 3.6, Google has released a further 52-pack of updates, and you may have been stung by one of those almost immediately after the first hit.
SEOmoz provides an up-to-date change history of algorithm updates as soon as they are released.
Any backlinks you have that are associated with blog rings/networks - I would delete as many as you can. If a network has been identified and blacklisted by Google, they'll be rolling out penalties for any domains that have used it. In parallel, build some natural links to balance out your link profile as soon as you can, too.
Related Questions
-
How to Implement AMP for Single Blog Post?
Hello Moz Team, I would like to implement AMP for a single blog post, not the whole blog. Is it possible? If yes, then how? Note - I am already using GTM for my website abcd.com, but I would like to use AMP for my blog posts only, and my blog lives at abcd.com/blog. Let me clarify: "blog post" means a URL like abcd.com/blog/my-favorite-dress. Thanks!
Intermediate & Advanced SEO | Johny12345
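For what it's worth, AMP is opt-in per URL, so a single post can be AMP'd without touching the rest of the blog: publish an AMP version of that one post and cross-link the two documents. A minimal sketch of the two link tags involved, with the AMP URL path invented for illustration:

<!-- on the regular post, advertise its AMP version (AMP path is hypothetical) -->
<link rel="amphtml" href="https://abcd.com/blog/my-favorite-dress/amp/">
<!-- on the AMP page, point back to the canonical post -->
<link rel="canonical" href="https://abcd.com/blog/my-favorite-dress">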
Disavow links of my own in niche forums that I post to regularly?
Hi y'all, I'm disavowing a new set of links and have come across a wall. Let's say your niche is web hosting and you post to forums such as webhostingtalk.com (a forum very popular in the hosting business). If your purpose is mostly selling your business, and you have links (not keyword anchor text) that direct users to specific products and such... do you disavow those links? I'm not leaving links like "web hosting" or "free hosting" - I'm posting deals and answering questions on other posts that link to my site with traditional links. Thank you
Intermediate & Advanced SEO | Shawn124
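As background for anyone weighing a disavow, the file Google expects is plain text with one entry per line; a minimal sketch, with the domains invented for illustration:

# everything from this domain (the domain: prefix disavows the whole site)
domain:spammy-network.example.com
# a single specific URL
http://some-blog.example.com/widget-post/comment-page-3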
Help with Robots.txt On a Shared Root
Hi, I posted a similar question last week asking about subdomains, but a couple of complications have arisen. Two different websites I am looking after share the same root domain, which means they will have to share the same robots.txt. Does anybody have suggestions for separating the two in the same file without complications? It's a tricky one. Thank you in advance.
Intermediate & Advanced SEO | Whittie
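Since robots.txt rules match against URL paths, one approach - assuming the two sites hang off different directory prefixes on the shared root (the prefixes below are invented) - is to scope each site's rules by its path in the single file:

User-agent: *
# rules scoped to the first site, assumed to live under /site-a/
Disallow: /site-a/private/
# rules scoped to the second site, assumed to live under /site-b/
Disallow: /site-b/checkout/

If the two sites share the root with no distinguishing path prefix, a single robots.txt can't tell them apart, because the protocol has no notion of which "site" a path belongs to.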
Robots.txt help
Hi, We have a blog that is killing our SEO. We need to disallow the following:
Disallow: /Blog/?tag*
Disallow: /Blog/?page*
Disallow: /Blog/category/*
Disallow: /Blog/author/*
Disallow: /Blog/archive/*
Disallow: /Blog/Account/
Disallow: /Blog/search*
Disallow: /Blog/search.aspx
Disallow: /Blog/error404.aspx
Disallow: /Blog/archive*
Disallow: /Blog/archive.aspx
Disallow: /Blog/sitemap.axd
Disallow: /Blog/post.aspx
But we want to allow everything below /Blog/Post. The disallow list seems to keep growing as we find issues, so rather than adding every problem area to our robots.txt, is there a way to easily just say "allow /Blog/Post" and ignore the rest? How do we do that in robots.txt? Thanks
Intermediate & Advanced SEO | Studio33
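There is a shorter way, with caveats: Allow is an extension honored by the major engines (Google, Bing) rather than part of the original robots.txt standard, and Google resolves conflicts in favor of the most specific (longest) matching rule. A minimal sketch, assuming posts really do all live under /Blog/Post/ and you're happy to block everything else in /Blog/:

User-agent: *
# block the whole blog section...
Disallow: /Blog/
# ...except posts, which are assumed to live under /Blog/Post/
# (for Googlebot, the longest matching rule wins)
Allow: /Blog/Post/

Note this would also block the /Blog/ index page itself, so test the file with the robots.txt checker in Google Webmaster Tools before relying on it.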
Status Code: 404 Errors. How to fix them.
Hi, I have a question about the "4xx Status Code" errors appearing in the Analysis Tool provided by SEOmoz. They are flagged as the worst errors for your site and must be fixed. I get this message from the good people at SEOmoz: "4xx status codes are shown when the client requests a page that cannot be accessed. This is usually the result of a bad or broken link." OK, my question is the following: how do I fix them? Those pages are shown as "404" pages on my site... isn't that enough? How can I fix the "4xx status code" errors indicated by SEOmoz? Thank you very much for your help. Sal
Intermediate & Advanced SEO | salvyy
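The fix is usually twofold: repair or remove the internal links that point at the dead URLs, and 301-redirect any URL that no longer exists to its closest live equivalent so the 4xx stops being served. A minimal Apache .htaccess sketch, with both paths invented for illustration:

# point a removed page at its closest live replacement
Redirect 301 /old-page.html /new-page/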
Do comment links on blogs help the blog itself rank?
Hi, I have a blog - Carzilla.co.uk - and it keeps getting what are pretty obviously spam comments, with links to unconnected websites of varying quality. The blog is quite new and not ranking highly in SERPs for anything in particular yet. So my question is: is it better to let some of these comments through so Google can see activity on the site? Or do spammy comments with links make the site look like a link farm? Any advice on what my policy should be - purely from a Google SERPs perspective - would be great.
Intermediate & Advanced SEO | usedcarexpert
301 redirect help
Hey guys, I normally work in WordPress and just use a 301 redirect plugin. I bought a site and, rather than maintain two similar ones, have decided to redirect one to the other. I am having trouble with the .htaccess file. Here is an example - these are two redirects:
redirect 301 /category/models/next/2
redirect 301 /category/models
I want both of these URLs to redirect to the same URL on the new site. However, /category/models is the only one working; it redirects to the new page just fine. /category/models/next/2 redirects to nearly the same URL on the new site, only it adds /next/2 to the end, and that brings up a 404. Why is it adding /next/2 to the new URL? How can I fix this? There are several redirects doing this. Help appreciated!
Intermediate & Advanced SEO | DanDeceuster
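That appending behavior is how mod_alias works: Redirect does prefix matching and tacks whatever follows the matched prefix onto the target URL, so the broader /category/models rule catches /category/models/next/2 and appends /next/2 to the destination. One fix is an anchored RedirectMatch; a minimal sketch, with the target URL invented for illustration:

# match /category/models plus anything beneath it, and discard the remainder
RedirectMatch 301 ^/category/models(/.*)?$ http://www.example-newsite.com/models/

Alternatively, keep plain Redirect directives but list the more specific path first with its own explicit target, since mod_alias applies them in the order they appear.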
Home page deindexed by Google. How do I determine why, and how do I fix it?
On Wednesday I noticed our domain was no longer ranking for our keyword or our product Isolator Fitness, http://isolatorfitness.com. I have been researching and not finding answers to why it happened or what to do to fix it. We have about 800 other pages still listed. I am new to all this SEO stuff; can anyone guide me in the right direction?
History: about 10 days ago I went to Google Webmaster Tools and noticed a large number of errors because robots could not crawl our site. I looked at the site and found that the privacy setting in WordPress was set to block robots. I changed it to allow crawling and had Google re-crawl the site; it looked like Google had not been able to crawl the site for about 3 months. On Monday I did a 301 redirect from one of our other sites, for another product we sell, to http://isolatorfitness.com/6-pack-bags. That site had a good bit of backlinks. Would doing all this at one time cause this? How do I determine if we did anything wrong? Thanks
Intermediate & Advanced SEO | David75
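One quick diagnostic for a deindexed home page is checking whether a noindex directive is still being served - the WordPress privacy setting mentioned above injects a robots meta tag, and a noindex can also arrive via the X-Robots-Tag HTTP header. A rough Python sketch of that check (the substring test is crude but catches the common cases):

import urllib.request

# URL taken from the question above
url = "http://isolatorfitness.com/"
with urllib.request.urlopen(url) as resp:
    # a noindex can arrive as an HTTP response header...
    header = resp.headers.get("X-Robots-Tag", "")
    body = resp.read().decode("utf-8", errors="replace")

# ...or inside a robots meta tag in the HTML
if "noindex" in header.lower() or "noindex" in body.lower():
    print("noindex found - the page is asking search engines to drop it")
else:
    print("no noindex seen; check robots.txt and Webmaster Tools next")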