Penguin 2.1 Penalty - Can't understand why I was hit by it?
-
Hi,
I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong. I want to know the root cause of the penalty so that I can recover from it.
Any help would be appreciated.
Website: http://tiny.cc/hfom4w
-
Where do you find the link profile?
-
Whilst over-optimisation is an issue, it is more about the quality of the websites you are getting links from than the anchor text used. Even a branded anchor link will get you penalised if it sits on a poor-quality website. If you haven't disavowed any links yet, I'd expect that around 90% of them need to be disavowed.
You've also got duplicate content on your website, so I'd say you've been hit by a Panda refresh as well as Penguin. Search Google for "To assist clients in getting value from their businesses, we provide Business Critical Software and IT services within various industry verticals." and you will see four different versions of your content.
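If you want to check for that kind of duplication programmatically rather than one Google search at a time, here is a minimal Python sketch; the snippet is the sentence quoted above, but the URL list is hypothetical and you would substitute the pages you suspect of carrying the same copy:

import requests

# The duplicated sentence from the site, plus hypothetical pages to test.
SNIPPET = ("To assist clients in getting value from their businesses, "
           "we provide Business Critical Software and IT services "
           "within various industry verticals.")
URLS = [
    "http://example.com/",
    "http://example-copy-1.com/",
    "http://example-copy-2.com/",
]

for url in URLS:
    try:
        body = requests.get(url, timeout=10).text
    except requests.RequestException as exc:
        print(f"{url}: fetch failed ({exc})")
        continue
    found = SNIPPET.lower() in body.lower()
    print(f"{url}: {'contains the duplicated copy' if found else 'clean'}")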
You need to do a full backlink analysis ASAP and be ruthless about which links you get rid of: if you wouldn't show the link to Google, then it's bad. Use the disavow tool in Webmaster Tools, then file a reconsideration request with Google detailing what you have done and why there were so many bad links.
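For reference, the disavow file Google accepts is plain UTF-8 text with one entry per line: full URLs for individual links, a "domain:" prefix for whole domains, and "#" lines treated as comments. A minimal sketch with hypothetical entries:

# Contacted the owner of this directory twice, no reply,
# so disavowing the whole domain.
domain:spammy-directory.example

# One paid link on an otherwise decent site: disavow just the URL.
http://blog.example/post-with-paid-link

Keeping the comments up to date also gives you a ready-made record for the reconsideration request.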
It's a long road, but if you want the website to come back into the SERPs then this is what you are going to have to do.
Good luck!
-
Hi Yiannis and Remus
Thank you for the reply. There are many competitor websites, like http://tiny[dot]cc/9tpm4w and http://tiny[dot]cc/dvpm4w, that are just targeting "Software Development". Why aren't they getting penalized?
-
Hello Chanpreet,
Like Yiannis says, it's probably related to "over optimized anchor text". To get more information, you could compare your anchor text profile with that of a competitor that still ranks well in the SERPs.
-
Hello,
I had a quick look at your link profile and it seems that you had a big spike in link building activity from 17 July to 24 August (mostly from directory submissions). Then it goes quiet again, which looks unnatural. You have used contextual anchor text around the "software development" keyword, which covers pretty much 80% of your profile, making your website vulnerable to Penguin; my guess is that you got hit by the latest refresh/update.
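If you want to measure that concentration yourself, here is a minimal sketch, assuming your backlink tool can export a CSV with an anchor_text column (the file name and column name are assumptions, so adjust them to your export):

import csv
from collections import Counter

# Hypothetical export from your backlink tool of choice.
counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        counts[row["anchor_text"].strip().lower()] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    print(f"{n / total:6.1%}  {anchor}")

Run it on your own export and on a competitor's, as Remus suggests above, and the difference in distribution is usually obvious.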
I would suggest you make your anchor text look more natural and deal with all those directory links you picked up between July and August by using the disavow tool. Then monitor whether you move up in the SERPs and report back.
Remember that you might not see immediate results; sometimes the algorithm needs to refresh before you see any noticeable changes.
Related Questions
-
My site is on page 2
My site is on page 2. How can I rank with these keywords: "legal translation in Dubai"?
White Hat / Black Hat SEO
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly Bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance.

The problem is that (1) I want a centrally managed solution for all sites (per-site administration takes too much time), which (2) takes into account total server load instead of only one site's traffic, and (3) controls overall bot traffic instead of controlling traffic for one bot. In my opinion, user traffic should always be prioritized higher than bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve (2) and (3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.

Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about the experts' opinions...
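For what it's worth, a minimal sketch of that idea as Python/Flask middleware; the load threshold, bot list, rejection formula, and Retry-After value are all assumptions you would tune, and os.getloadavg() is Unix-only:

import os
import random

from flask import Flask, request

app = Flask(__name__)

# Hypothetical tuning knobs.
LOAD_THRESHOLD = 4.0  # 1-minute load average above which bots get throttled
BOT_MARKERS = ("bingbot", "ahrefsbot", "googlebot")

@app.before_request
def throttle_bots():
    """Serve 503s to a load-dependent share of bot requests."""
    ua = (request.headers.get("User-Agent") or "").lower()
    if not any(marker in ua for marker in BOT_MARKERS):
        return None  # user traffic always gets through

    load, _, _ = os.getloadavg()  # whole-server load, not per-site
    if load <= LOAD_THRESHOLD:
        return None

    # The further past the threshold, the larger the rejected share.
    reject_share = min(1.0, (load - LOAD_THRESHOLD) / LOAD_THRESHOLD)
    if random.random() < reject_share:
        return ("Service temporarily unavailable", 503,
                {"Retry-After": "600"})  # ask the bot to come back later
    return None

Sending a Retry-After header is the conventional way to signal that the 503 is temporary rather than a sign the site is broken.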
White Hat / Black Hat SEO
-
Subtle On-site Factors That Could Cause a Penalty
It looks like we have the same penalties on more than one ecommerce site. What subtle on-site factors can contribute to a non-manual penalty, specifically rankings slowly going down for all short-tail keywords? And what does it take to pull yourself out of these penalties?
White Hat / Black Hat SEO
-
Are links from blog comments using keyword anchor text a Penguin 2.0 issue?
Hello, I am continuing a complete clean-up of a client's link profile and would like to know whether Penguin targets links from blog comments where the user includes keywords as anchor text. So far I have been attempting to get them removed before I go for a disavow. An example would be the "work clothing" comment at the bottom of: http://www.fashionstyleyou.co.uk/beat-the-caffeine-rush.html/comment-page-1

I am also questioning whether we should keep any link directories. So far I have been ruthless, but I worry I will be losing a hell of a lot of links. For example, I have kept the following: http://www.business-directory-uk.co.uk//clothing.htm

Your comments are welcomed!
White Hat / Black Hat SEO
-
What's the right way to gain the benefits of an EMD but avoid cramming the title?
Hi guys, say I'm (completely hypothetically) building weddingvenuesnewyork.com and right now I'm organizing the title tags for each page. What's the best layout so that I can optimize for "wedding venues new york" as much as possible without it becoming spammy? Right now I'm looking at something like "Wedding Venues New York: Wedding Receptions and Ceremony Venues" for the title, to get other strong keywords in there too. Is there a better layout/structure? And is having the first words of the homepage title match the domain name going to strengthen the ranking for that term, or look spammy to Google and be a bad move? This is a new site being built.
White Hat / Black Hat SEO
-
I think I've been hit by Penguin - Strategy Discussion
Hi, I have a network of 50 to 60 domain names which have duplicated content and whose domains are basically a geographical location plus the industry I am in. All of these websites link to my main site.

Over the weekend I saw my traffic fall. I attribute our drop in rankings to what people are calling Penguin 1.1. I want to keep my other domains, as we are slowly creating unique content for each of those sites. However, in the meantime, I clearly need to deal with the inbound linking and anchor text problem.

Would adding a nofollow tag to all links that point to my main site resolve my issue with Google's Penguin update? Thanks for the help.
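If it helps while you decide, here is a minimal sketch that audits which network pages still pass followed links to the main site; the domain list and main-site host are hypothetical stand-ins for your own:

import requests
from bs4 import BeautifulSoup

# Hypothetical: the network sites and the main site they link to.
NETWORK_SITES = [
    "http://city1-industry.example",
    "http://city2-industry.example",
]
MAIN_SITE = "main-site.example"

for site in NETWORK_SITES:
    try:
        html = requests.get(site, timeout=10).text
    except requests.RequestException as exc:
        print(f"{site}: fetch failed ({exc})")
        continue
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if MAIN_SITE in a["href"]:
            rel = a.get("rel") or []  # bs4 returns rel as a list
            status = "nofollow" if "nofollow" in rel else "FOLLOWED"
            print(f"{site} -> {a['href']} [{status}]")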
White Hat / Black Hat SEO
-
Hit hard by Panda 3.3 and Penguin. What to do?
Hi there. I work with a company that was originally all white hat, then began to dabble in some pretty serious black hat activities last year (usually paid linking in private blog networks). At the time we saw tremendous results: many of our most highly competitive keywords shot up 20, 30 positions into the top 10, and they didn't seem to budge so long as we kept those (very expensive) links intact.

Alongside all of this, we have had a lot of white hat activity going on (pretty much everything recommended by Google/SEOmoz is ALSO in effect on this domain: lots of consistent, relevant blogging, social media, good content, good on-site SEO, etc.), which I attribute to SOME of our success with keyword ranking, but what really made the difference was the paid linking. Let's just say we had two different mindsets behind the SEO strategy of the company, and the "get rich quick" one worked for a while. Now it doesn't. (Can you guess whether I'm the white hat or the black hat at the company?)

So here's my question. I have made the effort to contact all of the webmasters of our egregious links and, as everyone else has described, it is effectively useless. Especially given the amazing post by Ryan Kent on this question (http://www.seomoz.org/q/does-anyone-have-any-suggestions-on-removing-spammy-links), I have pretty much given up on contacting these webmasters on a case-by-case basis and asking for the links to be removed, especially if Google is not going to accept anything less than a perfect backlink portfolio. It is LITERALLY IMPOSSIBLE to clean up these links. Meanwhile, this company is a big name in a very competitive online market and it really needs to see lead generation from organic SEO. (Please don't give me any told-you-so's here; it was out of my hands.)

MY QUESTION IS: what should we do? Should we just keep the domain going and focus on only building quality links from now on? Most of our keywords fall anywhere from position 40 to position 150 right now, so it's not like ALL hope is lost. But as any SEO knows, that is basically as good as not being indexed at all.

OTHER OPTION: we have an old domain that is less SEO-friendly, but it is the official name of our company .com, and this domain is currently 301'd to our live (SEO-friendly) domain. The companyname.com domain is also older than our SEO-friendly domain. Should we manually move our site back over to the old domain, since there is no penalty on it? It seems like a lot of the sites that are ranking are brand new anyway (except their URLs are loaded with keywords).

Blah, I know that was a lot, but I'm feeling lost and ANY insight would be helpful. Thanks as always, SEOmoz!
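On the 301 option, it is worth verifying exactly what the old branded domain does today before deciding anything; a minimal sketch, with the domain as a hypothetical stand-in:

import requests

# Hypothetical: the branded domain currently 301'd to the live domain.
OLD_DOMAIN = "http://companyname.example"

# Inspect the first hop without following it.
resp = requests.get(OLD_DOMAIN, allow_redirects=False, timeout=10)
print(resp.status_code)  # expect 301 if the redirect is in place
print(resp.headers.get("Location"))  # where it points

# Then follow the whole chain; ideally it resolves in a single hop.
final = requests.get(OLD_DOMAIN, timeout=10)
for hop in final.history:
    print(hop.status_code, hop.url)
print(final.status_code, final.url)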
White Hat / Black Hat SEO
-
What happened with Hayneedle's rankings?
Hayneedle is an e-commerce company that operates 200 niche sites selling indoor and outdoor home products. They were ranking at the top of the first page for most terms related to their sites (fire pits, fountains, benches, etc.), but all of a sudden at the end of April they lost their rankings, getting dropped to page 4 or lower for tons of their sites (barstools.com, patiofurnitureusa.com, adirondackchairs.com, benches.com, etc.). Does anybody know what caused this? Other than one thread on an SEO forum, we haven't been able to find any discussion about it online. It seems like cross-linking between the sites could have been a problem here, but we'd love to hear thoughts from the experts here on this. Our company is using the same business model of one brand with niche sites and we want to avoid anything like this happening to us.
White Hat / Black Hat SEO