Penguin 2.1 Penalty: Can't understand why my site was hit by it?
-
Hi,
I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong, as far as I know. I want to find the root cause of the penalty so that I can recover from it.
Any help would be appreciated.
Website: http://tiny.cc/hfom4w
-
Where do you find the link profile?
-
Whilst over-optimisation is an issue, it is more about the quality of the websites you are getting links from than the anchor text used. Even a branded anchor link will get you penalised if it comes from a poor-quality website. If you haven't disavowed any links yet, I'd make sure that around 90% of them get disavowed.
You've also got duplicate content on your website, so I'd say you've been hit by a Panda refresh as well as Penguin. Type "To assist clients in getting value from their businesses, we provide Business Critical Software and IT services within various industry verticals." into Google and you will see four different versions of your content.
You need to do a full backlink analysis ASAP and be ruthless about the links you get rid of: if you wouldn't want to show a link to Google, then it's bad. Use the disavow tool in Webmaster Tools, then file a reconsideration request to Google with details of what you have done and why there were so many bad links.
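For reference, the disavow file Google accepts is plain text: lines starting with "#" are comments, "domain:" entries disavow every link from a site, and bare URLs disavow single links. A minimal sketch of building one from a cleaned-up list follows; the domains and URLs in it are made-up examples, not taken from this thread:

```python
# Build a Google disavow file: plain text, "#" comment lines,
# "domain:" entries for whole sites, bare URLs for single links.
def build_disavow(domains, urls, note=""):
    lines = []
    if note:
        lines.append("# " + note)
    lines += ["domain:" + d for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Hypothetical low-quality sources identified in a backlink audit.
    text = build_disavow(
        domains=["spammy-directory.example", "linkfarm.example"],
        urls=["http://blog.example/bad-post.html"],
        note="Low-quality directory links built Jul-Aug 2013",
    )
    print(text)
```

The resulting text is what you would upload in the disavow tool before filing the reconsideration request.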
It's a long road, but if you want the website to come back into the SERPs then this is what you are going to have to do.
Good luck!
-
Hi Yiannis and Remus
Thank you for the reply. There are many competitor websites, like http://tiny[dot]cc/9tpm4w and http://tiny[dot]cc/dvpm4w, that are also just targeting "Software Development". Why aren't they getting penalized?
-
Hello Chanpreet,
Like Yiannis says, it's probably related to over-optimised anchor text. To get more information, you could compare your anchor text profile with that of a competitor that still ranks well in the SERPs.
-
Hello,
I had a quick look at your link profile and it seems that you had a big spike in link-building activity from the 17th of July up to the 24th of August (mostly from directory submissions). Then it goes quiet again, which looks unnatural. You have used contextual anchor texts built around the "software development" keyword for roughly 80% of your profile, making your website vulnerable to Penguin, and my guess is that you got hit by the latest refresh/update.
I would suggest you make your anchor texts look more natural and remove or disavow all those directory links you got between July and August, using the disavow tool. Then monitor whether you move up in the SERPs and report back.
Remember that you might not see immediate results; sometimes the algorithm needs to refresh before you notice any changes.
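To put a number on that kind of anchor-text concentration yourself, you can tally the anchors from any backlink export. A rough sketch; the 30% flag threshold and the sample anchors are illustrative assumptions, not thresholds Google has published:

```python
from collections import Counter

# Flag anchor texts that dominate a backlink profile -- e.g. the
# "software development" anchors covering roughly 80% described above.
def anchor_share(anchors, threshold=0.3):
    """Return {anchor: share} for anchors above `threshold` of all links."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {a: round(n / total, 2)
            for a, n in counts.items() if n / total >= threshold}

if __name__ == "__main__":
    # Hypothetical profile: 8 exact-match anchors out of 10 links.
    profile = ["software development"] * 8 + ["brand name", "click here"]
    print(anchor_share(profile))  # {'software development': 0.8}
```

Running the same tally on a competitor's exported anchors makes the comparison Remus suggests concrete.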
Related Questions
-
1st Ecommerce site got penalized, can we start a 2nd one?
Hello, a client's first site got penalized by Google Penguin. It has recovered through cleaning up backlinks, but not to where it was before: it is 2nd and 3rd for several money keywords, but is far less successful than before the penalization. We are starting a second site. Here are the important points to mention:
- The new site shows up first for its domain name, and it has 30 pages indexed.
- It shows up NOWHERE for our leading search term. Our other site has a blog post that is 3rd for that search term.
- We are using new categories and new organization.
- We are using a different cart solution.
- We are adding all unique content.
- The home page and some of the product pages are very thorough. We are adding comprehensive products like nothing else in the industry (10X).
- We plan on adding a very comprehensive blog, but haven't started yet.
- We've added the top 100 products so far. Our other store has 500.
- There's a lot of spam in the industry, so sites are slow to rank.
- Our category descriptions are 500 words. Again, all unique content.
- No major errors in the Moz Campaign tools.
- Just a few categories so far; we're going to add many more.
- Same Google Analytics account as our other site.
It looks like we should eventually be on page 3 for our major search term. Again, we're nowhere for anything right now. ... Have you seen Google refuse to rank a second site because it's from the same company and Google Analytics account, or does Google let you rank two sites in the same industry? We are hoping it's just slow to rank. If you can rank two sites, what are your best recommendations to help us show up? Thanks.
White Hat / Black Hat SEO | BobGW0
-
Seeing URLs indexed that we don't want: how do we approach this?
Hey guys, I have seen a few pages from my site appearing in the SERPs, and some of these page URLs are actually AJAX endpoints used to refresh the buttons on our site. If these are important to our site but don't need to show up in the SERP results, can anyone recommend anything? Should I remove the URLs, exclude them from the sitemap, or noindex them? Any advice would be much appreciated, thanks.
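For what it's worth, one of the options mentioned above (noindex) can be applied to AJAX endpoints with an `X-Robots-Tag: noindex` response header, which works even when the response has no HTML head for a robots meta tag. A sketch as WSGI middleware; the path prefixes are hypothetical examples, not this site's actual URLs:

```python
# Attach "X-Robots-Tag: noindex" to AJAX endpoints so they drop out of
# the index while normal pages are untouched. Prefixes are examples.
NOINDEX_PREFIXES = ("/ajax/", "/api/refresh")

def noindex_middleware(app):
    def wrapped(environ, start_response):
        def sr(status, headers, exc_info=None):
            # Only tag responses for the listed endpoint prefixes.
            if environ.get("PATH_INFO", "").startswith(NOINDEX_PREFIXES):
                headers = headers + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers, exc_info)
        return app(environ, sr)
    return wrapped
```

Note that for the header to be seen, the URLs must stay crawlable (i.e. not blocked in robots.txt), otherwise the noindex is never fetched.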
White Hat / Black Hat SEO | edward-may0
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that I want a solution which 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
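The throttling rule described above could be sketched roughly like this; the bot tokens, load threshold, drop rate, and Retry-After value are all illustrative assumptions, not recommendations:

```python
import os
import random

# Answer a fraction of bot requests with 503 + Retry-After when the
# 1-minute load average is high; user traffic always gets through.
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")

def should_throttle(user_agent, load1, max_load=4.0, drop_rate=0.5,
                    rand=random.random):
    """True if this request should get a 503 instead of content."""
    ua = (user_agent or "").lower()
    is_bot = any(tok in ua for tok in BOT_TOKENS)
    return is_bot and load1 > max_load and rand() < drop_rate

def throttle_middleware(app):
    def wrapped(environ, start_response):
        load1 = os.getloadavg()[0]  # Unix-only 1-minute load average
        if should_throttle(environ.get("HTTP_USER_AGENT", ""), load1):
            start_response("503 Service Unavailable",
                           [("Retry-After", "120"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy, retry later"]
        return app(environ, start_response)
    return wrapped
```

Sending a Retry-After header with the 503 at least tells well-behaved crawlers when to come back, which is the conventional way to signal temporary unavailability.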
White Hat / Black Hat SEO | internetwerkNU1
-
It appears that no matter what I do, my website is not picking up traffic. What can I do?
I have tried everything and followed everything by the book... yet nothing is happening. I have yet to try PPC, but I am sure that by now the website is healthy, since I have spent from January to the current date fixing every sort of warning and error, while also working on link-building strategies (only submitting links to directories for the moment). However, the website is dead. What should I do? Is this due to a Google penalty?
White Hat / Black Hat SEO | ts24group0
-
Can someone help me write a letter for Google reconsideration?
Can someone help me (I will pay for it) write a Google reconsideration letter in native English? Over this last year I did everything to respect Google's guidelines, and as the site owner I have always focused on generating good content and usability for my site, although an SEO company had generated unnatural links, most of which I managed to remove. I did all that and more, contacting webmasters to get them taken down. But I am not a native English speaker; I am just a content generator with a webmaster account. This is my last try to save my domain. Can someone help me write a good final letter to Google? Thanks
White Hat / Black Hat SEO | maestrosonrisas0
-
Can a "Trusted Retailer" badge scheme affect us in the SERPs?
Hi Guys, In the last week our website saw a drop on some of our biggest and best-converting keywords, and we think it might be down to us rolling out a "Trusted Retailer" badge scheme.
We sell our products directly to consumers via our website, but we also sell our products to other online resellers. We think badges are a good way to show the consumer that we trust a site. On the 17th of September we sent out badges to about 39 of our best retailers, two of whom have already put them on their sites.
Instead of sending them a flat JPEG, we sent them HTML files containing code that pulled in the image from our servers. We wanted to host the image to make sure that we always had some leverage: if a company stopped selling our products, or the quality of their site went down, we could just remove the badge. Whilst at it, we stuck a link in there pointing to an FAQ on our website all about trusted retailers and what people need to look out for. We chose the anchor text "(brand name) Trusted Retailer", because that seemed to be the most relevant. The code looks like this: (our brand) Trusted Retailer
You might notice that there is a div just before the link. This is there to stop the user from clicking on the top 65% of the badge (because this contains the shop name and ID number), and we also used a negative text-indent to move the anchor text out of the way. But right underneath this is our logo, so it's almost a hidden link, though you can still click it.
So far the badge has been put up on two sites, one of which isn't so great and maybe looks a tiny bit spammy (they sell mostly through eBay as opposed to on their main site). Also, these sites seem to have put it on most of their pages!
So my questions are: Is this seen as black or grey hat? Is it the fact we put in anchor text with our brand? Or is it the fact the URL is transparent in the coding? Or is it the fact the sites are using sitewide links? In any case, would Google react so quickly as to penalise us in two days?
If this is the issue, do you think there's anything we can do to stop being penalised, other than having to e-mail the 39 retailers and ask them to take the badges down? Thoughts much appreciated; we do our SEO in-house and are still learning every day... Thank you, James
White Hat / Black Hat SEO | OptiBacUK0
-
How to handle footer links after Penguin?
With the launch of Google's Penguin, I know that footer links could possibly hurt rankings, and that too many links on a page are also bad. I have a client, http://www.m-scribe.com, whose footer links push many of their pages well over 100 links. How should I handle these footer links? Suggestions are greatly appreciated.
White Hat / Black Hat SEO | RonMedlin
-
Site ranking at position 1 after existing for one day
Hi, I work in online gaming, and for a few months there was a website called 'Htmlwijzer.nl' that ranked on page 1 for 'online casino' in Google.nl, which is of course a highly competitive keyword, and it remained there for about two months. This website didn't come up slowly in Google.nl from page 10 to page 1; one day it was just there, at page 1. It only had HTML-related informational content, yet it ranked for 'online casino' with a layer ad on its site which would only be shown to users in the Netherlands. Now that website has received a penalty after two months, and since today a new site is in Google at position 1, called www.casinowijzer.nl. It has completely no backlinks (only a 301 from the former domain). It just popped up there. Does anyone know how this website could have gotten there? It's obviously black hat, but for a keyword like 'online casino' it's quite amazing. Thanks,
White Hat / Black Hat SEO | iwebdevnl0