Penguin 2.1 Penalty - Can't understand why I was hit by it?
-
Hi,
I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong. I want to know the root cause of the penalty so that I can overcome this.
Any help would be appreciated.
Website: http://tiny.cc/hfom4w
-
Where do you find the link profile?
-
Whilst over-optimisation is an issue, it's more about the quality of the websites you are getting links from than the anchor text used. You could have a brand-anchor link, but if it's on a poor-quality website you are still going to get penalised. If you haven't disavowed any links yet, I'd expect around 90% of them will need to be disavowed.
You've also got duplicate content on your website, so I'd say you've been hit by a Panda refresh as well as Penguin. Type "To assist clients in getting value from their businesses, we provide Business Critical Software and IT services within various industry verticals." into Google and you will see four different versions of your content.
You need to do a full backlink analysis ASAP and be ruthless about which links you get rid of: if you wouldn't show the link to Google, then it's bad. Use the disavow tool in Webmaster Tools, then file a reconsideration request to Google with details of what you have done and why there were so many bad links.
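Once the audit is done, the disavow file itself is just plain text in the format Google's tool accepts: lines starting with `#` are comments, `domain:example.com` disavows every link from a domain, and a bare URL disavows a single page. A minimal sketch for generating one (the domains and URL below are made up for illustration):

```python
# Build a disavow.txt in the format Google's disavow tool accepts:
# "#" lines are comments, "domain:example.com" disavows a whole domain,
# and a bare URL disavows just that one page.
bad_domains = ["spammy-directory-1.example", "spammy-directory-2.example"]  # hypothetical
bad_urls = ["http://low-quality-blog.example/page-with-my-link.html"]       # hypothetical

lines = ["# Disavow file generated after backlink audit"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Upload the resulting file in Webmaster Tools under the disavow links tool, then file the reconsideration request.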
It's a long road to take, but if you want the website to come back into the SERPs then you are going to have to do this.
Good luck!
-
Hi Yiannis and Remus
Thank you for the replies. There are many competitor websites, like http://tiny[dot]cc/9tpm4w and http://tiny[dot]cc/dvpm4w, that are also just targeting "Software Development". Why aren't they getting penalized?
-
Hello Chanpreet,
Like Yiannis says, it's probably related to over-optimized anchor text. To get more info, you could compare your anchor text profile with that of a competitor that still ranks well in the SERPs.
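One quick way to run that comparison is to export the anchor texts from any backlink tool and tally each site's anchors into percentage distributions side by side. A rough sketch, with both anchor lists invented for illustration:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the link profile, as a percentage."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1) for anchor, n in counts.most_common()}

# Hypothetical exports from a backlink tool
your_anchors = ["software development"] * 8 + ["brand name", "www.example.com"]
competitor_anchors = ["brand name"] * 5 + ["software development", "click here",
                     "www.competitor.example", "IT services", "brand name blog"]

print(anchor_distribution(your_anchors))        # one keyword dominating is a Penguin red flag
print(anchor_distribution(competitor_anchors))  # brand/URL anchors dominating looks natural
```

If one commercial keyword holds most of your profile while the competitor's is dominated by brand and naked-URL anchors, that difference is likely your answer.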
-
Hello,
I had a quick look at your link profile and it seems that you had a big spike in link-building activity from the 17th of July up to the 24th of August (mostly from directory submissions). Then it goes quiet again, which looks unnatural. You have used contextual anchor texts around the "software development" keyword, which covers pretty much 80% of your profile, making your website vulnerable to Penguin; my guess is that you got hit by the latest refresh/update.
I would suggest you make your anchor texts look more natural and disavow all those directory links you got between July and August using the disavow tool. Then monitor whether you move up in the SERPs and report back.
Remember that you might not see immediate results; sometimes the algorithm needs to refresh before you see any noticeable changes.
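If you want to spot that sort of unnatural pattern yourself, a simple check is to bucket your backlinks by the ISO week they were first seen and flag any week that dwarfs the average. A rough sketch, with the "first seen" dates invented for illustration:

```python
from collections import Counter
from datetime import date

def weekly_link_counts(first_seen_dates):
    """Bucket link acquisition dates by ISO (year, week)."""
    return Counter(d.isocalendar()[:2] for d in first_seen_dates)

def flag_spikes(weekly, factor=3):
    """Flag weeks whose link count exceeds `factor` times the average week."""
    avg = sum(weekly.values()) / len(weekly)
    return [week for week, n in weekly.items() if n > factor * avg]

# Hypothetical "first seen" dates: a quiet profile with one burst in late July
dates = ([date(2013, 5, 6), date(2013, 6, 3), date(2013, 9, 2)]
         + [date(2013, 7, 22)] * 40)  # 40 directory links acquired in a single week

weekly = weekly_link_counts(dates)
print(flag_spikes(weekly))  # the burst week stands out from the quiet baseline
```

A natural profile grows steadily; a flagged week full of directory links is exactly the shape Penguin targets.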
Related Questions
-
What to do with internal spam URLs Google has indexed?
I have been in SEO for years but have never met this problem. I have a client whose web page was hacked, and hundreds of spam links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs. See picture. What is the best way to remove them: use the Google disavow tool, or just redirect them to some page? The web page is new, but it ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
White Hat / Black Hat SEO | AndrisZigurs -
How can I rank this website?
Here is my website, www.onlinehackingtricks.com, with fresh content and proper on-page SEO. But if I do some off-page SEO, will Google give me a penalty? One of my websites already got deindexed, so how can I rank this one?
White Hat / Black Hat SEO | SEORAMAN -
What penalty would cause this traffic drop? (Google Analytics screenshot)
This ecommerce site was hit (mostly) slowly by updates, but there is nothing in GWT. Below is the graph. Keep in mind that most of our traffic is return customers, so the drops don't look dramatic, but they are. "New Visitors" doesn't show the drop. This is a "Daily" Google Analytics setting. The drop I've circled is May 23 to May 24, 2013. It was a huge hit in non-return customers. This graph is "Unique Visitors"; I don't know why the "New Visitors" graph is not showing the dip. Although we had some big drops, a lot of the drop was gradual. Any help in identifying what could be causing the problem is appreciated. (ga.png)
White Hat / Black Hat SEO | BobGW -
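When GWT shows nothing, one way to pin down exactly when a drop like that started is to compare each day's visitors against the same weekday one week earlier, which cancels out the weekday/weekend cycle that hides drops in a daily graph. A rough sketch, with the daily numbers invented for illustration:

```python
def biggest_weekly_drop(daily):
    """Compare each day to the same weekday one week earlier and
    return the index and percentage change of the sharpest drop."""
    changes = [(i, (daily[i] - daily[i - 7]) / daily[i - 7] * 100)
               for i in range(7, len(daily))]
    return min(changes, key=lambda c: c[1])

# Hypothetical daily unique visitors exported from Analytics:
# a steady week, then a step down mid-way through the second week
visitors = [1000, 1020, 990, 1010, 1005, 600, 580,   # week 1 (weekend dips)
            1000, 1015, 995, 650, 660, 400, 390]     # week 2: the drop begins
day, pct = biggest_weekly_drop(visitors)
print(day, round(pct, 1))  # the day index where the drop starts, and its size
```

Cross-referencing the date this flags against a list of known algorithm update dates is usually the fastest way to attribute a penalty.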
Creating pages as exact-match URLs - good, or an over-optimization indicator?
We all know that exact-match domains are not getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have any experience, or know whether that also applies to an exact-match URL on a page (not the domain)? Example keyword: "cars that start with A". Which way is better when creating your pages on a non-exact-match domain site: www.sample.com/cars-that-start-with-a/ or www.sample.com/starts-with-a/, both targeting "cars that start with A"? Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters in the alphabet. So: www.sample.com/cars-that-start-with-a/, www.sample.com/cars-that-start-with-b/, www.sample.com/cars-that-start-with-c/, versus www.sample.com/starts-with-a/, www.sample.com/starts-with-b/, www.sample.com/starts-with-c/. Hope someone here at the Moz community can help out. Thanks so much.
White Hat / Black Hat SEO | lidush -
How should I use the 2nd link if a site allows 2 in the body of a guest post?
I've been doing some guest posting, and some sites allow one link while others allow more. I'm worried I might be getting too many guest posts with multiple links. I'd appreciate your thoughts on the following: 1. If there are 50+ guest posts pointing to my website (posted over the span of several months), each with 2 links pointing back only to my site, is that too much of a pattern? How would you use the 2nd link in a guest post, if not to link to your own site? 2. Does linking to a .edu or .gov site in the guest post make the post more valuable in terms of SEO? Some people recommend using the 2nd link to do this. Thanks!
White Hat / Black Hat SEO | pbhatt -
My site has disappeared from the SERPs. Could someone take a look at it for me and see if they can find a reason why?
My site has disappeared from the SERPs. It used to rank around 4 for the search "austin wedding venues", and it still ranks number three for this search on Bing. I haven't done any SEO work on it in a while, so I don't think I did anything to make Google mad, but now it doesn't rank anywhere in the top 160 results. Here's the link: http://austinweddingvenues.org. Thanks in advance, Mozzers! Ron
White Hat / Black Hat SEO | Ron10 -
Can a "Trusted Retailer" badge scheme affect us in the SERPs?
Hi Guys, in the last week our website saw a drop on some of our biggest and best-converting keywords, and we think it might be down to us rolling out a "Trusted Retailer" badge scheme. We sell our products directly to consumers via our website, but we also sell our products to other online resellers. We think badges are a good way to show the consumer that we trust a site. On the 17th of September we sent out badges to about 39 of our best retailers, two of whom have already put them on their sites. Instead of sending them a flat JPEG, we sent them HTML files containing code that pulled in the image from our servers. We wanted to host the image to make sure that we always had some leverage, so if a company stopped selling our products, or the quality of their site went down, we could just remove the badge. Whilst at it, we stuck a link in there pointing to an FAQ on our website all about trusted retailers and what people need to look out for. We chose the anchor text "(brand name) Trusted Retailer", because that seemed to be the most relevant. The code looks like this: (our brand) Trusted Retailer. You might notice that there is a div just before the link. This is there to stop the user from clicking on the top 65% of the badge (because this contains the shop name and ID number), and we also used a negative text-indent to move the anchor text out of the way. But right underneath this is our logo, so it's almost a hidden link, but you can still click it. So far the badge has been put on two sites, one of which isn't so great and maybe looks a tiny bit spammy (they sell mostly through eBay as opposed to on their main site). Also, these sites seem to have put it on most of their pages! So my questions are: Is this seen as black or grey hat? Is it the fact we put in anchor text with our brand? Or is it the fact the URL is transparent in the coding? Or is it the fact the sites are using sitewide links? In any case, would Google react so quickly as to penalise us in two days?
If this is the issue, do you think there's anything we can do to stop getting penalised? (Other than having to e-mail 39 retailers and get them to take the badges down.) Thoughts much appreciated; we do our SEO in-house and are still learning every day. Thank you, James
White Hat / Black Hat SEO | OptiBacUK -
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read the Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at SearchEngineJournal. When I initially read this article, I was happy. It was confirming something that I believed, and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, but not from any sort of penalization, is indeed found in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names that I recognized were taking the opposite side of the issue. There seem to be a few different opinions:
1. The SEOmoz opinion that bad links can't hurt except for when they get devalued.
2. The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility.
3. The idea that both manual and algorithmic penalties were a factor.
Now, I know that SEOmoz preaches a link building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method you've got nothing to worry about. I don't want to hear those answers here; they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test, rank a keyword for an arbitrary term, and go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | AnthonyMangia