Was I hit by Panda or Penguin?
-
My site, graciousbridal.com, was hit pretty hard by Google at the end of April. I actually noticed our traffic decreasing around February; then, towards the end of April, it got really bad. Our sales this May were half of what they were in May 2011. We have never done any black- or gray-hat SEO; we wouldn't even know how to. I know in the past we did blog commenting, but we changed up our keywords so it wasn't all the same. Maybe we didn't change it enough?

We have another very similar site, and I'm now wondering if we were penalized because the two are too similar. We always have changed up the copy, but they have most of the same products. This second site barely gets traffic or sales and has about half of the items graciousbridal does. But I'm wondering now if it's too similar and that is why we were penalized. I can't figure out what we did wrong to have this big of a drop. I really need help with this, as this is supposed to be our busiest season of the year. Any advice or direction is greatly appreciated.
-
Hi James, thanks for responding. We hired an SEO person to help us at the beginning of last year, but that ended up not working out. I know she was working on link building, and it seems like she must have left comments on irrelevant blogs. What do you recommend in this situation: focus time on getting high-quality links, or try to get the bad ones removed? And is blog commenting still okay to do on blogs that are relevant to our industry, or does Google see that as spammy? Thanks! Audrey
-
I had a quick look over your link profile, and I feel you have been hit by Penguin.
I notice you have been doing some OK link building in terms of using varied anchor text in your link profile.
But I also notice you have been targeting some lower-quality directories for your dedicated page "personalized wedding favors".
For example, I notice you have a link in this directory: coloradosph.org
http://coloradosph.org/index.php?s=A&c=363&p=17
Google has totally de-indexed the URL for this low-quality directory, and taking a similar sample across more of these directories, I see more like this.
Also, in regards to your blog comments, I notice some poor-quality links. For example:
http://mommies.genuineinteractive.com/blog/post/Connecting-with-Alpha-Moms.aspx
This looks like a blog comment on a link network (comment made 28th Feb 2011). There are hundreds of other spammy comments like it; this isn't really going to do anything apart from have a negative impact on your SEO.
To future-proof your strategy, you need to pursue higher-quality link building strategies and stop doing the lower-quality stuff.
Google has been very harsh since the last update. Work done by agencies 5 years ago can get a site hit; I have seen it over and over. From my experience, "quality" will be the winner in the end.
-
Penguin was released on April 24th, 2012. If your Google Analytics shows a decrease starting on that date, then it's Penguin.
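A quick way to sanity-check this, assuming you export daily organic sessions from GA into a date-keyed dict (the figures below are made up for illustration): compare average traffic in a window before the update date against the window after it.

```python
from datetime import date, timedelta

def traffic_drop(daily_sessions, update_day, window=14):
    """Percentage change in mean daily sessions comparing the `window`
    days before `update_day` with the `window` days after it.
    A negative result means traffic dropped."""
    before = [daily_sessions[update_day - timedelta(days=i)] for i in range(1, window + 1)]
    after = [daily_sessions[update_day + timedelta(days=i)] for i in range(1, window + 1)]
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return 100.0 * (mean_after - mean_before) / mean_before

# Fabricated example: ~1000 sessions/day falling to ~550 after Penguin day.
penguin = date(2012, 4, 24)
sessions = {penguin + timedelta(days=d): (1000 if d <= 0 else 550)
            for d in range(-20, 21)}
print(round(traffic_drop(sessions, penguin)))  # prints -45 (a 45% drop)
```

A sharp step down right at April 24th points at Penguin; a gradual slide starting in February, as described above, suggests something else was also going on.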
Related Questions
-
Hit by an unnamed Google update on November 30th - Still suffering
Hi, Moz Community. Just decided to sign up for a free trial because I'm absolutely at my wit's end here. Here's my site: cheapgamesguru.com

I run a small PC gaming blog monetized by affiliate marketing. I do all the writing, SEO, etc. myself. The content I write is, from what I can tell, fully compliant with Google's guidelines, and in and of itself is pretty informative and high quality. My site was started in December of 2015, and it was doing very well for a good 10 or 11 months - until late November of 2016. Then something happened. My traffic started plummeting - I went from getting nearly 300 organic users a day (not sessions - actual unique users) to 80, then 40, and now I'm lucky to get over 15 a day.

I do not do ANY black-hat SEO whatsoever. I have not taken part in any shady link building schemes, nor do I try to trick Google in any way. I just write good content, do good keyword research (targeting only low-hanging fruit and low-difficulty keywords using KWFinder), and do my best to provide a good user experience. I run no ads on my site.

Glenn Gabe wrote about a potential Google update on November 29th, but the stuff he said in his article doesn't seem to affect me - my mobile site is perfectly fine, according to Google's own metrics and testing tools. Here's the article in question: http://www.gsqi.com/marketing-blog/november-30-2016-google-algorithm-update/

At first, I thought it was possible that this was a result of my competitors simply doing far better than me - but that doesn't seem to be the case, as their rankings did not actually move; mine simply plummeted. And many of their sites are far worse than mine in terms of grammar, spelling, and site speed. I understand backlinks are important, by the way, but I really don't think that's why my site was hit. Many competitors of mine have little to no backlinks and are doing great, and it would also not make much sense for Google to hit an otherwise great site just because it has few backlinks.

A friend of mine has reached out to Glenn Gabe himself to see if he can get his input on my site, but he's had a busy schedule and hasn't gotten a chance to take a look yet. I recently obtained a backlink from a highly relevant DA 65 site (about a month ago, same niche as my site), and it now shows up in Search Console and Ahrefs - but it hasn't affected rankings whatsoever.

Important note: I'm not just ranking poorly for stuff, I'm ranking in position 100-150+ for many low-competition keywords. I have no idea why that is happening - is my site THAT bad, that my content deserves to rank on page 15 or lower?

Sorry for the long question. I'm struggling here, and just wanted to give as much information as possible. I would really appreciate any input you guys can give me - if any SEO experts want to turn my site into a case study and work with me to improve things, I'd also be open to that 😉 I kid, of course - I know you guys are all busy. Thanks!

P.S. I've attached a picture of my SEMRush graph, for reference, as well.
Algorithm Updates | polycountz0
-
What is your hypothesis for why Panda/Penguin recoveries happen over months after an algorithm update rather than overnight?
We have experienced many scenarios where ranking recoveries from clear Panda and Penguin penalties on our sites don't necessarily happen with the launch of a Panda/Penguin update but instead trickle back in over weeks and months after a confirmed algo update. A good example is shown in the attached image, which shows a Panda recovery for a high-volume keyword. What is your theory on why these ranking recoveries happen over weeks vs. instantly?
Algorithm Updates | italiansoc0
-
SEO audit after Penguin 2.1 - what are you guys seeing? These are my thoughts
We have looked at around 2,000 sites since Penguin 2.1 launched a few weeks back. These include our customers and their own competitors' sites. We are going through all the data, which is obviously going to take some time. Hopefully we will publish a report on our findings, as we are happy to share. What I currently see in my early analysis:
- Roughly 70% of sites tested have 0% exact-match anchor text for their money keywords. The other 30% have less than 5% exact-match anchor text.
- The quality of the links is often still poor for the sites ranking on page 1.
- The content surrounding the links is related to the money keywords only about 10-15% of the time.
- The loading time of the ranking sites seems not to matter; we encountered a lot of slow sites.
- Design and usability of the site were not important.
- We are not seeing much impact via social media; a lot of these sites are small businesses.
- Less than 10% of sites on page 1 had a Google+ account.
- More than 40% of page 1 sites had Facebook profiles.
- More than 80% of the sites ranking on page 1 had fewer than 100 links to the landing page that ranked.
What are your opinions on helping a site recover if hit by the above? Q) If you have too high an anchor text percentage and have been hit, or may get hit in the future, would you:
a) create some more high-quality links with more varied anchor text, i.e. "click here", brand name, etc.
b) not create any more links, just remove the links you have to dilute the anchor text
c) change the anchor text on the links you are able to
These figures are a work in progress, so the data will change; just wanting to share our early findings and try to get a good conversation going. What are you guys seeing?
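The exact-match percentage discussed above is simple to compute once you have a landing page's anchor texts exported from a backlink tool. A minimal sketch (the function name and sample profile are illustrative, not anyone's actual tooling):

```python
def exact_match_pct(anchors, money_keyword):
    """Percentage of backlink anchors that exactly match the money
    keyword (case-insensitive). Returns 0.0 for an empty profile."""
    if not anchors:
        return 0.0
    kw = money_keyword.strip().lower()
    hits = sum(1 for a in anchors if a.strip().lower() == kw)
    return 100.0 * hits / len(anchors)

# Illustrative profile: mostly brand/generic anchors, one exact match.
profile = ["Gracious Bridal", "click here", "wedding favors ideas",
           "personalized wedding favors", "www.example.com"]
print(exact_match_pct(profile, "personalized wedding favors"))  # prints 20.0
```

By the figures in the post, a healthy page-1 profile would come out at 0-5% here, so a page scoring well above that range would be a candidate for diluting or changing anchors.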
Algorithm Updates | tempowebdesign0
-
Penguin-type over-optimisation now part of main algorithm?
Hey guys, we think we have been seeing some over-optimisation penalties outside of Penguin updates. One possible penalty seems to be for on-page over-optimisation, and another for a page with an over-optimised exact-match link profile. Does anyone else suspect, or have you seen word elsewhere, that Google's main ongoing algorithm now has Penguin-like capabilities and is able to apply over-optimisation penalties without a separate refresh being run?
Algorithm Updates | QubaSEO0
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help. Before I get into the questions, I would like to provide some background.

I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us.

Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.

Now for some questions:
- Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
- Is it a coincidence that it was an exact 30-day "filter"?
- Why has only one site recovered?
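For anyone following along, the rel="canonical" approach mentioned above looks like this in practice - a minimal sketch with made-up URLs, where each near-duplicate product page on the secondary sites names the one version you want Google to index:

```html
<!-- In the <head> of each near-duplicate product page (URLs are illustrative) -->
<link rel="canonical" href="https://www.example-main-site.com/products/blue-widget" />
```

Google treats this as a strong hint, not a directive, which may be one reason consolidation takes effect on different sites at different times.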
Algorithm Updates | WEB-IRS1
-
Any insight on what factors Penguin is looking at?
Anyone have insight into what specific factors Penguin is targeting and how it works? Matt Cutts seemed to imply that the update was targeting things such as spun content, keyword stuffing, etc., but most of the sites that have been hit that I've seen aren't doing any obvious content spamming like that. For example:
- Is Penguin looking primarily at on-site or backlink factors?
- Does Penguin just discount spammy backlinks, or does it apply an additional penalty to sites that have poor-quality backlinks?
- Anyone noticing specific on-site or off-site factors that correlate with whether a site has been hit or not?
Algorithm Updates | AdamThompson3
-
Was Panda applied at sub-domain or root-domain level?
Does anyone have any case studies or examples of sites where a specific sub-domain was hit by Panda while other sub-domains were fine? What's the general consensus on whether this was applied at the sub-domain or root-domain level? My thinking is that Google already knows broadly whether a "site" is a root-domain (e.g. SEOmoz) or a sub-domain (e.g. tumblr) and that they use this logic when rolling out Panda. I'd love to hear your thoughts and opinions though?
Algorithm Updates | TomCritchlow1
-
Recovered From The Panda Update?
Does anyone know if there are websites that have recovered from the Panda update?
Algorithm Updates | dirkla0