Panda Update - Challenge!
-
I met with a new client last week. They were hit hard by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas.
Here are a few details to give you some background.
The site is a very nice-looking (2.0) website with good content. Basically, they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that the rel=canonical tag is properly in place.
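For anyone auditing a similar situation, the canonical tags the developer mentions can be spot-checked programmatically rather than taken on faith. Here's a minimal Python sketch using only the standard library; the sample markup and URL are made up:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of the first <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and self.canonical is None and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html: str):
    """Return the declared canonical URL, or None if the page has no tag."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

Run it over the HTML of each product detail page (fetched however you like) and compare the result against the URL you expect; pages returning None, or a canonical pointing somewhere unexpected, are the ones to investigate.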
I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is being crawled with no issues.
The only thing I can come up with is that it is either:
Something off-page related to links or
Related to the font descriptions - maybe they are getting copied and pasted from other sites...and they don't look like unique content to Google.
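If the copied-description theory is worth testing, a rough similarity check across the description texts would flag it quickly. A minimal Python sketch — note that `SequenceMatcher` is only a crude stand-in for whatever Google actually does, and the 0.8 threshold is an arbitrary assumption:

```python
from difflib import SequenceMatcher

def description_similarity(a: str, b: str) -> float:
    """Rough near-duplicate score between two product descriptions (0.0-1.0).
    Normalizes whitespace and case before comparing."""
    norm = lambda s: " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def flag_duplicates(descriptions, threshold=0.8):
    """Compare every pair of descriptions; return (i, j, score) for pairs
    whose similarity meets the threshold."""
    flagged = []
    for i in range(len(descriptions)):
        for j in range(i + 1, len(descriptions)):
            score = description_similarity(descriptions[i], descriptions[j])
            if score >= threshold:
                flagged.append((i, j, round(score, 2)))
    return flagged
```

The same function works for comparing your descriptions against text scraped from competing font sites, which is the scenario being guessed at here.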
If anyone has ideas or would like more info to help please send me a message.
I greatly appreciate any feedback.
Thank you, friends!
LHC
-
Mmm... yeah, hard to guess without looking at the site. In my own experience/research, these are some of the issues I found in many of the sites affected by Panda:
• Intrusive advertising, excessive use of AdSense, sites created only for AdSense or solely to promote a product
• High amounts of duplicate content / scraped content
• Bad user interface / "ugly" design
• Usage data: low click-through rate, low time on site, 100% bounce rate
• Content analysis: content that isn't usable, readable, or easily consumable
• Excessive internal linking to only one or two pages
And I don't mean to say your site is spammy, but in some cases, like news sites with advertising, articles go out with just a couple of paragraphs of content, so the page ends up being more advertising than content.
Consider posting your site; it would be nice to take a look.
And there is also one last possibility: your site is innocent and just got hit by mistake. It happens.
-
Hi, Andrés-
They weren't running any AdSense - no advertising at all and the site isn't spammy.
-
Hi Lisa, besides what was already mentioned, one of the reasons many sites were affected is ads: if you have AdSense or any other kind of ads in excess, to the point where some pages have more ads than content, that can be a signal of low quality.
-
We're seeing massive changes in rankings now: not so much drops as far fewer rises, along with a few drops. This seems to have started recently rather than immediately after the update.
We've guesstimated that it is down to the update, despite the delay in effects.
We've come to the conclusion that a lot of the links (both existing links and ones we've built since) are not carrying anywhere near as much weight as they once did, especially links that were "easier to come by," i.e. blog comments, articles, etc.
Because the sites those links are on have been hit themselves, it's logical to assume those links are now devalued. Lots of article sites and other "low-value" sites were hit. Thankfully not all of our links were from such sites, but some were, which I think explains the drops.
-
Hi Lisa, it will be a little tricky without actually looking at the website, but my starting point would be where you've already started: eliminating duplicate content. Even with rel=canonical in place, I would double-check that non-canonical URL variants 301-redirect to the canonical version where appropriate. I would certainly look at whether the content is duplicated on third-party websites; some SEO firms just copy and paste segments of a website and add links. My other concern would be the status of the sites hosting your backlinks, i.e. a website that seemed appropriate at the time may itself have been hit by the Panda update and now be classed as spam. I would run a backlink checker and search for the content to see if it's duplicated. Let me know how you get on with this; it will be interesting to see what the culprit is.
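To make the redirect part of that audit concrete, here's a rough Python sketch that checks a redirect chain you've already collected (e.g. with a crawler or `curl -I`). It flags temporary redirects, which don't consolidate link signals the way a permanent 301 does. The hop format and example URLs are assumptions, not anything from the site in question:

```python
def audit_redirect_chain(hops, expected_canonical):
    """hops: list of (status_code, url) pairs recorded while following a
    URL variant hop by hop, ending with the final (200, url) response.
    Flags temporary redirects and chains that don't end at the canonical URL."""
    findings = []
    for status, url in hops[:-1]:
        if status in (302, 303, 307):
            findings.append(
                f"temporary redirect at {url} (status {status}); use 301 to consolidate"
            )
    final_status, final_url = hops[-1]
    if final_status != 200:
        findings.append(f"chain ends with status {final_status}")
    if final_url != expected_canonical:
        findings.append(f"chain ends at {final_url}, not {expected_canonical}")
    return findings or ["ok: all hops permanent, ends at canonical"]
```

Running this over the uppercase/lowercase and www/non-www variants of a few product URLs would confirm or rule out the redirect question in minutes.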