Panda 4.0 Suggestions
-
My site was hit pretty hard by Panda 4.0 and I am at a loss for the best way to address it. I have read every article I can find, and I know there are some duplicate manufacturer product descriptions on the site, but I don't hear many other ecommerce sites complaining about Panda, so I figure it must be something else. Also, the pages that seem most negatively affected are category and product list pages. Any help or suggestions would be much appreciated. Thanks!
-
Thanks for all of the answers. Does anyone have recommendations for good companies that do consulting for Panda-related issues?
-
Thanks for the answer; that is kind of what I figured too. I know I have to rewrite the content in order to be more competitive, but I assume there has to be something else that is affecting me as well.
-
Here is a Matt Cutts video where he explains why you need more than duplicate content.
Interesting to watch because you can see, just a few years ago, in 2009, Matt did not have all of the gray hair. That is what all of this Panda and Penguin stuff has done to him.
-
Duplicate product descriptions alone are probably not why you got hit by Panda 4.0. Matt Cutts is on record saying that duplicate product descriptions are not a big deal for ecommerce sites.
-
EGOL is correct, Gordian.
Start fixing those. Once the content is brand new and reindexed, it's a waiting game from there. You might have to wait for the next update, or you might need to look further at your site structure to find other culprits.
There's really not much Panda 4.0 did differently; it only refined the original Panda, so the same advice applies. Hopefully you'll recover once you fix the current issues.
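If you want to give the recrawl a nudge once the rewrites are live, pinging Google with your updated XML sitemap is one low-effort option. Here is a minimal Python sketch, assuming a hypothetical sitemap URL; note the ping only invites a re-fetch, it does not guarantee faster reindexing:

```python
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder -- swap in your own

# Ask Google to re-fetch the sitemap after the content rewrites go live.
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(SITEMAP_URL)
with urllib.request.urlopen(ping_url) as response:
    # A 200 only confirms the ping was received; reindexing is
    # still a waiting game after that.
    print("Ping status:", response.status)
```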
-
Hundreds of other sites are using the same content.
The same product descriptions that you are using appear on heavyweight sites like NewEgg, Yahoo Shopping, and eBay.
Get rid of ALL of that content written by other people.
Write original content or die.
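If it helps with triage, one rough way to see how much of the catalog needs rewriting is to diff your descriptions against the manufacturer feed and flag the near-verbatim ones. A sketch, assuming both sets can be exported to CSV; the file names, column layout, and 0.9 threshold are all illustrative:

```python
import csv
from difflib import SequenceMatcher

def similarity(a, b):
    """Word-level similarity ratio in [0, 1]; 1.0 means identical text."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Hypothetical CSV layout for both files: sku,description
with open("manufacturer_feed.csv", newline="") as f:
    stock = {row["sku"]: row["description"] for row in csv.DictReader(f)}

with open("site_catalog.csv", newline="") as f:
    for row in csv.DictReader(f):
        stock_text = stock.get(row["sku"])
        # Flag descriptions that are essentially the manufacturer's copy.
        if stock_text and similarity(row["description"], stock_text) > 0.9:
            print(f"REWRITE {row['sku']}: near-verbatim manufacturer copy")
```

Start the rewrites with the flagged SKUs on your highest-traffic category pages and work down from there.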
-
Have you read this post yet? Sounds like what happened to eBay is much like what happened to you. You may want to keep up with what eBay will do to rebound and take cues from that going forward. Just a thought.
Panda 4.0, Payday Loan 2.0 & eBay's Very Bad Day
Related Questions
-
18 years later, PageRank 6 drops to 0, all +1s disappear, scrapers outrank us
18 years ago I put up our first website at http://oz.vc/6. Traffic grew, our forums reached hundreds of thousands of posts, our website had a PageRank of 6, and our forums and other content areas ranked 5-6, the rest usually 4-6. Then Panda 2.2 came along and whacked it. None of the measures recommended by SEO experts and the Matt Cutts videos made a dent, including some pretty severe ones that were supposed to make a difference. Bing and Yahoo traffic have both grown since Panda 2.2; only Google kept dropping every few updates without recovery.

A few weeks ago Google delivered the ultimate whack. It seems every page other than the home page either has a PR of 0 or is not generating any PR at all. Every +1 disappeared off of the site. Now three pages have their +1s back, but the entire guide section (hundreds of articles) is still missing all of them.

I discovered two scrapers, one of which was copying all of our forum posts and ranking a PR 2 for it (while we have a zero). They were taken down, but I still can't imagine how this result could happen. I am also going to have an RSS feed aggregator taken down that is ranking a 2, though I know we can't prevent them from taking our WordPress feeds and storing them (we use the feeds for areas on the site). How can Google give us a zero PageRank and give obvious scrapers PageRank?

What should have been years' worth of awesome, rich added content and new features was wasted chasing Google ghosts. I've had two SEO people look at the site and neither could point to any major issue that would explain what we've seen, especially the latest PageRank death penalty. We haven't sold paid links. We have received no warnings from Google (nor should we have). The large "thin" area you may see in a directory was removed entirely from Google (it made no difference, and we took a further drop for doing the "right" thing!). Most think we have been stuck for a very long time in a rare Google glitch. I would be interested in your insights.
Algorithm Updates | seoagnostic
-
What's our next step after being hit by Penguin 2.0?
Hi everyone, any idea what the next step is after being hit by Penguin 2.0? Should we go for more link building, content optimization, or something else?
Algorithm Updates | lucidsoftech
-
Panda Updates?
Anyone aware of any algo updates/refreshes today? Been seeing some SERP movements on .co.uk.
Algorithm Updates | PeterAlexLeigh
-
Panda 2.3 features
So, it's official that Panda 2.3 is out. Has anyone found the fine print on what this "version" focused on?
Algorithm Updates | malachiii
-
Panda Update: Need your expertise...
Hi all, After the Panda update our website lost about 45% of its traffic from Google. It wasn't an instant drop; mostly it happened gradually over the last 5 months. Our keywords (all of them except the domain name) started to lose positions from the top 10 to now 40+, and all the recovery attempts we have made so far haven't really helped. At this moment it would be great to get some advice from the top experts like you here.

What we have done so far: we have gone through all the pages and removed the duplicate/redundant ones, we have refreshed the content on the main pages, and all pages now have canonical tags. Our website is www.PrintCountry.com. Thank you very much in advance for your time.
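Assuming the canonicals were added by template, it is worth spot-checking that every page actually renders exactly one and that it points where you expect. A rough Python sketch; urls.txt is a hypothetical one-URL-per-line list, and the regex is deliberately crude rather than a definitive audit:

```python
import re
import urllib.request

# Rough regex; assumes rel= appears before href= inside the tag.
CANONICAL_RE = re.compile(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

with open("urls.txt") as f:  # hypothetical: one page URL per line
    for url in (line.strip() for line in f if line.strip()):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        canonicals = CANONICAL_RE.findall(html)
        if len(canonicals) != 1:
            print(f"{url}: expected 1 canonical tag, found {len(canonicals)}")
        elif canonicals[0].rstrip("/") != url.rstrip("/"):
            print(f"{url}: canonical points elsewhere -> {canonicals[0]}")
```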
Algorithm Updates | gbssinc
-
Was Panda applied at sub-domain or root-domain level?
Does anyone have any case studies or examples of sites where a specific sub-domain was hit by Panda while other sub-domains were fine? What's the general consensus on whether this was applied at the sub-domain or root-domain level? My thinking is that Google already knows broadly whether a "site" is a root-domain (e.g. SEOmoz) or a sub-domain (e.g. tumblr) and that they use this logic when rolling out Panda. I'd love to hear your thoughts and opinions though?
Algorithm Updates | TomCritchlow
-
I think Panda was a conspiracy.
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy. Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low-quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah... this will be good... my rankings will increase." And then the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix. Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, yet that site actually increased in rankings after Panda.

So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down, but they also knew that many decent sites would decline as well. Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality site possible. People are removing duplicate content, reducing ad clutter, and generally creating the best site possible. And this is the goal of Larry Page and Sergey Brin: to make it so that Google gives the user the BEST possible sites to match their query.

I think that a month or so from now there will be a sudden shift in the algo again and many of those decent sites will have their good rankings back. The site owners will think it's because they put hard work into creating good quality, so they will be happy. And Google will be happy because the web is a better place. What do you think?
Algorithm Updates | MarieHaynes
-
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems fixed against this site's purpose. I need some advice on what I'm planning and what could be done.

First, the issues:

Content Length: The site is legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages, by virtue of the content, are thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series," a part of the English reports. There really isn't too much more to say, nor is there much value to the target audience in saying it.

Visit Length as a Metric: There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of the people who visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.

My strategy so far:

Noindex some pages: Identify terms and citations that are really small – less than 500 characters – and put a noindex tag on them (a rough sketch of how these could be flagged is below). I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.

Create more click incentives: We already started with related terms and are now looking at diagrams and images. Anything to punch up the content for that ever-important second click.

Expand content (of course): The author will spend the next six months doing his best to extend the content of these short pages. There are images and text to be added in many cases – perhaps 200 pages. Still, we won't be able to cover them all without a heavy cut-and-paste feel.

Site redesign: We're looking to lighten up the code and boilerplate content shortly (we were working on this anyway). The resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.

What do you think? Are the super-light citation and dictionary pages why site traffic is down 35% this week?
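For the noindex step referenced above, here is a rough Python sketch of how the under-500-character entries could be flagged. The entries.txt URL list and the bare-bones text extraction are assumptions; a real pass would strip the site template and boilerplate before measuring:

```python
import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

with open("entries.txt") as f:  # hypothetical: one dictionary/citation URL per line
    for url in (line.strip() for line in f if line.strip()):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        parser = TextExtractor()
        parser.feed(html)
        # Collapse whitespace, then measure the visible text length.
        text = " ".join(" ".join(parser.chunks).split())
        if len(text) < 500:
            # Candidate for a <meta name="robots" content="noindex"> tag.
            print(f"noindex candidate ({len(text)} chars): {url}")
```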
Algorithm Updates | sprynewmedia