ECommerce site being "filtered" by last Panda update, ideas and discussion
-
Hello fellow internet-goers!
Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K - 12K+ (per day) depending on the site. Back in March one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites (in total), about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days from being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company (at this point in time). Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google filtered out enough of the duplicate content (3 sites in our case), it deemed that the remaining sites were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added the rel=canonical to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't "niche specific" we told these pages to send preference to their proper niche domain (hope that made sense).
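For anyone unfamiliar with the mechanics, the canonical hint is just a link element in the page head of the non-preferred copy. A minimal sketch of what we added to a manufacturer category page (the domains here are placeholders, not our actual sites):

```html
<!-- Placed in the <head> of the duplicate manufacturer category page -->
<!-- on a non-niche domain; niche-example.com is a hypothetical domain -->
<head>
  <link rel="canonical" href="https://niche-example.com/widgets/acme-brand/" />
</head>
```

This tells Google which URL should receive ranking preference when it finds the same category content on more than one of the domains.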
For discussion purposes, here is a response I got from another forum:
"Why has only one site recovered?"
I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.
"Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?"
Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrunk.
"Is it a coincidence that it was an exact 30 day "filter"?"
No. 30 days is a common penalty length.
Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google I would say... "this is dupe content, we aren't going to index all of them, our searchers don't want to see ten sites with the same stuff".
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered - probably filtered. They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
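If a merge like that ever becomes feasible, the usual mechanics are permanent (301) redirects from each retired domain to its counterpart on the surviving site, so link equity and bookmarked product URLs carry over. A minimal nginx sketch with hypothetical placeholder domains:

```nginx
# Hypothetical example: retire niche-two.example in favor of
# main-store.example, preserving the request path so existing
# category/product URLs redirect one-to-one.
server {
    listen 80;
    server_name niche-two.example www.niche-two.example;
    return 301 https://main-store.example$request_uri;
}
```

The path-preserving redirect only works cleanly if URL structures match across the sites; otherwise a per-URL redirect map is needed.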
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also when you added the rel=canonical did 9 of your sites point to a primary site and was this the site that recovered?