ECommerce site being "filtered" by last Panda update, ideas and discussion
-
Hello fellow internet-goers!
Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total, and about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30-day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google removed enough duplicate content (3 sites in our case), it deemed that the others were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added rel="canonical" to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't "niche specific", we told these pages to signal preference to their proper niche domain (hope that made sense).
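For anyone unfamiliar with the mechanics: the tag goes in the head of the duplicate page and points at the preferred URL on the other domain. A minimal sketch of what we added (the domains and paths here are made up for illustration, not our actual sites):

```html
<!-- On the duplicate manufacturer category page hosted at sitetwo.example,
     the canonical tag tells Google the preferred copy lives on the niche domain: -->
<head>
  <link rel="canonical" href="https://siteone.example/manufacturers/acme-widgets/" />
</head>
```

Worth noting that Google treats cross-domain rel="canonical" as a strong hint rather than a directive, so consolidation isn't guaranteed and can take time to show up.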
For discussion purposes, here is a response I got from another forum:
> Why has only one site recovered?

I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.

> Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?

Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrank.

> Is it a coincidence that it was an exact 30 day "filter"?

No. 30 days is a common penalty.

Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google, I would say... "these are dupe content, we aren't going to index all of them, our searchers don't want to see ten sites with the same stuff".
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered - probably filtered. They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also, when you added the rel="canonical", did 9 of your sites point to a primary site, and was this the site that recovered?