ECommerce site being "filtered" by last Panda update, ideas and discussion
-
Hello fellow internet-goers!
Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K - 12K+ (per day) depending on the site. Back in March one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total; about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: once Google filtered enough of the duplicate content (3 sites, in our case), it deemed that the remaining sites were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added rel="canonical" to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't niche-specific, we told these pages to send preference to their proper niche domain (hope that made sense).
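For anyone following along, the canonical hint is just a single link element in the head of the duplicated page that points at the preferred version. A minimal sketch (the domains and path here are hypothetical, not the poster's actual sites):

```html
<!-- Placed on the duplicate manufacturer category page, e.g. on the
     general-store domain, pointing preference at the niche domain -->
<head>
  <link rel="canonical" href="http://www.niche-domain-example.com/manufacturer/acme-widgets" />
</head>
```

Google treats this as a hint rather than a directive, which may help explain why the sites recovered at different times.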
For discussion purposes, here is a response I got from another forum:
> Why has only one site recovered?
I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.
> Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
????? Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrunk. ?????
> Is it a coincidence that it was an exact 30 day "filter"?
No. A 30-day penalty is common.
Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google I would say... "this is dupe content; we aren't going to index all of them. Our searchers don't want to see ten sites with the same stuff."
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered (probably filtered). They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also, when you added the rel=canonical, did 9 of your sites point to a primary site, and was this the site that recovered?
Related Questions
-
Which is the best way to rank a site?
Hi, I have been working on SEO for a long time. Recently I started a new site where I was aiming to rank for different niches, but I am stuck. First I covered some keywords related to sports, then I shifted the niche to hunting. My idea was to cover one niche fully and then move on to the 2nd, so the authority of the site could also help rank the 2nd niche, but the problem is I am unable to rank my site. Should I be considering only a very specific niche site, or should I continue doing all the stuff on the same site? Please check out my site ReviewsCase.com and let me know. And if anyone has done the same, please let me know.
Algorithm Updates | | seoasikhan20 -
Anyone else noticing "Related Topics" featured snippet? Is this new?
First time I've seen this type of featured snippet, and now I have seen it twice in the space of a couple of hours. Queries on Google UK desktop: "surgical instruments" and "Hawking radiation". Is this new? It definitely is for the "surgical instruments" search. Google are highlighting related topics/keywords in bold beneath the usual featured snippet.
Algorithm Updates | | Ria_0 -
What do media queries have to do with the page layout update?
Who thinks the lack of media queries will have an impact on whether the page layout update affects a site?
Algorithm Updates | | kimmiedawn0 -
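For context on the question above: a media query is the CSS mechanism that adapts a layout to the viewport width, which is the signal most relevant to a page layout update. A minimal sketch (the class name and breakpoint value are arbitrary examples, not anything from a specific site):

```css
/* Default (desktop) layout */
.product-grid {
  width: 960px;
}

/* Narrow viewports: let the grid fill the screen instead of overflowing */
@media (max-width: 480px) {
  .product-grid {
    width: 100%;
  }
}
```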
Relevant site outranked by powerful irrelevant sites
One of my clients has a site in a niche market and has been ranking well for years. Since the Penguin algorithm changes, his site dropped, and 4-5 other sites came out of nowhere to take the top spots. These are very large sites, but they are not really relevant to the search terms. Sure, they sell one or two of the niche products, but our site is dedicated to those products. The site has been updated since I took over, and it is well SEOed. The site in question still ranks 1st for the keywords in every other search engine imaginable. Has anyone else encountered this? If so, how did you combat it?
Algorithm Updates | | DavidWilsonSEO0 -
"No Follow", C Blocks and IP Addresses combined into one ultimate question?
I think the theme of this question should be "Is this worth my time?" Hello, Mozcon readers and SEO gurus. I'm not sure how other hosting networks are set up, but I'm with Hostgator. I have a VPS level 5, which (I think) is like a mini personal server. I have 4 IP addresses, although it is a C block, as each IP address differs by one number in the last digit. I have used 3 out of the 4 IP addresses I have been given. I have added my own sites (some high traffic, some start-ups), and I've hosted a few websites that I have designed for high-paying customers (one-man show: design them, host them, and SEO them). With the latest Penguin update, and after learning that linking between C block sites is not a great idea, I have "no followed" all of the footer links on client sites back to my portfolio site. I have also made sure that there are no links interlinking between any of my sites, as I don't see them in the Site Explorer, and I figure if they aren't helping, they may be hurting the rankings of those keywords. OK, so... my question is: I have one IP address that I'm not using, and I have a popular high-traffic site sharing its IP with 5 other sites (all unrelated niches, but high quality). Is it worth it to move the high-traffic site to its own IP address, even though making the switch would take up to 48 hours to take effect? My site would be down for at most 2 days (1 and a half if I switch the IPs at night). Is this really worth the stress of losing readers? Will moving the site to its own IP help its rankings compared to sharing an IP with 5 other sites? Thank you very much. PS: I can't make it to Mozcon this year, super bummed.
Algorithm Updates | | MikePatch0 -
How useful is a mobile version of your site (for SEO sake)?
We're investigating a mobile version of our e-commerce site. Is it worth the investment regarding search engine optimization, or is this something that wouldn't have a big effect?
Algorithm Updates | | 9Studios0 -
Accidently blocked our site for an evening?
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com, and we are ranked in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake and went to Google Webmaster Tools, where I noticed the site was blocked, so I used Fetch as Googlebot and corrected it. Now the message says: "Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)"
robots.txt file: http://www.posnation.com/robots.txt
Downloaded: 1 hour ago
Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
Algorithm Updates | | POSNation0 -
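A simple safeguard for the kind of migration mistake described above is to sanity-check the new server's robots.txt rules before (and right after) the switch. A minimal sketch using Python's standard-library parser; the URLs and rule strings are hypothetical examples, not posnation.com's actual file:

```python
# Sketch: check whether a robots.txt body permits Googlebot to fetch a URL.
# Parses the rules locally with the standard library; no network access needed.
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt text allows Googlebot to crawl url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# A rule set that accidentally blocks everything (the kind of thing that
# can slip in during a server move):
blocked = "User-agent: *\nDisallow: /\n"
# A permissive rule set (empty Disallow allows all paths):
open_rules = "User-agent: *\nDisallow:\n"

print(googlebot_allowed(blocked, "http://www.example.com/"))     # False
print(googlebot_allowed(open_rules, "http://www.example.com/"))  # True
```

Running this against the live file (or as part of a deploy script) would catch an accidental "Disallow: /" before Google does.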
What is considered duplicate content in an ecommerce website that offers the same product for retail and wholesale purchasing?
I have an ecommerce website that offers retail and wholesale products which are identical, with the exception, of course, of pricing. My concern is duplicate content. If the same product is offered under both the retail and the wholesale category, and described identically with the exception of price, metadata, and a few words, is that considered duplicate content, and would both pages be disregarded by the robots? Is it best to avoid using the same description for that one product under the two separate categories? Thanks for all your help!
Algorithm Updates | | flaca0