Do you know a case where product variations caused Panda?
-
I would like to add 300 products to an ecommerce site, of which 150 are just variations of other products in different colors.
In this particular case there are some reasons for:
- Not writing partially unique product descriptions for each product variation
- Not setting them up as variations on one product page

So I would have several product pages with nearly identical product descriptions (proprietary descriptions written by us). Only the color name in the title and description and the EAN in the description would differ, plus the product images and, over time, the user-generated content that accumulates on each page.
I would not mind if Google did not index some of the product variations.
Do you think I should be concerned about Panda? Do you know of any website that had a Panda problem caused by product variations?
Thanks
-
I don't think anyone really knows how much of the text needs to be unique in order to trigger Panda.
I just finished an audit for a site whose pages contained content completely duplicated from another site, yet with the information all in a different order. The site was not affected by Panda. But does that mean that simply reordering text is enough to evade Panda? One case like this is not enough to prove anything.
I think you need more than just a change in the meta description, though.
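To make that concrete, here is a rough sketch of one common way near-duplicate detection is modeled: comparing word shingles (short n-grams) with Jaccard similarity. This is an illustration only, not Google's actual algorithm, and the two sample descriptions are invented:

```python
# A rough illustration (not Google's actual method): word-shingle overlap
# is one common way near-duplicate detection is modeled. Reordering
# sentences changes which shingles two pages share.

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles (n-grams) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "soft cotton shirt with a classic fit and durable stitching"
page_b = "durable stitching and a classic fit make this soft cotton shirt"

print(f"{jaccard(page_a, page_b):.2f}")  # well below 1.0 despite identical vocabulary
```

Reordering breaks most of the shared shingles even though the vocabulary is identical, which may be one reason reshuffled text can look less duplicated to an algorithm than it does to a human reader.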
-
Marie, thanks a lot for sharing this.
I will use the canonical as you suggest, for now.
At a later point I will have unique product descriptions written for product variations.
From your experience, do you think a small variation of the description text is enough, or might the majority of the text need to be unique? At Zalando I actually saw that they implemented minimal variations at the beginning of the meta description, sometimes just adding two words or changing word order.
-
I have done a large number of traffic-drop audits for Panda-hit sites, and I can tell you that I have seen many sites whose product descriptions landed them in the doghouse.
In most cases, however, the main problem is that the site is using stock product descriptions that hundreds of other sites are also using. Google then sees that xx% of your site contains information that is exactly the same as on other sites and decides that your site is of low quality and should not rank well.
In your situation, though, you are talking about duplication within your own site, if I understand you right. You are saying that you would write one unique product description and then have slight variations of it on each page.
I worked with one site that sold not products but reports. The reports produced thousands of pages that were 90-95% duplicate content, with just a few words unique to each page. I believe the Panda issue was connected to this mass duplication.
You asked why large brands can get away with this. That is a really good question. This is a personal theory, not something I have proven, but I wonder if Google gives some leeway on Panda to brands that sell products. When I look at the site you mentioned, their product descriptions are thin and, yes, duplicated. There is almost no text on some of the pages. These are classic things that cause other sites to be affected by Panda.
I really think your best option is to use rel=canonical for these pages. This type of situation is exactly why Google developed rel=canonical (see http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394).
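To make the implementation concrete, here is a minimal sketch of the canonical element each variant page would carry, generated in Python for illustration. The domain, URL pattern, and slugs are hypothetical:

```python
# A minimal sketch, not a drop-in implementation: every color variant
# page emits a canonical link pointing at one master product URL.
# The domain and URL pattern below are hypothetical examples.

def canonical_tag(master_slug: str) -> str:
    """Return the <link rel="canonical"> element for a variant page's <head>."""
    return f'<link rel="canonical" href="https://www.example.com/products/{master_slug}/" />'

# All three color variants of one shirt claim the same master version:
for variant in ("shirt-red", "shirt-blue", "shirt-green"):
    print(f"{variant}: {canonical_tag('shirt')}")
```

With this in place, Google is asked to consolidate the variant pages' signals into the master URL rather than treating each color as a competing near-duplicate.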
-
Thanks.
I understand that this is considered best practice.
But do you know of a case, or have you heard of one, where this really was the likely cause of a Panda problem?
I noticed that some very big ecommerce sites with top-notch SEO, such as Zalando, actually publish color variations with near-identical product descriptions and no canonical or noindex. I was wondering whether Google is more understanding about content duplication when it comes to product variations on ecommerce sites, since it is generally not done for manipulative purposes.
-
As you describe your problem, it seems to me that you will have lots of very-near-duplicate content pages on your site. That can cause a Panda problem and wastes your website's ranking power.
There are multiple solutions:
- Combine all of these color (and similar) variants onto a single product page.
- Use rel=canonical to attribute the duplicates to a master version.
- Write truly unique descriptions for each variant, optimizing each one for a different query.

A tip for scaling that last option: use Excel to concatenate unique product descriptions, treating the different parts of each sentence as variables. A sketch of this approach follows below.
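Here is a minimal Python sketch of that concatenation idea (the same mechanics as CONCATENATE in Excel). The sentence fragments and color names are invented placeholders:

```python
# A minimal sketch of the Excel-concatenation idea in Python: assemble
# descriptions from interchangeable sentence fragments so each variant
# page gets different wording. All fragments below are made-up examples.
import itertools

openers = [
    "Crafted for everyday wear,",
    "Designed with comfort in mind,",
    "Built to last,",
]
bodies = [
    "this {color} shirt pairs a classic fit with breathable fabric.",
    "the {color} colorway features reinforced seams and a soft finish.",
]
closers = [
    "Machine washable and easy to care for.",
    "An effortless staple for any wardrobe.",
]

colors = ["navy", "olive", "burgundy"]
combos = itertools.cycle(itertools.product(openers, bodies, closers))

for color in colors:
    opener, body, closer = next(combos)
    print(f"{opener} {body.format(color=color)} {closer}\n")
```

Each variant ends up with a different opener/body/closer combination, so no two pages share a full sentence — though whether that is "unique enough" for Panda is exactly the open question discussed above.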