Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Duplicate eCommerce Product Descriptions
-
I know that creating original product descriptions is best practice. What I don't understand is how other sites are able to generate significant traffic while still using duplicate product descriptions on all of their product pages. How are they not being penalized by Google?
-
In my experience as an SEO for a large eCommerce site (our own products), I tend to think that Google has a way of distinguishing eCommerce sites from purely informational ones and takes that into consideration when analyzing content.
As you say, Chris, many producers will distribute their catalogs to all of their dealers, who in turn will put those online. The same happens with our products here. Our dealers use the very descriptions we provide them with, and no one has ever been penalized for that.
As I said, I personally think that Google takes the intent of your site (eCommerce, informational, etc.) into consideration when handing out duplicate content penalties.
Having said that, I have no data to back up that claim, so go easy on me - it's only based on my gut feeling and practical observations.
-
I can definitely understand the frustration, but Google won't penalize sites - especially storefronts - for simply having duplicate content. Many merchants are provided with photos and product descriptions by the distributor, and when you're talking about hundreds or even thousands of products, it's just not feasible for a merchant to rewrite all of the descriptions - even more so if your inventory changes on a monthly or even weekly basis, because all of your changes get overwritten with the new upload.
A good example would be the SMC websites you see on late-night TV: they send out a CD of products to thousands of customers, and 98% of them just upload the database into their stores with little to no alteration. They won't be penalized, but they just won't be able to sell much.
In those cases, the sites aren't going to be penalized. And if those sites are ranking well without changing the content, then Google is definitely looking at other factors to make that decision (traffic, bounce rate, time on site, etc.).
The sites Google is penalizing are the ones that intentionally try to game the system by scraping content from other sites and reposting it with literally no changes at all, or sites that duplicate one of their stores multiple times in cookie-cutter fashion to trick the system into giving them multiple listings on the SERPs.
You haven't provided specific sites to review, so I can't give a definitive answer here, but they don't sound like they're trying to do anything black hat - they're just lazy. If your site will be selling the same products, altering your descriptions and images is the only way you'll get an advantage over them instead of just becoming "yet another one of those sites". Good luck!
-
Thanks for the Amazon comment, Chris :). I understand there are a multitude of variables behind this question, but after looking at a group of sites with similar backlink profiles, site architecture, etc., all of which use duplicate product descriptions, I am taken aback that they are not penalized. Even smaller sites that are not properly constructed or optimized use duplicate product descriptions and still drive traffic and rank. Then I read all about rewriting product descriptions from SEOmoz and others (information that gels with what I know to be true), yet I still see sites ranking with this thin/duplicate content.
Any thoughts?
-
That could be for a variety of reasons. Is that site the only one offering that particular product? Is it a highly trafficked site with a lot of backlinks, reviews, and online activity? Are the pages simply coded properly, using canonical tags that help them escape the "wrath"? These are all valid questions when you're doing competitive analysis, and all things Google weighs along with dozens of other factors.
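For reference, a canonical tag is just a link element in a page's head that tells search engines which URL should be treated as the definitive version when several URLs carry the same content. A minimal sketch - the URLs here are hypothetical, purely for illustration:

<!-- In the <head> of a duplicate or parameterized version of a product page, -->
<!-- e.g. https://www.example.com/blue-widget?ref=email (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/blue-widget" />

Note that this only consolidates duplicate URLs of your own pages; it does nothing about other sites reusing the same manufacturer description.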
Best practice is to create new descriptions and either take new photos or alter the existing ones (add text, crop, change the contrast, etc.). That way your listing is seen as fresh, original content and will eventually take precedence over their carbon-copy approach. If you have a better page with better content that's more informative to the customer, Google will choose your listing over the 20 other sites that all have the same photos and descriptions.
Originality always wins... in most cases. Keep in mind that there are many other considerations in Google's algorithms, so don't expect to beat out Amazon no matter how hard you try.
Related Questions
-
Product Descriptions (SEO)
So I would like a few opinions. How long should a product description be? Enough to get the point across? 100 words? 800 words? Overly detailed? Any advice would be appreciated.
On-Page Optimization | mattl990
-
Duplicate H3, H4 or H5 Tags
I know that duplicate H1 and H2 tags are a red flag for Google, but does the same apply to H3, H4, and H5 tags? A lot of my products have the same H5 tags, and I'm wondering whether or not that is pulling down my keyword rankings.
On-Page Optimization | moon-boots0
-
How unique should a meta description be?
I'm working on a large website (circa 25k pages) that presently just replicates each page title as its meta description. I'm thinking of doing a 'find and replace' in the database so I change: to where the preceding and following text would be the same in each case, e.g. Is this unique enough? Obviously the individual keyword would make it technically unique each time... and manually changing them would take the rest of my life 🙂
On-Page Optimization | abisti21
-
Duplicate page titles and hreflang tags
Moz is flagging a lot of pages on our site which have duplicate page titles. 99% of these are international pages with hreflang tags in the sitemap. Do I need to worry about this? I assumed that it wasn't an issue given the use of hreflang. And if that's the case, why is Moz flagging them as an issue? Thanks.
On-Page Optimization | ahyde0
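As an aside on hreflang: the annotations can live either in the XML sitemap (as in the question above) or as link elements in each page's head. A minimal sketch of the head version, with hypothetical URLs:

<!-- In the <head> of https://www.example.com/en/pricing (hypothetical URL) -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/pricing" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/pricing" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/pricing" />

Each language version lists itself and all of its alternates, so near-identical titles across those alternate URLs are expected; a crawler that flags duplicate titles purely on string matching will still report them.
-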
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed across a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field. To what extent would I need to rewrite each article so as to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout0
-
Should I add PDF manuals to my product pages?
Hello. A lot of the products I sell on my e-commerce site are very technical, so I decided to add PDF data sheets, manuals, etc. to each of the product pages to improve the customer experience. Now I am not sure it was the best thing to do. I have noticed a couple of times that the PDF is outranking the product page in the SERPs. For a few products, the PDF ranks but the product page doesn't. Anyone got any ideas?
On-Page Optimization | DavidLenehan0
-
Duplicate Content on Event Pages
My client has a pretty popular event listings service and, in the hope of gathering more events, they opened up the platform to allow users to add events. This works really well for them, and they are able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners will take the copy from their own websites and paste it in, duplicating a lot of the content. We have editor picks that contain a lot of unique content, but the duplicate content scares me. It hasn't hurt our page ranking (we have a page ranking of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we experience a significant advantage over people posting the same event?
On-Page Optimization | mattdinbrooklyn0
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:

Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)

Here are some examples of how URLs may look when searching for a recipe:

find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour

There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as: setting the canonical tag, adding these URL variables to Google Webmaster Tools to tell Google to ignore them, or changing the title tag in the head dynamically based on which URL variables are present. However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie
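For reference, the canonical option the asker mentions, plus one alternative (a robots meta tag) that isn't in the question but is commonly weighed against it, would look something like the sketch below on the parameterized result pages. The find-a-recipe.php URLs come from the question itself; the domain is hypothetical and neither option is offered here as a recommendation:

<!-- Option A: point filtered/paginated variants at the base search page, -->
<!-- e.g. in the <head> of find-a-recipe.php?course=salad&start=30 -->
<link rel="canonical" href="https://www.example.com/find-a-recipe.php" />

<!-- Option B: let crawlers follow the result links but keep the parameterized pages out of the index -->
<meta name="robots" content="noindex, follow" />

Option A consolidates everything onto one URL (and gives up the chance to rank individual filtered pages), while Option B simply keeps the thin parameter combinations out of the index; which fits better depends on whether any of those filtered result pages are worth ranking in their own right.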