Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
-
Hello,
I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings.
Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions?
Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings?
Thank you for your help!
-
I think Alan and EGOL have summed it up nicely for you.
I have looked at a lot of Panda-hit sites, and one of the most common issues was e-commerce sites consisting primarily of stock product descriptions. Why would Google want to rank a site highly when it just contains information that hundreds of other sites have?
If a large chunk of your site contains duplicate descriptions like this, you can attract a Panda flag, which can cause your whole site to rank poorly, not just the product pages.
You could use the duplicate product descriptions if you had a large amount of original and helpful text around it. However, no one knows what the ratio is. If you have the ability to rewrite the product descriptions this is by far the best thing to do.
-
Just adding a point to this (with reference to the other good points left by others): writing good product descriptions isn't actually that expensive!
It always seems expensive because they are usually done in big batches, but on a per-product basis they are pretty cheap. Do it well and you will not only improve your search results, but also improve conversions and make the pages more linkable.
Pick a product at random. Would it be worth a few £/$ to sell more of that item? If not, remove it from the site anyway.
-
Adding a lot of SKUs to your site in a relatively short amount of time by borrowing content from another site sounds more like a bad sales pitch than a good "opportunity". If you don't want to jeopardize a significant chunk of your business, simply drip the new SKUs in as you get new content for them. The thin content isn't likely to win you any new search traffic, so unless their addition will quickly and dramatically increase sales from your existing traffic sources, why go down that road?
-
Adding emphasis to the danger:
Duplicate product descriptions are the single most problematic issue e-commerce sites face from an SEO perspective. Not only are most canned descriptions so short that product pages get considered thin on content, but copied/borrowed descriptions are also likely to be spread across countless sites.
While it may seem like an inordinate amount of time and cost, unique, quality descriptions that are long enough to truly identify product pages as worthwhile will go a long way toward proving that a site deserves ranking and trust.
-
You can hit Panda problems doing this. If you have lots of this content the rankings of your entire site could be damaged.
It's best to write your own content, or to use this content on pages that are kept out of the index until you have replaced it with original content.
Or you could publish it to get in the index and replace as quickly as possible.
The site you are getting this content from could be damaged as well.
-
You definitely could run into trouble here. Duplicate content of this type is meant to be dealt with on a page-level basis. However, if Google thinks it is manipulative, it can impact the domain as a whole. By "thinks" I really mean "if it matches certain patterns that manipulative sites use"; there is rarely an actual human review.
It is more complex than a simple percentage; many factors are likely involved. However, there is a solution!
You can simply add a noindex tag to the product pages that have non-original content. That'll keep them out of the index and keep you on the safe side of duplicate-content issues.
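For anyone unfamiliar with it, the noindex tag is just a meta element placed in the page's `<head>`. A minimal sketch (the page title and comments here are illustrative, not from any specific site):

```html
<head>
  <title>Example Product - Borrowed Description</title>
  <!-- Keep this page out of the search index while it still
       carries the borrowed manufacturer description -->
  <meta name="robots" content="noindex, follow">
</head>
```

Using `noindex, follow` keeps the page out of the index while still letting crawlers follow its links. Once you have swapped in an original description, remove the tag so the page can be indexed.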