Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
-
Hello,
I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings.
Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions?
Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings?
Thank you for your help!
-
I think Alan and EGOL have summed it up nicely for you.
I have looked at a lot of Panda-hit sites, and one of the most common issues was e-commerce sites consisting primarily of stock product descriptions. Why would Google want to rank a site highly that just contains information hundreds of other sites have?
If a large chunk of your site contains duplicate descriptions like this, you can trigger a Panda flag, which can hurt rankings across your whole site, not just the product pages.
You could use the duplicate product descriptions if you surrounded them with a large amount of original and helpful text. However, no one knows what the ratio is. If you have the ability to rewrite the product descriptions, that is by far the best thing to do.
-
Just adding a point to this (and with reference to the other good points left by others) - Writing good product descriptions isn't actually that expensive!
It always seems that way, as they are usually done in big batches. On a per-product basis, however, they are pretty cheap. Done well, they will not only improve your search results but can also improve conversions and even make the pages more linkable.
Pick a product at random. Would it be worth a few £/$ to sell more of that item? If not, remove it from the site anyway.
-
Adding a lot of SKUs to your site in a relatively short amount of time by borrowing content from another site sounds more like a bad sales pitch than a good "opportunity". If you don't want to jeopardize a significant chunk of your business, simply drip the new SKUs in as you produce new content for them. The thin content isn't likely to win you any new search traffic, so unless adding it will quickly and dramatically increase sales from your existing traffic, why go down that road?
-
Adding emphasis to the danger:
Duplicate product descriptions are the single most problematic issue e-commerce sites face from an SEO perspective. Not only are most canned descriptions so short that they cause product pages to be considered thin on content; copied/borrowed descriptions are also likely to be spread across countless other sites.
While it may seem like an inordinate amount of time and cost, unique, quality descriptions that are long enough to truly identify product pages as worthy will go a long way toward proving a site deserves rankings and trust.
-
You can hit Panda problems doing this. If you have lots of this content the rankings of your entire site could be damaged.
Best to write your own content, or put this content on pages that are kept out of the index until you have replaced it with original content.
Or you could publish it to get into the index and replace it as quickly as possible.
The site you are getting this content from could be damaged as well.
-
You definitely could run into trouble here. Duplicate content of this type is meant to be dealt with on a page-level basis. However, if Google thinks it is manipulative, it can impact the domain as a whole. By "thinks" I really mean "if it matches certain patterns that manipulative sites use" - there is rarely an actual human review.
It is more complex than a simple percentage; likely many factors are involved. However, there is a solution!
You can simply add a noindex tag to the product pages that have non-original content. That'll keep them out of the index and keep you on the safe side of duplicate content issues.
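As a minimal sketch (the exact markup is up to you), the noindex suggestion above is a robots meta tag in each affected page's head; `noindex, follow` keeps the page out of the index while still letting crawlers follow its links:

```html
<!-- On each product page that uses the borrowed description -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```

Once a page gets its own original description, drop the tag so the page can be recrawled and indexed. If you can't edit the page markup, the equivalent `X-Robots-Tag: noindex` HTTP response header does the same job.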