Which is the lesser of two evils? Duplicate Product Descriptions or Thin Content?
-
It is quite labour-intensive to come up with product descriptions for our entire product range ... 2,500+ products, in English and Spanish...
When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), and some of them repeat each other.
We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and pulling a very short unique phrase from the database (i.e. thin content)?
Thanks!
-
Very good answer - and yes, two bad choices, but limited resources mean I must choose one. Either that, or meta NOINDEX the dupes for the moment until they are rewritten.
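For reference, a meta NOINDEX is just a tag in the <head> of each duplicate page - something like the snippet below. The "follow" directive is optional, but it lets crawlers keep following the links on the page while keeping the page itself out of the index:

<meta name="robots" content="noindex, follow">

Once a rewritten description goes live on a page, remove the tag and the page can be indexed again.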
-
Good idea. Thank you.
-
I agree with you, Kurt. In our space we see duplicate content everywhere, from manufacturers' sites to vendors to resellers. There is no such thing as a "duplicate content penalty." Google doesn't penalize duplicate content. They may choose to ignore it, which may feel like a penalty, but that's not technically what's going on.
I also agree with EGOL. If getting a lot of product descriptions written is a daunting task, hire some writers. You can get it done for way less than you think. Need inspiration? Watch Fabio's video from MozCon 2012, where in 15 minutes he describes how he and his team created thousands of unique product descriptions in a very short amount of time without spending a lot of money: http://moz.com/videos/e-commerse-seo-tips-and-tricks
Cheers!
Dana
-
I'd take duplicate content over thin content. There are tons of eCommerce sites out there with duplicate product descriptions. I don't think Google is going to penalize you, per se; they just might not include your pages in the search results in favor of whatever site they think is the originator of the content.
The reason I think duplicate content is better comes down to users. Either way, your search traffic is probably not going to be great: with duplicate content the search engines may ignore your pages, and with thin content you haven't given them a reason to rank you. But at least with some real content on the pages, you may be able to convert the visitors you do get.
That said, I like EGOL's suggestion. Don't write the new product descriptions yourself. Hire a bunch of people to do it so they can crank out the new content quickly.
Kurt Steinbrueck
OurChurch.Com -
Tom... that is some of the best advice I have seen in a long time.
Thanks!
-
Nothing like a bit of hyperbole to brighten up a Tuesday, is there?!
-
I'd rather deal with the duplicate content. Personally, I'd bounce quicker from thin or no content than I would from the same content on a different but similar product page. Of course, I wouldn't let the duplicate content sit there and hurt me... I'd add canonicals to pages that were similar. Now, if it were the exact same content everywhere, that would drive me nuts. But if I can look at all the products and work out how many are the same with a minor variation versus how many are truly different product types, then I could write content for that smaller set of pages and consolidate link equity with the canonical, without worrying about duplicate content penalizing me. Of course, I could always just NOINDEX those duplicate pages instead.
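To make that concrete: say /bag-large and /bag-small share the same description and differ only by size (hypothetical URLs, purely for illustration). Each variant page could point at whichever version is chosen as the primary by placing this in its <head>:

<link rel="canonical" href="https://www.example.com/bag" />

The canonical is a hint rather than a directive, but search engines generally consolidate the variants onto the canonical URL, so only one unique description needs to be written per product family.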
-
With a gun to my head....
lol... Wow. That is a great way to word this.
So, my response is, yes, put a gun to my head and I will pick between these two bad choices.
Really, if you are paying someone to write all of this content you can hire one writer and have them take a year to do it... or you can hire 12 writers and have the job done in a month. Same cost either way.
-
With a gun to my head - I'd say thin content is "better" than mass duplicate content.
This is only based on my experience helping to remove penalties from clients' sites - I see more instances of a Panda penalty when duplicate content is present than when content is merely 'thin', as it were.
However, it's important to understand how the algorithm works. It penalises pages based on content similarity - so if a page has thin content, i.e. not a lot to differentiate it from another page on the domain, Google will technically see it as a duplicate page that also happens to be thin.
Now, my line of thinking is that if there is more content on the page, but the majority of it is duplicate - i.e. physically more duplicate content on the page - then Google would see this as "worse". Similarly, taking product descriptions from one domain to another, and thereby having duplicate content across domains, seems to be penalised more frequently by the Panda algorithm than thin-content pages are (at least in my experience).
Your mileage may vary on this, but if forced into a temporary solution, thin content may be better for SEO - though conversely worse for the user, as there is less about the product on the page. The best solution, of course, is to rewrite the descriptions, but obviously a temporary solution is needed in the meantime.
Hope this helps.