What is the lesser of two evils? Duplicate Product Descriptions or Thin Content?
-
It is quite labour-intensive to come up with product descriptions for our entire product range... more than 2,500 products, in English and Spanish...
When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), plus some of them repeat each other -
We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and pulling a very short unique phrase from the database (i.e. thin content)?
Thanks!
-
Very good answer - and yes, two bad choices, but limited resources mean I must choose one. Either that, or meta NOINDEX the dupes for the moment until they are rewritten.
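For anyone following along, the NOINDEX option mentioned here is a one-line tag. A minimal sketch (the exact placement in your template depends on your platform) would be:

```html
<!-- Placed in the <head> of each duplicate product page until it is rewritten.
     "noindex" keeps the page out of the index; "follow" still lets crawlers
     follow its links, so internal linking is preserved. -->
<meta name="robots" content="noindex, follow">
```

Once a page has a unique description, remove the tag so the page can be indexed again.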
-
Good idea. Thank you.
-
I agree with you, Kurt. In our space we see duplicate content everywhere, from manufacturers' sites to vendors to resellers. There is no such thing as a "duplicate content penalty." Google doesn't penalize duplicate content. They may choose to ignore it, which may feel like a penalty, but that's not technically what's going on.
I also agree with EGOL. If writing a lot of product descriptions is a daunting task, hire some writers. You can get it done for way less than you think. Need inspiration? Watch Fabio's video from MozCon 2012 where, in 15 minutes, he describes how he and his team created thousands of unique product descriptions in a very short amount of time without spending a lot of money: http://moz.com/videos/e-commerse-seo-tips-and-tricks
Cheers!
Dana
-
I'd take duplicate content over thin content. There are tons of eCommerce sites out there with duplicate product descriptions. I don't think Google is going to penalize you, per se; they just might not include your pages in the search results in favor of whatever site they think is the originator of the content.
The reason I think duplicate content is better comes down to users. Either way, your search traffic is probably not going to be too great. With duplicate content, the search engines may ignore your pages, and with thin content you haven't given them a reason to rank you. But at least with some real content on the pages, you may be able to convert the visitors you do get.
That said, I like EGOL's suggestion. Don't write the new product descriptions yourself. Hire a bunch of people to do it so they can crank out the new content quickly.
Kurt Steinbrueck
OurChurch.Com -
Tom... that is some of the best that I have seen in a long time.
Thanks!
-
Nothing like a bit of hyperbole to brighten up a Tuesday, is there?!
-
I'd rather deal with the duplicate content. Personally, I'd bounce quicker from thin or no content than I would from the same content on a different but similar product page. Of course, I wouldn't let the duplicate content sit there and hurt me... I'd add canonicals to pages that were similar. Now, if it were the exact same content everywhere, that'd drive me nuts. But if I can look at all the products and work out how many are the same with minor variations and how many are truly different product types... then I could write content for fewer pages and consolidate link equity with the canonical, without worrying about duplicate content penalizing me. Of course, I could always just NoIndex those duplicate pages instead.
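To make the canonical suggestion concrete, here is a minimal sketch, assuming two near-identical colour variants of one product (the URLs are made up for illustration). Each variant page points at a single preferred URL:

```html
<!-- In the <head> of /products/camera-red and /products/camera-blue:
     both declare the main product page as the canonical version, so
     search engines consolidate signals onto one URL. -->
<link rel="canonical" href="https://www.example.com/products/camera">
```

Unlike noindex, a canonical is a hint rather than a directive, but when the pages really are near-duplicates it is usually respected.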
-
With a gun to my head....
lol... Wow. That is a great way to word this.
So, my response is, yes, put a gun to my head and I will pick between these two bad choices.
Really, if you are paying someone to write all of this content you can hire one writer and have them take a year to do it... or you can hire 12 writers and have the job done in a month. Same cost either way.
-
With a gun to my head - I'd say thin content is "better" than mass duplicate content.
This is only based on my experience helping to remove penalties from clients' sites - I see more instances of a Panda penalty when duplicate content is present than when 'thin' content is, as it were.
However, it's important to understand how the algorithm works. It penalises pages based on content similarity - so if a page has thin content on it, i.e. not a lot to differentiate it from another page on the domain, then technically Google will see it as a duplicate page with thin content on it.
Now, my line of thinking is that if there is more content on the page, but the majority of it is duplicate - i.e. physically more duplicate content on the page - then Google would see this as "worse". Similarly, taking product descriptions from one domain to another, and having duplicate content from other domains, seems to be penalised more frequently by the Panda algorithm than thin-content pages alone (at least in my experience).
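To make the "content similarity" idea concrete: Google's actual algorithm is unpublished, but a common textbook way to estimate how duplicate two pages are is word shingling with Jaccard overlap. This is a rough illustrative sketch, not what Panda actually does; the sample descriptions are made up:

```python
def shingles(text, k=3):
    """Split text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard overlap of two pages' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Two product descriptions that differ only in their last few words
page_a = "compact digital camera with 20x optical zoom and image stabilisation"
page_b = "compact digital camera with 20x optical zoom and waterproof casing"
print(round(jaccard_similarity(page_a, page_b), 2))  # → 0.6
```

A score near 1.0 means the pages are near-duplicates; the higher the proportion of copied text on a page, the higher this kind of score, which matches the intuition above that "physically more duplicate content" looks worse.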
Your mileage may vary on this, but if forced into a temporary solution, thin content may be better for SEO - but conversely worse for a user, as there is less about the product on the page. The best solution of course will be to rewrite the descriptions, but obviously there's a need for a temporary solution.
Hope this helps.