Which is the lesser of two evils? Duplicate Product Descriptions or Thin Content?
-
It is quite labour-intensive to come up with product descriptions for our whole product range: 2,500+ products, in English and Spanish.
When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), and some of them repeat each other.
We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and leaving only a very short unique phrase from the database (i.e. thin content)?
Thanks!
-
Very good answer - and yes, two bad choices, but limited resources mean I must choose one. Either that, or meta NOINDEX the dupes for the moment until they are rewritten.
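For reference, the meta NOINDEX approach is a one-line tag in the head of each duplicate page - a minimal sketch:

```html
<!-- Place in the <head> of each duplicate product page. -->
<!-- "noindex" asks engines to drop the page from the index; "follow" still
     lets crawlers follow (and pass equity through) the page's links. -->
<meta name="robots" content="noindex, follow">
```

Once a page gets its unique description, remove the tag and the page can return to the index on the next crawl.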
-
Good idea. Thank you.
-
I agree with you, Kurt. In our space we see duplicate content everywhere, from manufacturers' sites to vendors to resellers. There is no such thing as a "duplicate content penalty." Google doesn't penalize duplicate content. They may choose to ignore it, which may feel like a penalty, but that's not technically what's going on.
I also agree with EGOL. If getting a lot of product descriptions written is a daunting task, hire some writers. You can get it done for way less than you think. Need inspiration? Watch Fabio's video from MozCon 2012, where in 15 minutes he describes how he and his team created thousands of unique product descriptions in a very short amount of time without spending a lot of money: http://moz.com/videos/e-commerse-seo-tips-and-tricks
Cheers!
Dana
-
I'd take duplicate content over thin content. There are tons of eCommerce sites out there with duplicate product descriptions. I don't think that Google is going to penalize you, per se; they just might not include your pages in the search results, in favor of whatever site they think is the originator of the content.
The reason I think duplicate content is better is users. Either way, your search traffic is probably not going to be great. With duplicate content, the search engines may ignore your pages, and with thin content you haven't given them a reason to rank you. But at least with some real content on the pages, you may be able to convert the visitors you do get.
That said, I like EGOL's suggestion. Don't write the new product descriptions yourself. Hire a bunch of people to do it so they can crank out the new content real quick.
Kurt Steinbrueck
OurChurch.Com -
Tom... that is some of the best that I have seen in a long time.
Thanks!
-
Nothing like a bit of hyperbole to brighten up a Tuesday, is there?!
-
I'd rather deal with the duplicate content. Personally, I'd bounce quicker from thin or no content than I would from the same content on a different but similar product page. Of course, I wouldn't let the duplicate content sit there and hurt me - I'd add canonicals to pages that were similar. Now, if it were the exact same content everywhere, that would drive me nuts. But if I could look at all the products and work out how many are the same with a minor variation versus how many are truly different product types, then I could write content for fewer pages and consolidate link equity with the canonical, without worrying about duplicate content penalizing me. Of course, I could always just NOINDEX those duplicate pages instead.
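To illustrate the canonical approach - a sketch, assuming a set of near-identical variant pages (the product URLs here are hypothetical):

```html
<!-- In the <head> of each near-duplicate variant page
     (e.g. /widget-blue, /widget-red), point engines at the
     one version you want indexed and ranking: -->
<link rel="canonical" href="https://www.example.com/products/widget">
```

The variant pages stay live for users, but their link equity consolidates on the canonical URL.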
-
With a gun to my head....
lol... Wow. That is a great way to word this.
So, my response is, yes, put a gun to my head and I will pick between these two bad choices.
Really, if you are paying someone to write all of this content, you can hire one writer and have them take a year to do it... or you can hire 12 writers and have the job done in a month. Same cost either way.
-
With a gun to my head - I'd say thin content is "better" than mass duplicate content.
This is based only on my experience helping to remove penalties from clients' sites - I see more instances of a Panda penalty when duplicate content is present rather than 'thin' content, as it were.
However, it's important to understand how the algorithm works. It penalises pages based on content similarity - so if a page has thin content on it (i.e. not a lot to differentiate it from another page on the domain), technically Google will see it as a duplicate page with thin content on it.
Now, my line of thinking is that if there is more content on the page, but the majority of it is duplicate - i.e. physically more duplicate content on the page - then Google would see this as "worse". Similarly, taking product descriptions from one domain to another, and so having duplicate content from other domains, seems to be penalised more frequently by the Panda algorithm than thin-content pages are (at least in my experience).
Your mileage may vary on this, but if forced into a temporary solution, thin content may be better for SEO - though conversely worse for the user, as there is less about the product on the page. The best solution, of course, is to rewrite the descriptions, but obviously a temporary fix is needed in the meantime.
Hope this helps.
-
Related Questions
-
Will I be flagged for duplicate content by Google?
Hi Moz community,
I have a question regarding duplicate content that I can't seem to find the answer to on Google. My agency is working on a large number of franchisee websites (over 40) for one client, a print franchise, that wants a refresh of its copy and SEO. Each print shop has its own 'microsite' on its own unique domain; all services and products are the same, the only difference being the location.
To avoid writing the same content over and over in 40+ variations, would all the websites be flagged by Google for duplicate content if we were to use the same base copy, with the only changes being the store locations (i.e. where we mention a Toronto print shop on one site, it may change to a Kelowna print shop on another)? Since the print franchise owns all the domains, I'm wondering if that would even be a problem, as the sites aren't really competing with one another.
Any input would be greatly appreciated. Thanks again!
-
Duplicated content on multi-language / regional websites
Hi Guys,
I know this question has been asked a lot, but I wanted to double-check, since I just read a comment by Gianluca Fiorelli on this topic (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) which made me doubt my research.
The case: a Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries. They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- A language meta tag (for Bing)
Optionally, they are willing to take the following steps:
- Cross-linking every page through a language flag or similar navigation in the header
- Investing in gaining local .be backlinks
- Changing the server locations so each website matches its country (not necessary in my opinion, since the ccTLD should make this irrelevant)
The content on the website will be at least 95% duplicated. They would like to rank the .be version in Belgium and the .nl version in the Netherlands. Are these steps enough to make sure the .be version gets shown for queries from Belgium and the .nl version for queries from the Netherlands? Or would this cause a duplicate content issue, resulting in one version being filtered out? If that's the case, we should use the canonical tag, and then we can't rank the .be version of the website.
Note: this company is looking for a quick conversion-rate win. They won't invest in rewriting every page and/or blog post. The less effort they have to put into this, the better (I know that's sacrilege when talking about SEO). Gaining local backlinks, for example, would bring a lot of costs with it.
I would love to hear from you guys.
Best regards, Bob van Biezen
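For context, a minimal hreflang sketch for a .nl/.be pair like this (domains and paths are hypothetical):

```html
<!-- In the <head> of every page, on both sites; each page lists itself
     and its alternate so the annotations are reciprocal. -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/pagina/">
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/pagina/">
<!-- Optional fallback for users who match neither region: -->
<link rel="alternate" hreflang="x-default" href="https://www.example.nl/pagina/">
```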
-
SEO effect of content duplication across a hub of sites
Hello,
I have a question about a website I have been asked to work on. It is for a real estate company which is part of a larger company. Along with several other (rival) companies, it has a website of property listings which receives a feed of properties from a central hub site - so there is lots of potential for page, title, and meta content duplication (if it isn't already occurring) across the whole network of sites. In early investigation I don't see any of these sites ranking well at all in Google for the expected search phrases. Before I start working on things that might improve their rankings, I wanted to ask you a few questions:
1. How would such duplication (if it is occurring) affect the SEO rankings of these sites individually, or the whole network/hub collectively?
2. Is it possible to tell if such a site has been "burnt" for SEO purposes, especially by any duplication?
3. If such a site or the network has been totally burnt, are there any approaches or remedies that can improve the site's SEO rankings significantly, or is the only/best option to start again from scratch with a brand-new site, ensuring the use of new meta descriptions and unique content?
Thanks in advance, Graham
-
Unique Content Below Fold - Better to Move Above Fold?
I have a page with a Google Map taking up 80% of the space above the fold (the rest is content which is not unique to my site), and all the unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering making the Google Map a quarter of its size so I can get my unique content higher up.
Questions: do we have any evidence or sound reasoning why I should / should not make this move? Is the content really considered below the fold, or will Google see that it is simply a large map I have on the site and therefore actually consider the content to be above the fold?
Thank you
-
Penalized for Duplicate Page Content?
I have some high-priority notices regarding duplicate page content on my website, www.3000doorhangers.com. Most of the pages listed here are our sample pages: http://www.3000doorhangers.com/home/door-hanger-pricing/door-hanger-design-samples/
On the left side of the page you can go through the different categories. Most of the category pages have similar text; we mainly just changed the industry on each page. Is this something that Google would penalize us for? Should I go through all the pages and use completely unique text for each one? Any suggestions would be helpful.
Thanks! Andrea
-
HTTP and HTTPS duplicate content?
Hello,
This is a quick one (or two). 🙂 If a page is accessible on both http and https, does that count as duplicate content? And what about external links pointing to my website - does it matter whether they point to the http or the https page?
Regards, Cornel
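For what it's worth, a common way to consolidate the two versions (besides a site-wide 301 redirect to https) is a canonical tag that points at the https URL from both versions - a sketch, with a hypothetical URL:

```html
<!-- Serve this same tag on both the http and the https version of the page,
     so engines index (and credit links to) only the https URL. -->
<link rel="canonical" href="https://www.example.com/page/">
```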
-
Duplicate content that looks unique
OK, bit of an odd one. The SEOmoz crawler has flagged the following pages as duplicate content. Does anyone have any idea what's going on?
http://www.gear-zone.co.uk/blog/november-2011/gear$9zone-guide-to-winter-insulation
http://www.gear-zone.co.uk/blog/september-2011/win-a-the-north-face-nuptse-2-jacket-with-gear-zone
http://www.gear-zone.co.uk/blog/july-2011/telephone-issues-$9-2nd-july-2011
http://www.gear-zone.co.uk/blog/september-2011/gear$9zone-guide-to-nordic-walking-poles
http://www.gear-zone.co.uk/blog/september-2011/win-a-the-north-face-nuptse-2-jacket-with-gear-zone
https://www.google.com/webmasters/tools/googlebot-fetch?hl=en&siteUrl=http://www.gear-zone.co.uk/
-
"Duplicate" Page Titles and Content
Hi All,
This is a rather lengthy one, so please bear with me!
SEOmoz has recently crawled 10,000 webpages from my site, FrenchEntree, and has returned 8,000 duplicate page content errors. The main reason I have so many is the directories I have on the site. The site is broken down into two levels of hierarchy: "weblets" and "articles". A weblet is a landing page, and articles are created within these weblets. Weblets can hold any number of articles - 0 to 1,000,000 (in theory) - and an article must be assigned to a weblet in order for it to work. Here's roughly how it looks in URL form: http://www.mysite.com/[weblet]/[articleID]/
Now, our directory results pages are weblets with standard content in the left and right columns, while the information in the middle column is pulled in from our directory database following a user query. This happens by adding the query string to the end of the URL. We have 3 main directory databases, but perhaps around 100 weblets promoting various 'canned' queries that users may want to navigate straight into. However, any one of the 100 directory-promoting weblets could return any query from the parent directory database given the correct query string.
The problem with this method (as pointed out by the 8,000 errors) is that each possible permutation of a search is considered its own URL, and therefore its own page. The example I will use is the first alphabetically, "Activity Holidays in France":
http://www.frenchentree.com/activity-holidays-france/ - This link shows you a results weblet without the query at the end, and therefore only displays the populated left and right columns.
http://www.frenchentree.com/activity-holidays-france/home.asp?CategoryFilter= - This link shows you the same weblet with an 'open' query on the end, i.e. display all results from this database. Listings are displayed in the middle. There are around 500 different URL permutations for this weblet alone once you take into account the various categories and cities a user may want to search in.
What I'd like to do is prevent SEOmoz (and therefore search engines) from counting each individual query permutation as a unique page, without harming the visibility that the directory results receive in SERPs. We often appear in the top 5 for quite competitive keywords, and we'd like it to stay that way. I also wouldn't want the search engine results to display only (and therefore direct users through to) an empty weblet because of some robots exclusion or canonical classification.
Does anyone have any advice on how best to remove the "duplication" problem whilst keeping the search visibility? All advice welcome.
Thanks
Matt
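One pattern sometimes used for this kind of parameter duplication - offered only as a sketch, since the questioner explicitly wants the results pages themselves to stay visible - is to canonicalise every filter permutation to a single preferred version of the results page, such as the 'all results' view (the filter parameters below are hypothetical):

```html
<!-- On each filtered permutation, e.g.
     /activity-holidays-france/home.asp?CategoryFilter=walking&City=Nice,
     point engines at one preferred version of the results page: -->
<link rel="canonical"
      href="http://www.frenchentree.com/activity-holidays-france/home.asp?CategoryFilter=">
```

Whether the preferred URL should be the open query or a cleaner landing URL depends on which version actually satisfies the searcher.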