Which is the lesser of two evils? Duplicate Product Descriptions or Thin Content?
-
It is quite labour-intensive to come up with product descriptions for our whole product range: over 2,500 products, in English and Spanish.
When we started, we copy-pasted the manufacturers' descriptions, so they are not unique on the web, and some of them repeat each other.
We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and leaving only a very short unique phrase from the database (thin content)?
Thanks!
-
Very good answer - and yes, two bad choices, but limited resources mean I must choose one. Either that, or meta NOINDEX the dupes for the moment until they are rewritten.
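For reference (and to sanity-check my own plan), the tag I mean would go in the head of each duplicate page; something like this:

```html
<!-- In the <head> of each duplicated product page -->
<!-- noindex keeps the page out of the index; follow still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Then it comes out again once the rewritten description goes live.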
-
Good idea. Thank you.
-
I agree with you Kurt. In our space we see duplicate content everywhere, from manufacturer's sites to vendors to resellers. There is no such thing as a "duplicate content penalty." Google doesn't penalize duplicate content. They may choose to ignore it, which may feel like a penalty, but that's not technically what's going on.
I also agree with EGOL. If getting a lot of product descriptions written is a daunting task, hire some writers. You can get it done for way less than you think. Need inspiration? Watch Fabio's video from MozCon 2012, where in 15 minutes he describes how he and his team created thousands of unique product descriptions in a very short amount of time without spending a lot of money: http://moz.com/videos/e-commerse-seo-tips-and-tricks
Cheers!
Dana
-
I'd take duplicate content over thin content. There are tons of eCommerce sites out there with duplicate product descriptions. I don't think that Google is going to penalize you, per se, they just might not include your pages in the search results in favor of whatever site they think is the originator of the content.
The reason I think duplicate content is better is users. Either way, your search traffic is probably not going to be great: with duplicate content, the search engines may ignore your pages, and with thin content you haven't given them a reason to rank you. But at least with some real content on the pages, you may be able to convert the visitors you do get.
That said, I like Egol's suggestion. Don't write new product descriptions yourself. Hire a bunch of people to do it so they can crank out the new content real quick.
Kurt Steinbrueck
OurChurch.Com -
Tom... that is some of the best that I have seen in a long time.
Thanks!
-
Nothing like a bit of hyperbole to brighten up a Tuesday, is there?!
-
I'd rather deal with the duplicate content. Personally, I'd bounce quicker from thin or no content than from the same content on a different but similar product page. Of course, I wouldn't let the duplicate content sit there and hurt me; I'd add canonicals to pages that were similar. If it were the exact same content everywhere, that would drive me nuts. But if I can look at all the products and work out how many are the same with a minor variation versus how many are truly different product types, then I could write content for fewer pages and consolidate link equity with the canonical, without worrying about duplicate content penalizing me. Of course, I could always just noindex those duplicate pages instead.
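To make that concrete, a canonical on one of the near-duplicate variant pages would look something like this (the URLs here are invented purely for illustration):

```html
<!-- On the near-duplicate variant page, pointing at the version you want indexed -->
<link rel="canonical" href="http://www.example.com/products/widget-standard">
```

The variant then passes its signals to the main version instead of competing with it.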
-
With a gun to my head....
lol... Wow. That is a great way to word this.
So, my response is, yes, put a gun to my head and I will pick between these two bad choices.
Really, if you are paying someone to write all of this content you can hire one writer and have them take a year to do it... or you can hire 12 writers and have the job done in a month. Same cost either way.
-
With a gun to my head - I'd say thin content is "better" than mass duplicate content.
This is only based on helping to remove penalties from clients' sites - I see more instances of a Panda penalty when duplicate content is present rather than 'thin' content, as it were.
However, it's important to understand how the algorithm works. It will penalise pages based on content similarity: if a page has thin content on it, i.e. not a lot to differentiate it from another page on the domain, then technically Google will see it as a duplicate page with thin content on it.
Now, my line of thinking is that if there is more content on the page, but the majority of it is duplicate (i.e. physically more duplicate content on the page), then Google would see this as "worse". Similarly, taking product descriptions from one domain to another, and having duplicate content from other domains, seems to be penalised more frequently by the Panda algorithm than thin-content pages are (at least in my experience).
Your mileage may vary on this, but if forced into a temporary solution, thin content may be better for SEO - but conversely worse for a user, as there is less about the product on the page. The best solution of course will be to rewrite the descriptions, but obviously there's a need for a temporary solution.
Hope this helps.
Related Questions
-
Same content, different languages. Duplicate content issue? | international SEO
Hi, if the "content" is the same but written in different languages, will Google see the articles as duplicate content? And if Google won't see it as duplicate content, what is the benefit of implementing the alternate lang tag? Kind regards, Jeroen
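For reference, the alternate lang annotations in question look something like this (domain and paths are just examples):

```html
<!-- In the <head> of both language versions, cross-referencing each other -->
<link rel="alternate" hreflang="en" href="http://www.example.com/en/article/">
<link rel="alternate" hreflang="es" href="http://www.example.com/es/articulo/">
```

The point of the tags is less about duplication and more about serving the right language version to the right searcher.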
Bigcommerce & Blog Tags causing Duplicate Content?
Curious why Moz would pick up our blog tags as causing duplicate content, when each blog post has a rel canonical tag pointing to the post itself, and the tag pages point to the blog as a whole. I kind of want to get rid of the tags altogether now, but I also feel they could add some extra value to UX later on when we have many more blog posts. Does anyone know a way around this, or a best-practice solution for such odd issues? I can see why the duplicate content would happen, but when grouping content into categories?
Big problem with duplicate page content
Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop made with Prestashop. The Moz crawl report shows that I have over 4,000 duplicate page content errors; two weeks ago I had 1,400. The majority of the links flagged as duplicate content look like this:
http://www.sitename.com/category-name/filter1
http://www.sitename.com/category-name/filter1/filter2
At first I thought the filters didn't work, but when I browse the site and test them, I see that the filters are working and generate links like this:
http://www.sitename.com/category-name#/filter1
http://www.sitename.com/category-name#/filter1/filter2
The links without the # do not work; they mess up the filters.
Why are the pages being indexed without the #, generating duplicate content for me?
How can I fix this?
Thank you very much!
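One likely explanation, sketched here: everything after a # is never sent to the server, so crawlers only ever see the hash-less paths; if the theme or a sitemap outputs filter links without the #, those are the URLs that get indexed. A common stopgap, assuming the category page is the version you want indexed, is a canonical on the filter URLs:

```html
<!-- Output on http://www.sitename.com/category-name/filter1 and deeper filter combinations -->
<link rel="canonical" href="http://www.sitename.com/category-name">
```

That consolidates the filter variations without touching the working #-based filter UI.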
Two websites focused on different markets but with similar content
Hi all! I have a client who wants to branch out to another market (currently in Northern California and wants to open an office in Southern California), what would happen if we put up a second website that has similar content, but is exclusively for Southern California, with a different office address, and all the content geared towards Southern California market? There would be NO linking between the sites. Would that generate a penalty? Thanks! BB
Why is Google Reporting big increase in duplicate content after Canonicalization update?
Our web hosting company recently applied an update to our site that should have rectified canonicalized URLs. Webmaster Tools had been reporting duplicate content on pages that had a query string on the end. After the update there has been a massive jump in what Webmaster Tools reports: now over 800 pages of duplicate content, up from about 100 prior to the update, plus it is reporting some very odd pages (see attached image). They claim they have implemented canonicalization in line with Google Panda and Penguin, but surely something is not right here, and it's going to cause us a big problem with traffic. Can anyone shed any light on the situation?
Duplicate content clarity required
Hi, I have access to a massive resource of journals, and we have been given the all-clear to use the abstracts on our site and link back to the journals. These will be really useful links for our visitors, e.g. http://www.springerlink.com/content/59210832213382K2. Simply put: if we copy the abstract and then link back to the journal source, will this be treated as duplicate content and damage the site, or is the link to the source enough for search engines to realise that we aren't trying anything untoward? Would it help if we added an introduction, so in effect we are sort of following the curated-content model? We are also thinking of linking back internally to a relevant page using a keyword. Will this approach give any benefit to our site at all, or will the content be ignored as duplicate, rendering the internal links useless? Thanks, Jason
Advice needed on how to handle alleged duplicate content and titles
Hi, I wonder if anyone can advise on something that's got me scratching my head. The following are examples of URLs which are deemed to have duplicate content and title tags. This causes around 8,000 errors, which (for the most part) are valid URLs because they provide different views on market data; e.g. #1 is the summary, while #2 is 'Holdings and sector weightings'. #3 is odd because it's crawling the anchored link; I didn't think hashes were crawled. I'd like some advice on how best to handle these, because really they're just queries against a master URL, and I'd like to remove the noise around duplicate errors so that I can focus on some other true duplicate URL issues we have. Here are some example URLs on the same page which are deemed duplicates:
1. http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE
2. http://markets.ft.com/Research/Markets/Tearsheets/Holdings-and-sectors-weighting?s=IVPM:LSE
3. http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE&widgets=1
What's the best way to handle this?
Removing Duplicate Page Content
Since joining SEOmoz four weeks ago I've been busy tweaking our site, a Magento eCommerce store, and have successfully removed a significant portion of the errors. Now I need to remove/hide duplicate pages from the search engines, and I'm wondering what the best way to attack this is. Can I solve it in one central location, or do I need to do something in the Google and Bing webmaster tools? Here is a list of the duplicate content:
http://www.unitedbmwonline.com/?dir=asc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=asc&mode=list&order=name
http://www.unitedbmwonline.com/?dir=asc&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=list&order=name
http://www.unitedbmwonline.com/?dir=desc&order=name
http://www.unitedbmwonline.com/?mode=grid
http://www.unitedbmwonline.com/?mode=list
Thanks in advance, Steve
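One "central location" approach sometimes used for sort/view parameters like these is robots.txt wildcard rules. This is a sketch only, not Magento-specific advice; note that blocking crawl is not the same as removing already-indexed pages, so it's worth testing the patterns in Webmaster Tools first:

```text
# robots.txt - keep crawlers out of sort/view parameter variations
User-agent: *
Disallow: /*?dir=
Disallow: /*?mode=
```

These two patterns would cover every URL in the list above, since each begins with either ?dir= or ?mode=; a rel canonical on the parameterized views pointing at the clean category URL is the other common option.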