Duplicate product content/disclaimers for non-e-commerce sites
-
This is more of a follow-up to Rand's recent Whiteboard Friday, "Handling User-Generated & Manufacturer-Required Duplicate Content Across Large Numbers of URLs." I posed my question in the comments, but I'm not sure it will get picked up.
My situation isn't exactly the same, but it's similar:
Our site isn't an e-commerce site and doesn't have user reviews yet, but we do have maybe 8 pages across 2 product categories featuring very similar product features with duplicate verbiage. However, we don't want to rewrite it, because we want to make it easy for users to compare apples to apples and see which features are actually different. We also have to run disclaimers at the bottom of each page.
Would iframing the product descriptions and disclaimers be beneficial in this scenario, combined with the addition of good content? It would still be nice to have some crawlable content on those pages, so the iframing makes me nervous unless we compensate with at least some useful, above-the-fold content that could be indexed.
Thanks, Sarah
-
Good points, David, and I'd agree. If the duplicate content issues are relatively small-scale, I'd generally opt to keep that content accessible to the engines and simply work to add my own unique content/value/features to those pages to help make them unique (you may already be doing this through design/UX, etc.).
-
Hi Sarah,
I wouldn't recommend using iframes to deliver the product descriptions. In my experience, sites with small-scale duplicate content issues will consistently outperform sites with little to no indexable content (which is what your pages would ultimately have if that content were delivered via iframe).
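To make that concrete, here is a minimal sketch (hypothetical URLs and product name) of the iframe approach being weighed up: the shared description and disclaimer live at their own URLs and are only framed in, so search engines generally credit that text to the framed documents rather than to the product page itself.

```html
<!-- Hypothetical sketch of the iframe approach from the question.
     The duplicated description/disclaimer text lives at separate URLs
     and is only framed here, so it generally doesn't count as
     indexable content belonging to this product page. -->
<h1>Widget Pro 3000</h1>

<iframe src="/shared/widget-pro-description.html"
        title="Shared product description" width="100%" height="400"></iframe>

<iframe src="/shared/standard-disclaimer.html"
        title="Legal disclaimer" width="100%" height="150"></iframe>
```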
There are a few variables that affect things, most importantly the volume of the duplicate content. If it's 8 pages, then I wouldn't consider this a high-risk problem. The solution would be to add as much valuable additional content as you can to these pages, whilst keeping the core content (the descriptions) in place.
If I've misunderstood and the scale is much bigger, it may be worth considering an alternative solution. Rather than using alternative content delivery methods, I'd recommend putting your focus on adding additional unique content. One of the simplest ways of doing this is encouraging (or manually adding) user-generated reviews of the two similar products. That way your customers can still compare apples with apples in the descriptions, but there's plenty of unique and helpful information on the page too.
I hope that helps
Cheers
Related Questions
-
Duplicate Content - Pricing Plan tables
Hey guys, we're faced with a problem we want to solve. We're working on the designs for a few pages for a drag & drop email builder, and we'll be using the same pricing table on several of them (much like Moz does). We're worried that Google will treat this as duplicate content and not be very fond of it. Any ideas about how we could keep the same flow without potentially harming ranking efforts? And no, rewriting the content for each table is not an option; it would do nothing but confuse the heck out of our clients. 😄 Thanks, everybody!
On-Page Optimization | andy.bigbangthemes
Can I robots.txt an entire site to get rid of Duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups for the same product (B2B and B2C). Zendesk does not give me the option to change canonicals (or meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option, or is there a better one? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose rankings by pointing the established page canonicals on my site to the new subdomain (the only option offered through Zendesk)? Thank you.
On-Page Optimization | RoxBrock
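For the Zendesk question above, blocking one of the two help-center subdomains outright would look roughly like the sketch below (hypothetical hostname). Bear in mind that robots.txt only prevents crawling; URLs that are linked from elsewhere can still end up indexed without a snippet, so a noindex directive or a canonical is usually the safer tool where the platform supports one.

```
# Sketch only: robots.txt served at the root of the subdomain to be blocked,
# e.g. https://b2c-support.mysite.com/robots.txt (hypothetical hostname).
# This tells compliant crawlers to skip every path on this host.
User-agent: *
Disallow: /
```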
Duplicate Page content | What to do?
Hello guys, I have some duplicate pages detected by Moz. Most of the URLs are from a user registration process, so the URLs all look like this: www.exemple.com/user/login?destination=node/125%23comment-form What should I do? Add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools? Thanks in advance! Pedro Pereira
On-Page Optimization | Kalitenko2014
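For the registration-URL question above, the robots.txt entry would be a path-based disallow rather than a site-wide block; a rough sketch (path taken from the example URL) follows. This stops crawling of those URLs but is not a guaranteed de-indexing tool; a meta robots noindex on the login page is the other common approach.

```
# Sketch only: keep crawlers away from the user login/registration URLs,
# including variants with a ?destination= query string appended.
User-agent: *
Disallow: /user/login
```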
Multilingual site with untranslated content
We are developing a site that will have several languages. There will be several thousand pages, and the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but the navigation/boilerplate will be translated. We have hreflang alternate tags set up on each individual page pointing to each of the other language versions, e.g. the English version points to the Spanish and French versions, and the Spanish version points to the French and English versions, etc. My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages? I am aware that from a user perspective, having untranslated content is bad, but in this case it is unavoidable at first.
On-Page Optimization | jorgeapartime
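For the multilingual question above, a typical set of hreflang annotations (hypothetical URLs) looks like the sketch below; the same block, self-reference included, is normally repeated in the head of every language version.

```html
<!-- Hypothetical hreflang block for the English version of a page.
     The Spanish and French versions carry the same set of annotations. -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/widgets/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/widgets/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/widgets/" />
```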
Using a lightbox - possible duplicate content issues
Redesigning a website in WordPress and going to use the following lightbox plugin: http://www.pedrolamas.pt/projectos/jquery-lightbox/ Naming the original images that appear on screen as, say, 'sweets.jpg' and the bigger versions of the images as 'sweets-large.jpg'. Alt-text-wise, I would give the two versions of each image slightly different descriptions. Do you think there would be any duplicate content issues with this? Anything I should do differently? I'm very wary of doing anything that Google is likely to think is naughty, so I want to stay on their good side! Cheers, T
On-Page Optimization | Jon-C
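For the lightbox question above, the markup such plugins typically hook into looks roughly like this (hypothetical filenames based on the question). The larger image is only linked and opened in an overlay rather than published as a separate HTML page, which is part of why the duplicate content risk here is low.

```html
<!-- Typical lightbox pattern (hypothetical filenames): the on-screen
     thumbnail links to the larger file, which the plugin opens in an
     overlay instead of a separate page. -->
<a href="/images/sweets-large.jpg" class="lightbox"
   title="Pick and mix sweets, enlarged view">
  <img src="/images/sweets.jpg" alt="Jar of pick and mix sweets (thumbnail)">
</a>
```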
How do I get rid of duplicate page titles when using a php site?
Hi. I have an e-commerce site that sells a list of products. The list is divided into categories, and those categories become the various pages on the site. One page, for example, would be root/products.php?c=40; another would be root/products.php?c=41. Is there a way to structure the site with SEO in mind?
On-Page Optimization | curtisgibbsiii
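For the PHP question above, one way to tackle the duplicate titles specifically is to derive a unique title from the c parameter instead of hard-coding a single title in products.php. A minimal sketch with hypothetical category names (in practice these would come from the product database):

```php
<?php
// Sketch only: map the ?c= category ID to a readable name so each
// category page gets its own <title>. Hypothetical data; a real build
// would pull these names from the database.
$categories = array(
    40 => 'Writing Desks',
    41 => 'Office Chairs',
);

$c = isset($_GET['c']) ? (int) $_GET['c'] : 0;
$categoryName = isset($categories[$c]) ? $categories[$c] : 'All Products';
?>
<title><?php echo htmlspecialchars($categoryName); ?> | Example Store</title>
```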
Duplicate product urls
Our site automatically creates shorter URLs for the products. There is a rel canonical tag in place, but Webmaster Tools shows that these URLs have duplicate title tags. Here is an example: http://www.colemanfurniture.com/holden-desk.htm http://www.colemanfurniture.com/writing-desks-secretary-desks/holden-desk.htm Should the longer URL be redirected to the shorter one?
On-Page Optimization | thappe
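For the product-URL question above, if the shorter URL is the preferred version, the canonical tag on the longer URL would point at it as in the sketch below (URLs taken from the question); a 301 redirect from the longer to the shorter URL consolidates signals even more decisively.

```html
<!-- Placed in the <head> of the longer URL
     (/writing-desks-secretary-desks/holden-desk.htm), telling search
     engines that the short URL is the version to index. -->
<link rel="canonical" href="http://www.colemanfurniture.com/holden-desk.htm">
```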
Duplicate content Issue
I'm getting a report of duplicate titles and content on: http://www.website.com/ http://www.website.com/index.php Of course, they're the same page, but does this need to be corrected somehow? Thanks!
On-Page Optimization | dbaxa-261338
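For the index.php question above, the usual fix is a 301 redirect from /index.php to the root URL so only one version is crawled and indexed. A sketch for an Apache server with mod_rewrite enabled follows (an assumption; the question doesn't say what the site runs on):

```apache
# Sketch only: permanently redirect direct requests for /index.php to /,
# while still allowing Apache to serve index.php internally for the root URL.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php[\s?]
RewriteRule ^index\.php$ / [R=301,L]
```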