Duplicate content across similar computer "models" and how to properly handle it.
-
I run a website that serves a niche rugged computer market. There are several "main" models of each computer, and each main model has several hundred (300-400) "sub" models that vary only by specifications. My problem is that I can't really consolidate each model onto one product page to avoid duplicate content. A drop-down list of every sub model would be massive and confusing to the customer, when they could just search for the exact model they need. I would also say 80-90% of the market searches for a specific model number, whether on our site or in Google. A lot of our customers are city governments, fire departments, police departments, etc. They get a list of approved models and purchase off that list; they don't really search by specs or "configure" a model, so giving each model number a chance to rank is important. Currently all the models in each sub category rel=canonical back to the main category page for that model. Is there a better way to go about this? On an example page you can see how there are several models where all the product descriptions are the same and only the model number varies; writing a unique description for each one is an unrealistic possibility for us. Any suggestions would be appreciated. I keep going back and forth on what the correct solution would be.
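For reference, here's a simplified sketch (placeholder URLs and a hypothetical helper, not our real pages or code) of how the current canonical setup works compared with letting each sub-model page stand on its own:

```python
# Simplified sketch (placeholder URLs and model numbers, not real site data)
# of the current setup: every sub-model page points its canonical at the
# parent category page, versus the alternative of a self-referencing canonical.

CATEGORY_URL = "https://example.com/rugged/cf-19/"  # hypothetical parent category page

def canonical_tag(sub_model: str, consolidate: bool) -> str:
    """Return the <link rel="canonical"> tag for a sub-model page."""
    if consolidate:
        # Current approach: all sub-models defer to the category page,
        # so individual model numbers give up their chance to rank.
        href = CATEGORY_URL
    else:
        # Alternative: each sub-model page canonicalises to itself and
        # stays indexable for its exact model-number query.
        href = f"https://example.com/rugged/cf-19/{sub_model.lower()}/"
    return f'<link rel="canonical" href="{href}">'

print(canonical_tag("CF-1956Y6XLM", consolidate=True))
print(canonical_tag("CF-1956Y6XLM", consolidate=False))
```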
-
Do people tend to search for "CF-19" in the Toughbook example, or do they tend to search for "CF-1956Y6XLM"?
If it's CF-19, then I would add more value to pages like your example and not worry about the sub-pages as much. But I'm guessing it's the specific model numbers, in which case the ideal situation is to have an indexable page for each exact model number. If you look at the "CF-1956Y6XLM" example, PC World is ranking #1 with pretty much nothing but spec content, meaning they're coasting on domain authority to rank those pages. Meanwhile, I see you at #4. Typically I would say it's a bad plan to go with really thin content, but if everyone else is doing it, you may not need 200-300 words to move up in the rankings. Try producing 50-75 custom words on 100 of these pages where you're already ranking in the top 5. Do it for newer models so you can monitor the ranking improvement over time. If the ranking and traffic improvements happen, and those visits convert, then figure out whether you can scale that process up for every new incoming product.
Other SERP benefits beyond rankings can help here, too. If you can get legitimate product ratings and generate rich snippets for the products, that will help maximize your CTR. Try to write better meta descriptions, too - right now they're all pretty drab on that SERP example.
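If it helps, here's a rough sketch - hypothetical placeholder values only - of the kind of schema.org Product markup (output as JSON-LD) that can earn those review-star rich snippets once a page has genuine ratings:

```python
import json

# Rough sketch (all values are hypothetical placeholders) of schema.org
# Product markup, emitted as JSON-LD, that can qualify a product page
# for review-star rich snippets once it has genuine ratings.

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Rugged convertible notebook CF-1956Y6XLM",  # hypothetical listing name
    "sku": "CF-1956Y6XLM",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "11",
    },
    "offers": {
        "@type": "Offer",
        "price": "1899.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output on the product page inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(product, indent=2))
```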
Martijn's suggestion of reviews is a good start, but it will probably only help on the 10-20% of pages you're able to get reviews for. Nevertheless, it's probably worth the effort.
Some e-commerce platforms let you save a single product with variations, which helps with this problem. If 10 models can share a page and be selected through a product sub-menu (like the t-shirt size or color selector on a fashion e-commerce site), that's a good way to cut the total number of URLs by 50-90%. But I'd try the unique content route first and see if the numbers add up.
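To make the variant idea concrete, here's a rough sketch - hypothetical model numbers, URLs, and specs - of how sub-models could hang off one indexable page:

```python
from dataclasses import dataclass, field

# Rough sketch (hypothetical model numbers, URLs, and specs) of the
# "one page, many variants" idea: sub-models become selectable options
# on a single indexable URL instead of hundreds of near-duplicate pages.

@dataclass
class ProductPage:
    url: str                                      # the one canonical, indexable URL
    base_description: str                         # shared copy, written once
    variants: dict = field(default_factory=dict)  # model number -> spec differences

cf19 = ProductPage(
    url="https://example.com/rugged/cf-19/",
    base_description="Fully rugged convertible notebook ...",
)

# Only the specs that differ live on each variant; everything else is shared.
cf19.variants["CF-1956Y6XLM"] = {"cpu": "i5-3320M", "ram_gb": 4, "storage": "500GB HDD"}
cf19.variants["CF-19XXXXXXX"] = {"cpu": "i5-3320M", "ram_gb": 8, "storage": "256GB SSD"}

# A variant selector (like a size/colour picker) swaps the spec table in place,
# so the URL count drops while every model number still appears in on-page copy.
for model, specs in cf19.variants.items():
    print(model, specs)
```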
-
I was afraid of this answer. If it were a static product I would be happy to do this, but since it's technology, in 6-8 months the next "generation" will be out with new model numbers, each needing its own description written, which is incredibly difficult to keep up with.
Is there a middle-of-the-road option? Is rel=canonical my best choice if I can't do unique content for every single model?
If so, is there a way to maximize the benefit of rel=canonical in this situation?
-
Reviews can work perfectly as user-generated content to make the content a bit more unique. It's an easy one and I'm probably kicking in an open door here, but depending on how many units you sell of a specific version, reviews might help you both extend the content and make it more unique.
-
It's a very tough question and a common one across a lot of e-commerce.
The only really complete solution I have that addresses each of your needs is to not base the page "content" on the specs.
Make the specs a table on the page, but add enough unique copy about each model and variation that every page truly stands on its own.
I know this solution means writing at least, say, 200-300 words of unique content for every model, but roughly 100k words solves the whole issue. It just depends on whether having them all rank is worth it. This solution gives you:
a) unique content
b) chance for every page to rank & no canonicals back to one page
c) much more long tail search volume
d) specific searches for every one of your potential customers.
That's really the best I can do ... it takes the duplicate content issue away and solves every problem except the one of having to create this much content in the first place.