Duplicate content across similar computer "models" and how to properly handle it.
-
I run a website that serves a niche rugged computer market. Each computer has several "main" models, and each main model has several hundred (300-400) "sub" models that vary only by specification. My problem is that I can't really consolidate each model into one product page to avoid duplicate content: a drop-down list of that size would be massive and confusing to customers, who could otherwise just search for the exact model they need.

I'd also estimate that 80-90% of the market searches for a specific model number, whether on our site or in Google. Many of our customers are city governments, fire departments, police departments, and so on. They get a list of approved models and purchase from that list; they don't search by specs or "configure" a model, so giving each model number a chance to rank matters.

Currently, all models in each sub-category rel=canonical back to the main category page for that model. Is there a better way to go about this? On the example page you can see several models whose product descriptions are all identical; they vary only by model number, and writing a unique description for every one is unrealistic for us. Any suggestions would be appreciated; I keep going back and forth on what the correct solution is.
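For reference, each sub-model page currently carries a canonical tag pointing back to its main category page, along these lines (the domain and paths here are just illustrative):

```html
<!-- On a sub-model page, e.g. a hypothetical /cf-19/cf-1956y6xlm -->
<link rel="canonical" href="https://www.example.com/cf-19/" />
```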
-
Do people tend to search for "CF-19" in the Panasonic example, or do they tend to search for "CF-1956Y6XLM"?
If it's CF-19, then I would add more value to those main model pages and not worry as much about the sub-pages. But I'm guessing it's the specific model numbers, in which case the ideal situation is to have an exact, indexable page for each model number. If you take a look at the "CF-1956Y6XLM" example, PC World is ranking #1 on pretty much pure spec content, meaning they're coasting on domain authority to rank those pages. Meanwhile, I see you at #4. Typically I'd say really thin content is a bad plan, but if everyone else is doing it, you may not need 200-300 words to move up in the rankings. Try producing 50-75 custom words on 100 of these pages where you're already ranking in the top 5. Do it for newer models so you can monitor ranking improvements over time. If the ranking and traffic improvements happen, and they convert, then figure out whether you can scale that process to every new incoming product.
Other SERP benefits can beat rankings here, too. If you can get legitimate product ratings and generate rich snippets for the products, that will help maximize your CTR. Try to write better meta descriptions as well; right now they're all pretty drab on that SERP example.
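As a sketch of what that markup could look like, here's a minimal schema.org Product block in JSON-LD; the price and rating figures are placeholders you'd populate from real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Panasonic Toughbook CF-1956Y6XLM",
  "sku": "CF-1956Y6XLM",
  "brand": { "@type": "Brand", "name": "Panasonic" },
  "offers": {
    "@type": "Offer",
    "price": "2499.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "11"
  }
}
</script>
```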
Martijn's suggestion of reviews is a good start, but it will probably only help on the 10-20% of pages where you're able to get reviews. Nevertheless, it's probably worth the effort.
Some e-commerce platforms let you save a single product with variations, which helps with this problem. If 10 models can share a page and be selected from a product sub-menu (like the t-shirt size or color selector on a fashion e-commerce site), that's a good way to cut total URLs by 50-90%. But I'd try the unique-content route first and see if the numbers add up.
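If you do end up testing the variations route, here's a minimal sketch of that sub-menu; the model numbers and spec labels are illustrative, not real SKUs:

```html
<!-- One shared page for a model family; the selector swaps the spec table
     client-side instead of linking out to hundreds of near-duplicate URLs -->
<label for="model">Model</label>
<select id="model" name="model">
  <option value="CF-1956Y6XLM">CF-1956Y6XLM (4GB RAM, touchscreen)</option>
  <option value="CF-195DYAXLM">CF-195DYAXLM (8GB RAM, dual touch)</option>
  <option value="CF-19RDRAX6M">CF-19RDRAX6M (2GB RAM, GPS)</option>
</select>
```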
-
I was afraid of this answer. If it were a static product, I would be happy to do this. But since this is technology, the next "generation" will be out in 6-8 months with new model numbers, each needing its own description written, and that is incredibly difficult to keep up with.
Is there a middle-of-the-road option? Is rel=canonical my best choice if I can't do unique content for every single model?
If so, is there a way to maximize the benefit of rel=canonical in this situation?
-
Reviews can work perfectly as user-generated content to make the pages a bit more unique. It's an easy one, and I'm probably stating the obvious here, but depending on how many units you sell of a specific version, reviews can help you both extend the content and make it more unique.
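And if you do collect reviews, marking them up feeds the rich-snippet angle mentioned above. A minimal sketch, with the reviewer, rating, and text all placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Panasonic Toughbook CF-1956Y6XLM" },
  "author": { "@type": "Person", "name": "Example Customer" },
  "reviewRating": { "@type": "Rating", "ratingValue": "5" },
  "reviewBody": "Survived two years in the patrol car; screen stays readable in daylight."
}
</script>
```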
-
It's a very tough question, and a common one across a lot of e-commerce.
The only really complete solution I have that addresses each of your needs is to stop basing the page "content" on the specs.
Make the specs a table on the page, but add enough unique copy about each model and variation that every page has truly unique content (there's a rough skeleton of this below).
I know this means writing at least, say, 200-300 words of unique content for every model (roughly 100k words across 300-400 sub-models), but that solves the whole issue. It just depends on whether having them all rank is worth it. This solution gives you:
a) unique content
b) chance for every page to rank & no canonicals back to one page
c) much more long tail search volume
d) specific searches for every one of your potential customers.
That's really the best I can do. It takes the duplicate content issue away and solves every problem except the one of having to create this much content in the first place.
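To make that concrete, a rough skeleton of what each model page could look like under this approach; every value here is a placeholder:

```html
<h1>Panasonic Toughbook CF-1956Y6XLM</h1>

<!-- 200-300 hand-written words unique to THIS configuration:
     who it's approved for, what its spec differences mean in the field, etc. -->
<p>Unique editorial copy for this exact model goes here.</p>

<!-- Specs live in a table; shared spec wording matters less once the page
     no longer depends on it for its unique content -->
<table>
  <tr><th>CPU</th><td>Intel Core i5 (placeholder)</td></tr>
  <tr><th>RAM</th><td>4 GB</td></tr>
  <tr><th>Display</th><td>10.1" XGA touchscreen</td></tr>
</table>
```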