Should I use rel=canonical on similar product pages?
-
I'm thinking of using rel=canonical for similar products on my site.
Say I'm selling pens and they are all very similar, e.g. a big pen in blue, a pack of 5 blue Bic pens, a pack of 10, 50, 100, etc. Should I rel=canonical them all to the best seller, as it's almost impossible to make the pages unique? (I realise these should really be attributes rather than separate products, but I'm sure you get my point.)
It seems sensible to have one master canonical page for Bic pens on the site (with a great description, video content, good images, plus linked articles etc.) rather than loads of duplicate-looking pages.
Love to hear thoughts from the Moz community.
-
There's no perfect solution, but Google's advice is to use rel=prev/next. This looks like pretty classic pagination. Rel-canonical is a stronger signal, but it's generally going to keep pages 2+ from ranking.
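For reference, a minimal sketch of what that pagination markup could look like in the <head> of page 2 of a series (the URLs are placeholders, not the actual site):

    <!-- Page 2 of the series: point back to page 1 and forward to page 3 -->
    <link rel="prev" href="https://www.example.com/blue-bic-pens/">
    <link rel="next" href="https://www.example.com/blue-bic-pens/page-3">

Page 1 would carry only the rel="next" tag and the final page only rel="prev", so Google can stitch the series together while still indexing each page.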
-
Dr. Pete,
I have an internal debate going, and I was hoping you might be the tie-breaker on rel=canonical vs. noindex for these paginated pages, which might be a good use case for others:
https://www.newhomesource.com/communityresults/market-269/citynamefilter-cedar-park
https://www.newhomesource.com/communityresults/market-269/citynamefilter-cedar-park/page-2
The individual list items are unique, but the pages clearly want to rank for essentially the exact same terms. The page titles, metas, and copy about the city are the same. Only the list elements differ, and they're not a 12-pack of pens vs. a 24-pack, etc. Is this tricky or clear?
-
Thank you Sir. I think we reached the same conclusion.
By the way, it was just a simple example of the page hierarchy - we're not actually doing horror books.
-
I haven't heard any SEO recommendations or benefits regarding rel="contents". Rel=prev/next has mixed results, but I'd generally only use it for its specific use case of paginated content.
I guess you could treat V2 as "pages" within V1. If you did that, what you'd need to do is treat the main page as a "View All" page and link to it from each author page. I'm not sure if that's the best approach, but it's more or less Google-approved.
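As a rough illustration of one way to implement that "View All" pattern (the URLs here are made up), each author page would carry a canonical pointing at the main page:

    <!-- In the <head> of an individual author page -->
    <link rel="canonical" href="https://www.example.com/horror-novels">

The main Horror Novels page then acts as the consolidated "View All" version that collects the ranking signals, while the author pages stay crawlable and usable for visitors.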
If the site has decent authority and we're only talking 100s of pages, I might let them all live in the index and see what happens. Let Google sort it out, and then decide if you're ok with the outcome. If the site is low authority and/or we're talking 1000s of pages, I might be more cautious.
It's hard to speak in generalities - it depends a lot on the quality of the site and nature of the pages, including how much that content is available/duplicated across the web. One problem here is that author pages with lists of books probably exist on many sites, so you have to differentiate yourself.
-
Good. Same page
I was looking into rel=contents and those variations before, but I can't quite decide whether this is worth the effort or not.
e.g. There's a huge list of resources on a single page, segmented into categories. The page is HUGE and takes ages to load, so I've been creating new pages for each segment and optimising those pages independently, but there is some common content with the primary page.
V1: Horror Novels page has a section for each author, each section lists all novels by that author.
V2: Each Author has a page which lists novels by that author, but links back to the Horror Novels page, which is essentially an index of the Author pages.
Would you use rel=contents, rel=prev/next or a different approach in this case? From what I've read so far, there doesn't seem any "SEO value" in linking that way.
I guess we're trying to improve the UX through faster load times and by segmenting the information into smaller chunks, while also presenting a number of pages to Google as a single body of content rather than one big page, without causing issues with duplicate or similar content - we just need to make sure that we're optimising it in the right way, of course.
-
I would Meta Noindex an "email this page" template. It has no value for SERPs, it's generally at the end of a path, and no one is going to link to it. Just keep it out of the index altogether.
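A minimal sketch of that, assuming a typical "email this page" template (the path is hypothetical):

    <!-- In the <head> of the email-this-page template, e.g. /sendfriend/product/send/id/123 -->
    <meta name="robots" content="noindex, follow">

The "follow" part lets crawlers still pass through any links on the template, while "noindex" keeps the page itself out of the index.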
-
Thanks Pete
So, for a more specific example: if an eCommerce store has an "email this product" page for each product (Magento seems to love doing this and creates a duplicate of the same email page for every product), would you recommend canonically linking each of those pages to the main Contact page, or canonically linking each one to its related product page?
If setting up from scratch, I'd consider noindex on all of those pages anyway, but it's a bit late for that once a site has been live for years.
The email pages are obviously related to the product page, but the content there isn't anywhere near identical.
Or maybe there's a "more appropriate solution" that you alluded to?
-
To clarify, that's the official stance - rel=canonical should only be used on true duplicates (basically, URL variants of the same page). In practice, rel=canonical works perfectly well on near-duplicates, and sometimes even on wildly different pages, but the more different you get, the more caution you should exercise. If the pages are wildly different, it's likely there are more appropriate solutions.
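To put the "true duplicates" case in concrete terms, it's something like tracking or sorting parameters on the same page (domain and parameters are illustrative):

    <!-- On https://www.example.com/blue-bic-pens?sort=price&sessionid=123 -->
    <link rel="canonical" href="https://www.example.com/blue-bic-pens">

Both URLs render the same content, so the canonical simply tells Google which version to index.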
-
Hey Pete
Can you explain, "you can't use rel=canonical on pages that aren't 100% duplicates" a little further please?
Do you mean that only duplicate pages should be canonicalised? Identical pages in two different sub-directories are fine, but two similar pages are not?
-
So, here's the problem - if you follow the official uses of our options, then there is no answer. You can't have thin content or Google will slap you with Panda (or, at the very least, devalue your rankings), you can't use rel=canonical on pages that aren't 100% duplicates, and you're not supposed to (according to Google) just NOINDEX content. The official advice is: "Let us sort it out, but if we don't sort it out, we'll smack you down."
I don't mean that to be critical of your comment, but I'm very frustrated with the official party line from Google. Practically speaking, I've found index control to be extremely effective even before Panda, and critical for big sites post-Panda. Sometimes, that means embracing imperfect solutions. The right tool for any situation can be complex (and it may be a combination of tools), but rel=canonical is powerful and often effective, in my experience.
-
It seems to me that for most ecommerce sites (myself included) canonical is not the answer. If you have too many near-identical products on your site, it may be better to re-evaluate what you are stocking; if you must stock them, then the way forward is to make one page that properly explains them and allows purchase, rather than many pages.
The only use I can see for canonical is consolidating old blogs and articles on similar topics. Using it to tidy an ecommerce site seems to be a misuse of the tool.
-
This can get tricky when you dive into the details, but I generally agree with Takeshi and EGOL - consolidate or canonicalize. If the products are different brands/versions of a similar item, it's a bit trickier, but these variations do have a way of spinning out of control. In 2013, I think the downside of your index running wild is a lot higher than the upside of ranking for a couple more long-tail terms. It does depend a lot on your traffic, business model, etc., though. I'm not sure any of us can adequately advise you in the scope of a Q&A.
-
Also, I forgot to mention that this way you don't have to worry about creating tons of different product descriptions, because you write one description for, let's say, 6 different products.
The way we built it, only the product group pages are reachable by customers; the individual product pages are still indexed and crawled (they have to be there, otherwise the whole system wouldn't work), but no optimization is done on them and customers never see them.
-
Hello there,
I manage an e-commerce site, and because we have similar products and issues with duplicate content, we have implemented product group pages with a drop-down menu listing the different options for a particular product, and then we have used rel="canonical" on the different product pages. This way we have solved the issue and it works very well.
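To illustrate (with made-up URLs), each variant's product page would carry a canonical pointing at its product group page:

    <!-- In the <head> of a single variant page, e.g. /blue-bic-pens-10-pack -->
    <link rel="canonical" href="https://www.example.com/blue-bic-pens">

The group page gets the optimisation effort and the rankings, while the variant pages stay in place to power the drop-down.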
If you do implement it, make sure every step is done correctly; otherwise, as Matt Cutts says, you will have a headache trying to sort it out.
Hope it helps
-
Those pen offers are very, very similar: identical product descriptions except for perhaps the number being sold, the color, or the width of the tip.
If these were on my site, they would all be on the same page. One page to concentrate/conserve the link juice. One page to make thicker content. One page to present all of the options to the customer at the same time (it's a PITA to click between lots of pages to make up your mind as a shopper). One page to make maintenance easy.
-
Thanks
-
Yes, I've used this approach for a number of ecommerce clients, and it is very effective. There are many advantages to this approach:
- Eliminating duplicate/thin content across the site
- Focusing link value on a single page instead of spreading out across multiple products
- Less effort creating unique content (one page vs multiple)
- Potentially better user experience
Of course, if you have the resources to write unique content for each of your product pages, that is going to be a better solution. You can still create a landing page in this instance, you just wouldn't canonical the product pages to it.
-
Have you used this approach? If so, how effective is it?
-
If you want to rank for "flat head screwdriver", the canonical approach can still work. Simply create a landing page for flat head screwdrivers, and include all of the flat head screwdriver products from each of the different brands. Then canonical each of the individual product pages up to the main landing page.
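Sketched with hypothetical URLs, every branded flat head screwdriver product page would canonicalise up to that landing page:

    <!-- On each individual product page, e.g. /brand-a-flat-head-screwdriver -->
    <link rel="canonical" href="https://www.example.com/flat-head-screwdrivers">

The landing page collects the link value and ranks for the generic term, while the individual product pages remain available for shoppers.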
-
I have all the usual colour and size attributes on my products; I just used that as a simple example. It's more to do with similar non-branded products that are different enough to be separate "products", but when I have 15 similar ones it's impossible to write fully different descriptions. Screwdrivers, screws or paint would have been better examples; there are hundreds of ranges like that. Say you had five unimportant brands of screwdriver, each in flat head and Phillips head versions. Each one is marginally different (handle style etc.), but there is no keyword benefit to having each optimised for, say, "flat head screwdriver". Having a good range is beneficial to the customer but seems to be detrimental to SEO. Is it better to employ writers to make every description different, no matter how complex, or should I canonical it?
-
Yes, that is a good solution, especially in this post-Panda world. Ideally you would just have one page for Bic pens, with a drop-down from which you can select different options such as color and size. If your shopping cart system doesn't allow you to do that, then the canonical is a good approach. This cuts down on the amount of duplicate content you have and the amount of unique content you need to create.
-
Have a client in the exact same situation. Check to see if you are currently getting traffic for terms that would be specific to having separate pages (e.g. "50 blue bic pens" versus a more general "bic blue pens"). If you don't, then you should canonical to one page. If you do, I'd keep it as is and work on diversifying the product pages more.