Product Variations in Ecommerce: Combine or Canonicalize?
-
Hello,
I have an ecommerce site that sells pond pumps. I have every pump on its own page because each pump has different flow rates, specs, and replacement parts. All of the content is original, and the content on each page is (more than) 15% different, so it isn't getting flagged by Moz as duplicate content. Essentially it is set up like this:
Acme Pond Pumps
- Acme Pond Pump 100
- Acme Pond Pump 200
- Acme Pond Pump 300
I am wondering if it is best to leave all of the products as separate pages, or if I should canonicalize them to the category page. Will each of the pages pass link juice upward anyway? The only differences between the products are the specs, parts, and model numbers.
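For reference, if I did canonicalize, I'm picturing each product page carrying something like this in its head (URLs are just made up for illustration):

```html
<!-- On the Acme Pond Pump 100 page, pointing at the category page (example URLs) -->
<link rel="canonical" href="https://www.example.com/acme-pond-pumps/" />
```

My worry is whether that effectively tells Google to drop the individual pages from the index.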
Thoughts?
-
Hey Samuel,
I feel like this confirms some of the thoughts I had. The different terms are searched, so I think it will be best to leave them as individual pages.
Thanks for your contribution!
-
First, I'd say that just because Moz does not flag a page as duplicate content does not mean Google would not think it is duplicate content. As great as Moz is, even Moz cannot know for sure what Google thinks about any specific issue.
Second, I'd look at your keyword research. Are people searching only for general, category-based terms such as "Acme Pond Pumps"? If so, then you might want to noindex all the specific product pages (the ones that may have duplicate content). But if people are searching on detailed, specific terms such as "Acme Pond Pump 100" and "Acme Pond Pump 200," then you want all of them to be indexed individually. In the second case, I'd try to make each product page as original as possible -- which is something you seem to be doing already.
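To make that concrete, here's a rough sketch of the two options as head tags (example URLs, not your actual markup):

```html
<!-- Option 1: thin/duplicate product page you want crawled but kept out of the index -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: unique product page you want to rank individually,
     with a self-referencing canonical -->
<link rel="canonical" href="https://www.example.com/acme-pond-pump-100/" />
```

Note that a canonical pointing at the category page is only a hint, and Google may ignore it when the product pages are clearly distinct -- noindex is the firmer signal if you truly don't want a page indexed.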
I wouldn't worry about passing "link juice" upward. As long as you have a flat hierarchy, so that all of the product pages are no more than three levels/clicks from the home page, there should be no crawling or indexation issues.
I hope that helps -- let me know in a reply if you have any other questions!