Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
E-commerce site, one product multiple categories best practice
-
Hi there,
We have an e-commerce shopping site with over 8000 products and over 100 categories.
Some subcategories belong to multiple categories - for example, a Christmas tree can sit under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees".
The product itself (example: Scandinavian Xmas Tree) can naturally belong to both these categories as well.
Naturally these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:
- Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path - a kind of cloaking.
- Use the same URL and display only one "main" version of breadcrumbs and menus. Possibly add the other "not main" categories as links to the category / product page.
- Use a different URL based on where we came from and do nothing (this will create essentially the same content on different URLs except for the breadcrumbs and menus - there's a possibility to change the category text and page title as well)
- Use a different URL based on where we came from, with different menus and breadcrumbs, and use rel=canonical pointing to the "main" category / product page (see the sketch below)
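To make option 4 concrete, this is roughly the markup I have in mind (a rough sketch only - the domain and paths are illustrative, reusing the Xmas tree example above):

    <!-- On the duplicate path, e.g. /gifts/holidays/christmas/trees/scandinavian-xmas-tree -->
    <link rel="canonical" href="https://www.example.com/gardening/plants/trees/scandinavian-xmas-tree" />

    <!-- On the "main" path itself, a self-referencing canonical -->
    <link rel="canonical" href="https://www.example.com/gardening/plants/trees/scandinavian-xmas-tree" />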
This is a very interesting issue and I would love to hear what you guys think as we are finalizing plans for a new website and would like to get the most out of it.
Thank you all!
-
Hi,
This topic is quite old, but is still relevant.
I understand that the solution mentioned above is the most thorough one.
But is there something wrong with just using canonicals? In a webshop that we are managing, there are just a couple of subcategories that belong to different categories. An example: our 'Company law' subcategory sits under both 'Economic law' and 'Companies', so the same content is reachable from two URLs.
Only these two URLs will generate duplicate content, since the categories above 'Company law' ('Economic law' and 'Companies') clearly have different content. Can't you just pick one version as the canonical one? Since we have just a couple of these categories, this would be an easier solution.
Thanks for your feedback guys!
-
Thought I'd answer my own question!! (with the help of Dr Pete, who answered this question in private Q&A)
"The multiple path issue is tough - you can't really have a path visitors can follow and then hide that from Google (or, at least, it's not a good idea). You could NOINDEX certain paths, but that's a complex consideration (it has pros and cons and depends a lot on your goals and site architecture).
If you generate the breadcrumb path via user activity and store it in a session/cookie, that's generally ok. Google's crawlers, as well as any visitor who came to the site via search, would see a default breadcrumb, but visitors would see a breadcrumb based on their own activity. That's fine, since the default is the same for humans as for spiders."
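In practice, I imagine Dr Pete's suggestion would look roughly like this (a minimal sketch only - the framework, route names and session keys are all illustrative, not how any real site is built):

    from flask import Flask, session

    app = Flask(__name__)
    app.secret_key = "replace-me"  # needed for session cookies

    # Default ("main") breadcrumb trail - what crawlers and search
    # visitors see, since they arrive with no stored trail.
    DEFAULT_TRAIL = ["Gardening", "Plants", "Trees"]

    @app.route("/category/<path:trail>/")
    def category(trail):
        # Remember the path the visitor actually clicked through
        session["trail"] = trail.split("/")
        return "category page"

    @app.route("/product/scandinavian-xmas-tree")
    def product():
        # Visitors who browsed here via a category see their own trail;
        # everyone else (including Googlebot) sees the default.
        trail = session.get("trail", DEFAULT_TRAIL)
        return "<nav>" + " > ".join(trail) + "</nav> ...product page..."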
That seems to be a fairly conclusive answer IMO.
-
Hi Arik,
I'd really like an answer to this as well, as there seems to be no clear answer online.
My understanding is that a breadcrumb should specify a canonical crawl path (not one based on the referral path), so option 1 is out.
Option 2 seems suboptimal and not something I can recall seeing implemented on other sites.
Options 3 and 4: I don't want multiple URLs and rel=canonical, as I already have one definitive URL.
This seems like it must be a fairly common problem, but I can't see a good solution online anywhere.
Help anyone?
-
Dear All,
To come back to option 1: use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path - a kind of cloaking.
Changing content based on the referral path means that the same URL will show different content at different times, so the search engine will probably see different content on the page than some human visitors do. As far as I know, this is cloaking - please correct me if I'm wrong.
Option 4 will not necessarily achieve the desired effect either, as the search engine might decide to ignore the tag. I checked a few examples, and this is actually what happens when other e-commerce stores use canonical - you find both URLs in the SERPs. So I doubt this is the perfect solution...
I'm still not convinced that I have a definitive answer for this. Anyone?
Thanks!
-
Option 1 is not cloaking - it is displaying content dynamically. Cloaking would be if you showed one page to viewers and a different version to Googlebot.
I would say it depends on how different the pages are. If all that changes is the breadcrumbs, then I would say you're fine with options 1, 2, or 4.
If the pages are significantly different - different category names, page titles, descriptive text, etc. - I would go with option 4.
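To put that distinction into code terms - purely illustrative pseudologic, not anything from a real site - cloaking keys the content off who is asking, while dynamic display keys it off what the visitor did:

    # Cloaking (a guidelines violation): content depends on WHO is asking.
    def breadcrumb_cloaked(user_agent, default_trail, keyword_stuffed_trail):
        if "Googlebot" in user_agent:
            return keyword_stuffed_trail   # search engines see one version...
        return default_trail               # ...human visitors see another

    # Dynamic display (what option 1 describes): content depends on what the
    # visitor did; anyone without a recorded path - crawlers included - gets
    # the same default.
    def breadcrumb_dynamic(recorded_path, default_trail):
        return recorded_path if recorded_path else default_trail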
-
Thanks Adam.
I very much respect your opinion and even agree that from a user's point of view option 1 is the best.
I wonder though - is this considered cloaking?
From http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355:
"Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google's Webmaster Guidelines because it provides our users with different results than they expected.
Some examples of cloaking include:
[...]
Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor"
This becomes more complicated, as the path the user chose to reach the specific subcategory or product page affects not only the breadcrumbs but also the category's navigation menu and possibly the descriptive text of the category.
What's your take on this?
-
Options 1, 2, or 4 should be fine. Option 3 is not recommended.
Related Questions
-
Best Practices for Title Tags for Product Listing Page
My industry is commercial real estate in New York City. Our site has 300 real estate listings. The format we have been using for title tags is below - it is probably disastrous from an SEO perspective, and using numbers is a total waste of space. A few questions:
-Should we set listings to noindex if they are not content rich?
-If we do choose to index them, should we avoid titles listing square footage and dollar amounts?
-Since local SEO is critical, should the titles always list New York, NY or Manhattan, NY?
-I have read that titles should contain some form of branding, but our company name is Metro Manhattan Office Space. That would take up way too much space - even "Metro Manhattan" is long. Do we need to use the title tag for branding, or can we just focus on a brief description of page content incorporating one important phrase? Our site is: www.metro-manhattan.com
Turnkey Flatiron Tech Space | 2,850 SF $10,687/month
Gallery, Office Rental | Midtown, W. 57 St | 4441 SF $24055/month
Open Plan Loft | Flatiron, Chelsea | 2414 SF $12,874/month
Tribeca Corner Loft | Varick Street | 2267 SF $11,712/month
275 Madison, LAW, P7, 3,252 SF, $65 - Manhattan, New York
-
Taxonomy question - best approach for site structure
Hi all, I'm working on a dentist's website and want some advice on the best way to lay out the navigation. I would like to know which structure will help the site work naturally. I feel the second example would be better, as it would focus the 'power' around the type of treatment and get that to rank better.
.com/assessment/whitening
.com/assessment/straightening
.com/treatment/whitening
.com/treatment/straightening
or
.com/whitening/assessment
.com/straightening/assessment
.com/whitening/treatment
.com/straightening/treatment
Please advise, thanks.
-
Same product in different categories and duplicate content issues
Hi, I have some questions related to duplicate content on e-commerce websites.
1) If a single product goes into multiple categories (e.g. a black elegant dress could be listed in two categories like "black dresses" and "elegant dresses"), is it considered duplicate content even if the product URL is unique?
www.website.com/black-dresses/black-elegant-dress - duplicated: same content from two different paths
www.website.com/elegant-dresses/black-elegant-dress - duplicated: same content from two different paths
www.website.com/black-elegant-dress - unique URL: this is the way my product URLs look
Does Google perceive this as duplicated content? The path to the content is only one, so it shouldn't be seen as duplicated content, though the product is repeated in different categories. This is the most important concern I actually have. It is a small thing, but if I set this up wrong the whole website would be affected and thus penalised, so I need to know how I can handle it.
2) I am using WordPress + WooCommerce. The website is built with categories and subcategories. When I create a product in the product page backend, is it advisable to select just the lowest subcategory, or is it better to select both the main category and the subcategory the product belongs to? I usually select the subcategory alone.
Looking forward to your reply and suggestions. Thanks
-
What is best practice for "sorting" URLs to prevent indexing and for best link juice?
We are now introducing 5 links in all our category pages for different sorting options of category listings.
The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000.
Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level the sorting is implemented in a way that reloads the whole page, for which there may be better options as well.
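(For illustration, one common pattern here - purely a sketch; the URL and parameter name are made up - is to have each sorted variant point a canonical back at the base category, or to keep the sorted pages out of the index while still letting them be crawled:)

    <!-- on /category/trees?sort=price_asc and every other sorted variant -->
    <link rel="canonical" href="https://www.example.com/category/trees" />

    <!-- or, alternatively, on the sorted variants only -->
    <meta name="robots" content="noindex, follow" />
-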
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments - the concern here is about serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If, instead of using # as the prefix, you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-yet-indexed image in our case). It seems complicated but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will be good.

    var fragment = Request.QueryString["_escaped_fragment_"];
    if (!String.IsNullOrEmpty(fragment))
    {
        var escapedParams = fragment.Split(new[] { '=' });
        if (escapedParams.Length == 2)
        {
            var imageToDisplay = escapedParams[1];
            // Render the page with the gallery showing
            // the requested image (statically!)
            ...
        }
    }

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side). To make it perfect we have to give the user a chance to bookmark the current gallery image. 90% comes for free; we only have to parse the fragment on the client side and show the requested image:

    if (window.location.hash)
    {
        // NOTE: remove initial #
        var fragmentParams = window.location.hash.substring(1).split('=');
        var imageToDisplay = fragmentParams[1];
        // Render the page with the gallery showing the requested image (dynamically!)
        ...
    }

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we would be removing a massive chunk of content from the existing pages - some of it is not the most relevant, but it is content. Any advice and discussion welcome 🙂
-
Magento: URLs for Products in Multiple Categories
I am working in Magento to build out a large e-commerce site with several thousand products. It's a great platform, but I have run into the issue of what it does to URLs when you put a product into multiple categories. Basically, "a book" in two categories would make two URLs for one product:
1) /books/a-book
2) author-name/a-book
So, I need to come up with a solution for this. It seems I have two options:
1. Found this in a Magento SEO article: "Magento gives you the ability to add the name of categories to the path for product URLs. Because Magento doesn't support this functionality very well - it creates duplicate content issues - it is a very good idea to disable this. To do this, go to System => Configuration => Catalog => Search Engine Optimization and set 'Use categories path for product URLs' to 'no'." This would solve the issues and be a quick fix, but I think it's a double-edged sword, because then we lose the SEO value of our well-named categories being in the URL.
2. Use canonical tags. To be fair, I'm not even sure this is possible. Even though it is creating different URLs and, thus, poses a risk of "duplicate content" being crawled, there really is only one page on the admin side. So, I can't go to all of the "duplicate" pages and put a canonical tag, because those duplicate pages don't really exist on the back-end. Does that make sense?
After typing this out, it seems like the best thing to do probably will be to just turn off categories in the URL from the admin side. However, I'd still love any input from the community on this. Thanks!
-
Multiple sites linking back with pornographic anchor text
I discovered a while ago that we had quite a number of links pointing back to one of our customers' websites. The anchor text of these links contains pornography that is extremely bad. These links originate from forums that seem to link among themselves and then throw my customer's web address in there at the same time. Any thoughts on this? I'm seriously worried that this may negatively affect the site.
-
Best approach to launch a new site with new URLs - same domain
www.sierratradingpost.com
We have a high-volume e-commerce website with over 15K items, an average of 150K visits per day and 12.6 pages per visit. We are launching a new website this spring, which is currently on a beta subdomain, and we are looking for the best strategy that preserves our current search rankings while throttling traffic (possibly 25% per week) to measure results.
The new site will be soft launched, as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we submit the 301 redirects and migrate everyone over to the new site. We will have a month or so of running both sites. Except for the homepage, the URL structure for the new site is different from the old site.
What is our best strategy so we don't lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues? Here is what we got back from a Google post which may highlight our concerns better: http://www.google.com/support/forum/p/Webmasters/thread?tid=62d0a16c4702a17d&hl=en&fid=62d0a16c4702a17d00049b67b51500a6
Thank you, sincerely, Stephan Woo Cude, SEO Specialist, scude@sierratradingpost.com
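(For reference, the kind of 301 mapping we would eventually submit might look something like this - a hypothetical sketch only; the framework, paths and product names are made up, not our actual setup:)

    from flask import Flask, redirect, abort

    app = Flask(__name__)

    # Hypothetical mapping from old URLs to their new equivalents
    OLD_TO_NEW = {
        "/old-category/sample-product.html": "/new-category/sample-product",
    }

    @app.route("/<path:old_path>")
    def legacy_redirect(old_path):
        target = OLD_TO_NEW.get("/" + old_path)
        if target:
            # 301 = permanent move, so rankings and link equity
            # should transfer to the new URL over time
            return redirect(target, code=301)
        abort(404)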