E-Commerce Categorization
-
I'm working on an e-commerce site that currently has about 50 root categories and growing, with no sub-categories. They are all linked from the sidebar of every page, and all the products are closely related. They could probably be grouped under 5 root categories.
At what point does categorization become too flat?
-
Making such radical changes is never something to take lightly, that's for sure. And if you do it, you'll need a spreadsheet with a column for all the page names, a column for their current URLs, and one for the new URLs, because implementing 301 redirects is critical and can be a nightmare without that mapping.
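Once that spreadsheet exists, generating the redirect rules from it can be scripted. A minimal sketch, assuming an Apache server and a CSV export with hypothetical old_url and new_url columns (adjust the names and paths to your own setup):

```python
import csv

# Sketch: turn a redirect-mapping spreadsheet (exported as CSV) into
# Apache "Redirect 301" directives. The file names and column headers
# here are hypothetical; match them to your actual export.
with open("redirect_map.csv", newline="") as src, open("redirects.htaccess", "w") as out:
    for row in csv.DictReader(src):  # expects columns: page_name, old_url, new_url
        old_path = row["old_url"].strip()  # path portion, e.g. /old-category/page
        new_url = row["new_url"].strip()   # full destination URL
        if old_path and new_url:
            out.write(f"Redirect 301 {old_path} {new_url}\n")
```

The output drops into an .htaccess file, and the spreadsheet doubles as your audit trail for verifying that every old URL actually redirects.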
What it comes down to is weighing the value of the other SEO work you could do to improve things (on-site, link building, and social) against the value of architectural changes. Judgment calls. Not always fun.
-
Yeah, that makes sense. Unfortunately, this site is built on ProStores, which gives me no control over the URL structure. That makes me hesitant to make category changes that will regenerate the URL structure.
ProStores is a nightmare.
-
Roger,
Categorization becomes too flat the moment you lose high-quality visitors. Since the site started out flat, there's no way to tell where that point is; the only way to find out whether it's already too flat is to refine it. I always recommend to clients that they have no more than eight to ten top-level categories, and that sidebar navigation link only to the sub-categories within the specific category you're in, or at most those, and then, below them, two or three additional links to similar top-level categories.
The reason for this method is that in the flat model you have no way of communicating to search engines how the categories actually relate to and separate from one another. That, in turn, dilutes your ability to drive more strength to the highest-level categories. The end result is a situation where your top-level categories don't do as well for their most important keyword phrases, your sub-category pages suffer for theirs, and in turn so do your individual product pages.
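To make that sidebar recommendation concrete, here's a toy sketch of the linking logic (the category names and data structure are hypothetical, purely for illustration, not ProStores code):

```python
# Toy sketch of the contextual sidebar described above (hypothetical
# category names): link to the current category's sub-categories first,
# then at most two or three similar top-level categories. Never all
# fifty roots on every page.
CATEGORY_TREE = {
    "Lighting": ["Lamps", "Ceiling Fixtures", "Bulbs"],
    "Furniture": ["Chairs", "Tables", "Shelving"],
    # ...remaining top-level categories...
}

def sidebar_links(current_category, similar_roots, max_similar=3):
    links = list(CATEGORY_TREE.get(current_category, []))  # sub-categories of the page you're on
    links += similar_roots[:max_similar]                   # a few similar top-level categories
    return links

print(sidebar_links("Lighting", ["Furniture", "Outdoor", "Decor", "Storage"]))
# ['Lamps', 'Ceiling Fixtures', 'Bulbs', 'Furniture', 'Outdoor', 'Decor']
```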
Related Questions
-
Duplicated content & URLs for e-commerce website
Hi, I have an e-commerce site where I sell greeting cards. Products are filed under different categories (Birthday, Christmas, etc.) with subcategories (For Mother, For Sister, etc.), and the same product can appear under 3 to 6 subcategories, for example:
url: .../greeting-cards/Christmas/product1/for-mother
url: .../greeting-cards/Christmas/product1/for-sister
etc. On the CMS I have one description record per card (product1) with multiple subcategories attached, which naturally creates a URL for each subcategory. Moz (and surely Google) picks up these URLs and their content as duplicates. Any ideas how to solve this problem? Thank you very much!
Technical SEO | jurginga
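One common pattern for this situation, sketched below under assumptions (a hypothetical data model and domain, not any specific CMS's API): pick one primary URL per card and point every subcategory variant's canonical tag at it.

```python
# Hedged sketch: store a single primary path per product and emit a
# canonical tag pointing at it from every duplicate subcategory URL,
# so the variants consolidate to one indexed page. The data model and
# domain below are hypothetical.
PRIMARY_PATH = {
    "product1": "/greeting-cards/Christmas/product1/for-mother",
}

def canonical_tag(product_id, site="https://www.example.com"):
    return f'<link rel="canonical" href="{site}{PRIMARY_PATH[product_id]}" />'

print(canonical_tag("product1"))
# <link rel="canonical" href="https://www.example.com/greeting-cards/Christmas/product1/for-mother" />
```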
What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
Hi, we will have pages on the website that display different page copy and images for different user personas. The main content (copy, headings, images) is supplied dynamically, and I'm not sure how Google will index the B and C variations of these pages. As far as I know, the page URL won't change and won't have parameters. Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we risk a cloaking penalty, because users get different content than search robots do. Is it better to have URL parameters for versions B and C of the content? For example:
/page for the default content
/page?id=2 for the B version
/page?id=3 for the C version
The dynamic content comes from the server side, so not all page copy variations are in the default HTML. I hope my questions make sense; I couldn't find recommendations for this kind of SEO issue.
Technical SEO | Gyorgy.B
Are subdomains a good SEO strategy for a multistore e-commerce?
Hi there, I'm wondering what the best strategy is for working with multi-stores on Magento: to use or not to use subdomains? Suppose we have www.website.com and configure it as a multistore. The URL base will not carry the store ID, so it will not be like www.website.com/store1 and www.website.com/store2. It will simply rely on the user session, so if we have two categories, one per store, they will be accessed as:
www.website.com/category1 (for store 1)
www.website.com/category2 (for store 2)
The homepage will always be at www.website.com, so we would have a single page serving several "home pages" (depending on the user session / store being accessed). I guess this is not a good option if we want to rank for different keywords for each store. So I was wondering if it is a good solution to set:
store1.website.com
store2.website.com
This way we have two "home pages", each able to rank. Does that make sense? Is it good or bad for SEO? Another option I was considering was:
www.website.com (for store 1)
store2.website.com (for store 2)
store3.website.com (for store 3)
www.website.com/blog (for the blog)
Can this work? Good or bad for SEO? Best regards
Technical SEO | qgairsoft
Why is Google's cache preview showing a different version of the webpage (i.e. not displaying content)?
My URL is: http://www.fslocal.com. Recently, we discovered Google's cached snapshots of our business listings look different from what's displayed to users. The main issue? Our content isn't displayed in cached results (while the content isn't visible on the front end of cached pages, the text can be found when you view the page source of that cached result).
These listings are structured so everything is coded and contained within 1 page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). But even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only 1 "page" of content is ever displayed to the user at a time. This is controlled by JavaScript and display:none in CSS.
Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source? Does it have to do with the way we're using display:none? Are there negative SEO effects in how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)? Google's Technical Guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page) and employ static HTML instead of complicated JavaScript, would that solve the problem? Any insight would be greatly appreciated. Thanks!
Technical SEO | fslocal
Many errors on e-commerce website, mainly duplicate content - advice needed please!
Hi Mozzers, I need some advice on how to tackle one of my client's websites. We have just started doing SEO for them, and after Moz crawled the e-commerce site it detected 36,329 errors, 37,496 warnings, and 2,589 notices, all going up! Most of the errors are due to duplicate titles and page content, but I cannot identify where the duplicate pages come from. These are the links Moz detected for the duplicate pages (unfortunately I cannot share the website for confidentiality reasons):
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_2&products_per_2&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&products_per_2&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_2=&products_per_00&page=2
• www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_00&products_per_00&products_per_00&page=2
With URLs like these it is quite hard to identify which pages need to be canonicalized, and this is just an example out of thousands on this website. If anyone has advice on how to fix this and how to tackle 37,496 errors on a website like this, that would be great. Thank you for your time, Lyam
Technical SEO | AlphaDigital
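As a first pass on a parameter mess like this, one approach (a rough sketch, not a fix in itself; the parameter prefix below is an assumption drawn from the example URLs): normalise the crawl's URL list by stripping the repeated display parameters, so the duplicate groups, and the URL each group should canonicalise to, become visible.

```python
from urllib.parse import urlparse, parse_qsl, urlencode

# Sketch: collapse the duplicated display parameters seen in the URLs
# above so that functionally identical pages normalise to one URL.
# The prefix list is an assumption from the examples; extend it for
# whatever other sort/paging parameters the cart appends.
DISPLAY_PARAM_PREFIXES = ("products_per",)

def normalise(url):
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(DISPLAY_PARAM_PREFIXES)]
    return parts._replace(query=urlencode(sorted(kept))).geturl()

url = ("http://www.thewebsite.com/index.php?dispatch=categories.view"
       "&category_id=233&products_per_00&products_per_2&page=2")
print(normalise(url))
# http://www.thewebsite.com/index.php?category_id=233&dispatch=categories.view&page=2
```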
Windows Access used for e-commerce site - help needed
Hello everybody, I am working on an e-commerce website built on Windows Access, and it's a nightmare to change the HTML content on it. Has anyone used it before? It doesn't allow me to change the content of the HTML tags, even though it should, and I don't have a clue what to do. Thanks, Oscar
Technical SEO | PremioOscar
Mobile site ranks on Google search results instead of desktop site
Hello, all SEOers! Today I would like to hear your opinions on a mobile site and duplicate content issue. I have a mobile version of our website hosted on a subdomain (m instead of www). The site targets the UK, and it's essentially the same content, formatted differently: every URL on www also exists at the "m" subdomain with identical content (there is some different content, but I'd say 90% or more is the same). Recently I've noticed that search results (Google UK) are showing links to our mobile site instead of the desktop site. I have a sitemap.xml for both sites, the mobile sitemap defined as follows: I didn't block googlebot from the mobile site, and I also didn't block googlebot-mobile from the desktop site. I read the Google Webmaster Tools forum and watched a related video from Matt Cutts, and found many opinions that this can cause a duplicate content issue and that I should do one of the following:
1. Block googlebot from the mobile site.
2. Use a canonical tag on the mobile site that points to the desktop site.
3. Create and develop different content (needless to say...)
Do you think a duplicate content issue caused my mobile site to rank in search results instead of my desktop site? And do you think those methods will get my desktop site showing in search results? I also have multi-country sites in the same format as above, and my other country sites are doing fine on Google. The only difference I've found is that my other country sites have different title and meta tags compared to their desktop sites, whereas my UK mobile site has the same title and meta tags as the desktop site. Do you think this also has something to do with the current problem? Please, people, feel free to comment and share your opinions. Thanks for reading my long explanation!
Technical SEO | Artience
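For what it's worth, option 2 usually takes the form of Google's documented bidirectional annotations for separate mobile URLs; a sketch below with hypothetical domains and paths (swap in your real hostnames):

```python
# Sketch of the annotations Google documents for separate mobile URLs
# (domains here are hypothetical). The desktop page declares its mobile
# alternate; the mobile page canonicalises back to the desktop page.
# This consolidates the duplicate signals without blocking either bot.
def desktop_head_links(path):
    return (f'<link rel="alternate" media="only screen and (max-width: 640px)" '
            f'href="https://m.example.co.uk{path}" />')

def mobile_head_links(path):
    return f'<link rel="canonical" href="https://www.example.co.uk{path}" />'

print(desktop_head_links("/some-page"))
print(mobile_head_links("/some-page"))
```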
Removing out-of-stock items from an e-commerce website
I have a dilemma. We have over 500 out-of-stock items still listed on our e-commerce website. I'm thinking it would be a good idea to leave them up, because Google treats them all as content and their keywords might drive traffic. On the other hand, customers may be disappointed when items are out of stock (we don't restock sold-out items), and often they won't convert if they're looking for something very specific. Considering all these factors (and some unmentioned ones), my main question is: if I remove that content, does it make all the other content on our website stronger by letting more PageRank and link juice flow to it, or does it hurt our rankings?
Technical SEO | 13375auc3