Ecommerce category pages
-
Hi there,
I've been thinking a lot about this lately. I work on a lot of webshops built by the same company, and, I don't like to say this, but not all of their shops perform well SEO-wise.
They use a filtering system that occasionally creates hundreds to thousands of category pages. Basically, what happens is this: a client that sells fashion has a site (www.client.com). They have 'main categories' like 'Men', 'Women', 'Kids' and 'Sale'.
So when you click on 'Men' in the main navigation, you get www.client.com/men/. From there you can filter on brand, subcategory or color, which gives you www.client.com/men/brand. Basically, the URL follows the order in which you filter, so you can also reach 'brand' via 'category': www.client.com/shoes/brand.
Obviously, this page has the same content as www.client.com/brand/shoes, or even /shoes/brand/black and /men/shoes/brand/black if all the brand's shoes happen to be black men's shoes.
Currently this is fixed with a dynamic canonical system that canonicalizes the brand/category combinations. So there can be 8,000 URLs on the site, which canonicalize to about 4,000 URLs.
I have a gut feeling that this is still not a good situation for SEO, and I believe it would be a lot better to have the filtering system default to a defined order, like /gender/category/brand/color, so you don't need these excessive amounts of canonicalization in the first place. You can canonicalize the whole bunch, but you'd still be offering thousands of useless pages for Google to waste its crawl budget on.
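To illustrate, a defined order like that could be enforced in the routing layer. Here's a rough Python sketch of the idea; the facet names and the lookup table are made-up examples, not our actual data:

```python
# Fixed facet order: every filter combination maps to exactly one URL path.
FACET_ORDER = ["gender", "category", "brand", "color"]

# Hypothetical lookup table: which facet each known segment belongs to.
FACET_OF = {
    "men": "gender", "women": "gender", "kids": "gender",
    "shoes": "category", "jackets": "category",
    "acme": "brand",
    "black": "color", "red": "color",
}

def canonical_path(segments):
    """Reorder URL segments into the defined facet order,
    so /shoes/men and /men/shoes both resolve to /men/shoes."""
    ranked = sorted(segments, key=lambda s: FACET_ORDER.index(FACET_OF[s]))
    return "/" + "/".join(ranked)
```

With something like this, canonical_path(["shoes", "men"]) and canonical_path(["men", "shoes"]) both return "/men/shoes", so there is only ever one crawlable URL per filter combination.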
Not to mention the time saved when crawling and analysing the site with Screaming Frog or other audit tools.
Any opinions on this matter?
-
I love this question, Adriaan. It's one that a lot of people have asked over the years and have had to deal with, especially on ecommerce sites like those you work on.
As you well know, there are multiple ways to handle duplicate content:
- The way you are proposing, which is moving to a static URL structure that always keeps the same order
- A web of canonicals like you seem to have set up (and it sounds like you have it set up correctly)
- The whack-a-mole approach of periodically looking for duplicate content and implementing redirects, which can lead to further issues with internal redirects and is not a scalable option.
SEO is all about processes. If you have a canonical process that is working for you and has proven scalable (e.g. you are not manually specifying the canonical URL for each new category, which is probably created whenever the merchandising team or feeds update the site), that works to a certain extent.
However, this is like treating a bunch of cuts on your hands with band-aids while ignoring the fact that a) you only have so much space on your hands and can only apply so many band-aids, and b) you're still getting cut.
I prefer to deal with the root of the issue, which in your case is that multiple URLs can target the same terms depending on the user's (or Googlebot's!) crawl path through your site. I am assuming that you are only putting the canonicals in your XML and HTML sitemaps, by the way?
If I were you, this is how I would tackle your problem:
- Make sure you are only putting the canonical URLs into your XML sitemaps. Start here.
- Do a full crawl of your site and pull all the URLs that canonicalize elsewhere. Then get your log files and see how much time the search engines are spending on those canonicalized URLs.
- Also check that Google is actually respecting all of your canonicals! At this scale of canonicals, I'd expect that they semi-often don't respect them and you are still dealing with duplicate content issues. But again, that's just a hunch I have.
- From there, talk with your engineers/designers/etc. about how much work is involved and make a decision about whether you think the change is worthwhile.
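The crawl-versus-logs check above can be scripted. A minimal Python sketch, with some assumptions: the crawl export has "Address" and "Canonical Link Element 1" columns (as Screaming Frog's does), the access log is in combined format, and the file names and host are placeholders:

```python
import csv
import re

def canonicalised_urls(crawl_csv):
    """Return the set of crawled URLs whose canonical points somewhere else."""
    urls = set()
    with open(crawl_csv, newline="") as f:
        for row in csv.DictReader(f):
            canonical = row.get("Canonical Link Element 1", "")
            if canonical and canonical != row["Address"]:
                urls.add(row["Address"])
    return urls

# Matches the request line inside a combined-format access log entry.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"')

def wasted_hits(log_path, canonicalised, host="https://www.client.com"):
    """Count Googlebot requests that landed on canonicalised URLs."""
    hits = 0
    with open(log_path) as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            m = LOG_LINE.search(line)
            if m and host + m.group("path") in canonicalised:
                hits += 1
    return hits
```

If wasted_hits comes back as a large share of total Googlebot requests, that's your crawl budget argument for the uniform URL structure, in numbers your engineers can't argue with.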
I am **always** a fan of eliminating pages that are canonicalized and not serving a purpose (the exception being something like a PPC landing page that is deliberately canonicalized and noindexed; you don't want to remove that one). My suspicion in your case, as well, is that having /brand/mens won't convert any differently from /mens/brand.
At the end of the day, you need to decide how you want your site organized and whether your customers (the people buying things on the site) prefer to shop by brand or by gender/sport/whatever. That will help you decide how to architect your URLs and your site's flow.
Hope that helps!
John
-
Reducing the number of pages that search engines need to crawl is definitely the right way to go, so yes, I would get a uniform URL structure in place if possible and cut down on that crawl budget waste.
-
Thanks for your response, Sean. I do know that the use of canonicals is correct here.
My question, though, is whether it would be better to reduce the number of actual pages (introduce a uniform URL structure, so to speak), because this would drastically reduce the number of pages the Google crawler needs to crawl (by over 65% on some of my clients' webshops). As far as I know, they do crawl every canonicalized URL?
-
It does sound like you're taking a good approach to canonicals. A lot of sites out there take the same approach with non-uniform URL structures like the one you're using.
Don't suppose you could supply the URL so I can have a look?