eCommerce Filtering's Effect on SEO
-
I'm building an eCommerce website which has an advanced filter on the left-hand side of the category pages.
It allows users to tick boxes for colours, sizes, materials, and so on. When they've made their choices they submit (this will likely be an AJAX thing in a future release, but isn't at the time of writing).
The new filtered page has a new URL, which is made up of the IDs of the filters they've ticked - it's a bit like /department/2/17-7-4/10/
My concern is that the filtered pages are, for the most part, going to be the same as the parent page, which may lead to duplicate content.
My other concern is that these two URLs would lead to the exact same page (although the system would never generate the 'wrong' URL)
- /department/2/17-7-4/10/
- /department/2/**10/**17-7-4/
But I can't think of a way of canonicalising that automatically.
Tricky.
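One option would be to sort the ticked IDs before the URL (or at least the canonical tag) is built, so that every combination of selections has exactly one 'official' form. A rough sketch in Python, with hypothetical function and field names rather than anything from the actual system:

```python
# Rough sketch, hypothetical names: normalise the ticked filter IDs so that
# every combination of selections has exactly one canonical URL, then emit a
# rel="canonical" tag pointing at it.

def canonical_filter_path(department_id, filter_groups):
    """Build a deterministic path such as /department/2/4-7-17/10/ by sorting
    the IDs within each group and the groups themselves."""
    segments = []
    for group in sorted(filter_groups, key=min):
        segments.append("-".join(str(i) for i in sorted(group)))
    return "/department/{}/{}/".format(department_id, "/".join(segments))


def canonical_link_tag(department_id, filter_groups, base="https://www.example.com"):
    """Return the <link rel="canonical"> tag for a filtered category page."""
    return '<link rel="canonical" href="{}{}">'.format(
        base, canonical_filter_path(department_id, filter_groups)
    )


# Both orderings from the question normalise to the same canonical URL:
print(canonical_filter_path(2, [[17, 7, 4], [10]]))  # /department/2/4-7-17/10/
print(canonical_filter_path(2, [[10], [17, 7, 4]]))  # /department/2/4-7-17/10/
```

Whichever order the user's selections arrive in, both /department/2/17-7-4/10/ and /department/2/10/17-7-4/ would then declare the same canonical URL.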
So the meat of the question is this: should I worry about this causing issues with SEO, or can I trust Google to work it out?
-
Andie -
We work on a lot of eCommerce sites with similar left-hand navigation filters.
I think that the thing to keep in mind is that these pages are often like search results pages, and require a human to choose options to create those URLs. As a result, they shouldn't be pages that a typical crawl bot would find.
That said, each eCommerce system acts differently, and it's possible that permanent links are created and added to a sitemap. Or it's possible that Google's bots are starting to check boxes on eCommerce filters to better mimic human behavior. After all, Google has created self-driving cars.
The data-driven approach: I would check whether any of these pages are showing up in Google Webmaster Tools to see if it is, indeed, an issue, before going crazy about duplicate content.
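If you want to put your own numbers behind that before Webmaster Tools shows anything, one complementary check (not a Moz or Google feature, just a log grep) is to see whether Googlebot is actually requesting the filtered URLs at all. A minimal sketch in Python, assuming combined-format access logs and the /department/ URL pattern from the question:

```python
# Minimal sketch, assuming combined-format access logs and the /department/
# URL pattern from the question: count how often Googlebot actually requests
# filtered category pages, to see whether duplicate content is a real issue
# before doing anything drastic.
import re
from collections import Counter

FILTERED_URL = re.compile(r'"GET (/department/\d+/[\d/-]+) HTTP')

def googlebot_filter_hits(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = FILTERED_URL.search(line)
            if match:
                hits[match.group(1)] += 1
    return hits

# Top 20 filtered URLs Googlebot has requested, with hit counts:
for url, count in googlebot_filter_hits("access.log").most_common(20):
    print(count, url)
```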
Hope this helps,
-- Jeff
Related Questions
-
SEO for an Italian customer
Hi everyone, should I include 300 characters for all my customers' page meta descriptions? Some colleagues told me about this but I'm not really sure. Thanks in advance, Marco
On-Page Optimization | | BestSEOItaly3 -
SEO Content Revolution Question
I was wondering if articles written about questions people are asking will help my website rank better. For example, let's say I wrote an article answering the query "What Hair Dye Does Angela Merkel Use?" or "Is Hillary Clinton Thinking of Running for President?", and they rank well on Google, and in turn get viewed a lot by searchers because they answer their queries. Would this help my website as a whole start ranking better? Thanks!
On-Page Optimization | | OOMDODigital0 -
How to exclude URL filter searches in robots.txt
When I look through my Moz reports I can see they've included 'pages' which they shouldn't have, i.e. URLs with filtering rules such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505 How can I exclude all of these filters in the robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax with the $ sign in it? Thanks! (A quick way to test a pattern against sample URLs is sketched below.)
On-Page Optimization | | neenor0 -
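On the robots.txt question above: in Google's robots.txt handling, * matches any run of characters and $ anchors the end of the URL, so Disallow: /*?color=$ would only block URLs that end exactly in ?color=, while dropping the $ blocks anything containing ?color=. A small sketch (a hypothetical helper that approximates that matching with a regex, not an official parser) for testing a rule against sample URLs:

```python
# Hypothetical helper: approximate Googlebot-style robots.txt matching, where
# "*" matches any run of characters and "$" anchors the end of the URL.
import re

def rule_matches(disallow_pattern, url_path):
    regex = re.escape(disallow_pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # turn the trailing "$" back into an anchor
    return re.match(regex, url_path) is not None

url = "/brands?color=364&manufacturer=505"
print(rule_matches("/*?color=$", url))  # False - "$" requires the URL to end right after "color="
print(rule_matches("/*?color=", url))   # True  - matches any URL containing "?color="
```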
Too many on page links - created by filters
I have an ecommerce site and SEOmoz "Crawl Diagnostics Summary" points out that I have too many hyperlinks on most of my pages. The most recent thing I've done that could be the culprit is the creation of a number of product filters. Each filter I put on the page creates a hyperlink off that page. As an example, there's a filter available for manufacturers. Under that, there are 8 new filter links, thus new hyperlinks. On one category there are 60 new links created because of filters. I feel like these filters have made the user experience on the site better BUT they have dramatically increased the number of outbound links off the page. I know keeping it to under 100 is a rule of thumb, but at the same time there must be some validity to trying to limit them. Do you have any recommendation on how I can "have my cake and eat it too"? Thanks for any help!
On-Page Optimization | | jake3720 -
All Category SEO plugin on WordPress - page title not working
I've installed the All Category SEO plugin for WordPress and all works fine except that the page title I input doesn't show. Has anyone else had that problem?
On-Page Optimization | | SamCUK0 -
Would adding a line break tag into the product name affect SEO ranking and Google's ability to read the entire title?
Our client would like to include a line break so that part of the product name always shows up on a second line. Would this affect how Google bots crawl the product name? Would it also affect how Google would show the product name in a search result page? Thanks!
On-Page Optimization | | BrandLabs0 -
Is 302 Redirect a bad thing in SEO terms?
I am getting a lot of "302 (Temporary Redirects) = True" on many of my product URLs. What does it mean? Is it a bad thing to get these redirects? And how do I fix them? Thanks.
On-Page Optimization | | SEOish0 -
Very basic hands-on type of question about SEO
Hello, I am a complete newbie to the world of SEO. I have read a few things available on the net to begin with (the Google SEO guide and then the SEOmoz beginner's tutorial) and I just wanted to ask if I understand the process correctly. So I have my website, let's call it abcd.com, and there I might have a subpage about a certain type of robots or the specific parts these robots are built from. I do my keyword research using the AdWords keyword tool, where I get 10 phrases (phrase1, phrase2, phrase3 and so on) that could be used by users to search for information about the type of robots I wrote about on my website. Let's assume all 10 of those phrases have low competition so they can all be used, and they are from the long tail of course, so let's say I can get 10,000 searches from them per month - some would have 200 searches, some 1,500, some 5,000 per month in that Google AdWords report. After reading those basic tutorials I understand it this way: 1) I put one or two of the best phrases of those 10 in my <title> tag. 2) I describe the website as accurately as possible for those keywords in the meta description, i.e. "Those specific robots are built from those specific parts - we know all about them." 3) I put all of them in the meta keywords: phrase1, phrase2, phrase3, etc. 4) I use one or two phrases in the <h1> on the page with the article. 5) Then I use those phrases in the text on that particular page of my website - the text is about the certain type of robots and parts, etc. 6) I put photos on that page with alt descriptions that may contain some of these phrases. 7) To be honest I don't know how I am supposed to build links from that page about a certain type of robots to anywhere else on my website - but I understand that's part of SEO as well. And that's pretty much how I understand the basics of SEO. I read about it and I just don't know if that's what I am supposed to do. Silly me!
On-Page Optimization | | lolskizz1