Filtered Navigation: Duplicate Content Issue on an Ecommerce Website
-
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.
For example.
You have a page that lists 12 products out of 100:
companyname.com/productcategory/page1.htm
And then you filter these products:
companyname.com/productcategory/filters/page1.htm
The filtered page may or may not contain items from the original page, but it does contain items that appear in the unfiltered navigation pages. How do you help the search engine determine which page it should crawl and index for these products?
I can't use rel=canonical, because the exact set of products on the filtered page may not appear on any single unfiltered page. What about using robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages?
I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
-
Hi Dstrunin,
I would still use the rel canonical tag, with or without the filter in place. So if you have a list of products displayed unfiltered at companyname.com/productcategory/page1.htm, I would add a rel canonical pointing at companyname.com/productcategory/page1.htm. For the filtered results, companyname.com/productcategory/filters/page1.htm, the canonical tag would still point to companyname.com/productcategory/page1.htm.
It doesn't hurt to have a canonical tag point to the same page it's on.
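As a sketch, using the example URLs from the question (the protocol and exact paths are assumptions), the head of each page would carry something like:

```html
<!-- On the filtered page: companyname.com/productcategory/filters/page1.htm -->
<!-- Canonical points back at the unfiltered category page -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />

<!-- On the unfiltered page itself, a self-referencing canonical is harmless -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />
```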
If you can't do that, I would meta noindex those filtered pages and remove the robots.txt rules. Robots.txt doesn't tell Google it can't index a page; it only says it can't crawl it. So Google could still index pages it crawled before you added the robots.txt rules, or index a URL based on its title and links from elsewhere.
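A minimal sketch of the noindex approach: put this in the head of each filtered page (the "follow" value is an assumption on my part, to let link equity keep flowing through the page's links):

```html
<!-- On each filtered page: keep it out of the index,
     but still allow its links to be followed -->
<meta name="robots" content="noindex, follow" />
```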
Casey
-
I have been doing that, but robots.txt only does so much. I've implemented the meta noindex tag as well, and it doesn't seem to be taking all the pages out of the index.
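One thing worth checking here (assuming the robots.txt block is still live, and using the question's example path): if robots.txt disallows the filtered URLs, Googlebot never fetches those pages, so it never sees the noindex tag, and the URLs can linger in the index. A sketch of the conflicting setup:

```
# robots.txt -- this prevents crawling of the filtered pages...
User-agent: *
Disallow: /productcategory/filters/

# ...so the <meta name="robots" content="noindex"> on those pages
# is never read. Removing the Disallow rule and letting Google
# recrawl is what allows the noindex to take effect.
```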
-
My unprofessional opinion would be to use robots.txt on some areas. I'll also be interested to see what the pros here say.