Filtered Navigation, Duplicate content issue on an Ecommerce Website
-
I have navigation that allows multiple levels of filtering. What is the best way to prevent search engines from seeing this duplicate content? Is it still a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.
For example, you have a page that lists 12 products out of 100:
companyname.com/productcategory/page1.htm
And then you filter these products:
companyname.com/productcategory/filters/page1.htm
The filtered page may or may not contain items from the original page, but it does contain items that appear in the unfiltered navigation pages. How do you help search engines determine which page to crawl and index for these products?
I can't use rel=canonical, because the exact set of products on the filtered page may not appear on any unfiltered page. What about using robots.txt to block all the filtered pages? Would that also stop PageRank from flowing? What about a meta noindex tag on the filtered pages?
I have also considered removing the filters entirely, but I'm not sure sacrificing usability is worth it just to remove duplicate content. I've read a bunch of blogs and articles, and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
-
Hi Dstrunin,
I would still use the rel=canonical tag, with or without the filter in place. So if you have a list of products displayed unfiltered at companyname.com/productcategory/page1.htm, I would add a rel=canonical pointing at companyname.com/productcategory/page1.htm. For the filtered results at companyname.com/productcategory/filters/page1.htm, the canonical tag would still point to companyname.com/productcategory/page1.htm.
It doesn't hurt to have a canonical tag point to the same page it's on.
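As a sketch, using the example URLs from the question (hypothetical markup, not from any real site), the `<head>` of each page might contain:

```html
<!-- On the unfiltered page: companyname.com/productcategory/page1.htm -->
<!-- A self-referencing canonical is harmless and makes the preferred URL explicit -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />

<!-- On the filtered page: companyname.com/productcategory/filters/page1.htm -->
<!-- Same canonical, telling search engines the unfiltered page is the preferred version -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />
```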
If you can't do that, I would meta noindex those filtered pages and remove the robots.txt rules. Robots.txt doesn't tell Google they can't index a page; it only says they can't crawl it. So they could still index old content they crawled before you added the robots.txt rules, or index the URL based on its title tag and inbound links alone.
Casey
-
I have been doing that, but robots.txt only does so much. I've implemented the meta noindex tag as well, and it doesn't seem to be removing all the pages from the index.
-
My unprofessional opinion would be to use robots.txt on some areas. I'll also be interested to see what the pros here say.