Filtered Navigation: Duplicate Content Issue on an Ecommerce Website
-
I have navigation that allows for multiple levels of filtering. What is the best way to prevent search engines from seeing this as duplicate content? Is duplicate content still a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.
For example, say you have a page that lists 12 products out of 100:
companyname.com/productcategory/page1.htm
And then you filter these products:
companyname.com/productcategory/filters/page1.htm
The filtered page may or may not contain items from the original page, but it does contain items that appear on the unfiltered navigation pages. How do you help the search engine determine which page it should crawl and index for these products?
I can't use rel=canonical, because the exact set of products on the filtered page may not appear on any single unfiltered page. What about using robots.txt to block all the filtered pages? Would that also stop PageRank from flowing? What about a meta noindex tag on the filtered pages?
I have also considered removing filters entirely, but I'm not sure sacrificing usability is worth it just to remove duplicate content. I've read a bunch of blogs and articles and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
-
Hi Dstrunin,
I would still use the rel=canonical tag whether or not a filter is in place. So if you have a list of products displayed unfiltered at companyname.com/productcategory/page1.htm, I would add a rel=canonical pointing at companyname.com/productcategory/page1.htm. For the filtered results at companyname.com/productcategory/filters/page1.htm, the canonical tag would still point to companyname.com/productcategory/page1.htm.
It doesn't hurt to have a canonical tag point to the same page it's on.
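For reference, the tags would look something like this, using the example URLs from the question above (adjust the hrefs to your actual paths), placed in the `<head>` of each page:

```html
<!-- On companyname.com/productcategory/page1.htm (self-referencing canonical) -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />

<!-- On companyname.com/productcategory/filters/page1.htm
     (canonical points back to the unfiltered page) -->
<link rel="canonical" href="http://companyname.com/productcategory/page1.htm" />
```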
If you can't do that, I would meta noindex those filtered pages and remove the robots.txt rules. Robots.txt doesn't tell Google it can't index a page; it only says it can't crawl it. So Google could still index old content it crawled before you added the robots.txt rules, or index the pages based on their title tags alone.
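If you go the noindex route, the tag would look like this in the `<head>` of each filtered page. Keeping "follow" (rather than "nofollow") lets link equity still flow through the page; and note that Google has to be able to crawl the page to see this tag at all, which is another reason the robots.txt block needs to come off:

```html
<!-- In the <head> of each filtered page, e.g.
     companyname.com/productcategory/filters/page1.htm -->
<meta name="robots" content="noindex, follow" />
```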
Casey
-
I have been doing that, but robots.txt only goes so far. I've implemented the meta noindex tag as well, and it doesn't seem to be removing all the pages from the index.
-
My unprofessional opinion would be to use robots.txt on some areas. I'll also be interested to see what the pros here say.