How should I manage duplicate content caused by a guided navigation for my e-commerce site?
-
I am working with a company that uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by serving the same products under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know whether there are any best practices for managing this type of navigation. Should I nofollow all of the URLs that have more than one refinement applied to a category, or should I let the search engines go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
-
This was exactly what I was looking for. Thank you very much; you have really helped me out.
-
Hi there,
My former agency has a good post on pagination that you might find useful: http://www.ayima.com/seo-knowledge/conquering-pagination-guide.html
You definitely want to cut down on duplicate content as much as possible - let me know if that post does the trick for the e-commerce question!
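For reference, the standard rel="prev"/"next" markup for a paginated series looks something like this (a minimal sketch with placeholder URLs, not lifted from the guide):

<!-- In the <head> of page 2 of a paginated category (hypothetical URLs) -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">

The first page of a series only carries rel="next" and the last page only rel="prev".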
Cheers
-
Hi David,
I would like to give you an article at hand; maybe you noticed it already? It's hard to give you a recommendation for the refinement levels... in general I would advise you to be very careful with that... what you've done so far doesn't sound so bad to me...
-
You are absolutely right about nofollow overuse being a trust factor. I had not thought about that aspect of the issue; thank you for bringing it up. With regard to using canonical and rel=prev/next, I am not sure what an implementation of this would look like. I added rel=canonical pointing to the www version of the page URL without any unnecessary parameters, and I am familiar with the idea of having a "Show All" page to avoid pagination (we added our pagination parameters into Google Webmaster Tools instead). Would you recommend using canonical to roll results pages up to a category and parent refinement level, and if so, how many refinements would you allow before drawing the line?
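To illustrate, what I added looks roughly like this (example.com and the parameter names are placeholders for our real URLs):

<!-- On a refined URL such as http://www.example.com/category?color=red&sort=price -->
<link rel="canonical" href="http://www.example.com/category">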
Thank you again,
David
-
The only differentiation (if there is any) you can make when it comes to duplicate content (DC) is between partial and "normal" DC... keep in mind that any type (!!!) of DC won't do your site any good! Avoid DC whenever and wherever you can, under all circumstances... I do not know Endeca, but dealing with DC caused by a navigational structure is a serious problem, especially within a shop system.
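To make the problem concrete: with a guided navigation the same product list can surface under many URL variations, for example (hypothetical URLs):

http://www.example.com/shoes?color=red&brand=acme
http://www.example.com/shoes?brand=acme&color=red
http://www.example.com/shoes/red/acme

To Google every one of these is a separate URL, but the content is nearly identical... that is the DC you have to fight.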
There are different ways to fight DC or to confine it... the most common are rel=prev/next and rel=canonical... these are alternatives and never perfect solutions, but there are lots of scenarios where they are a big help.
I would be careful with follow and nofollow... if you let the robot follow everything, this might lead to lots of errors in the scenario you describe, but on the other hand, setting many URLs to nofollow can also harm your site because it's not a very trustworthy signal for Google.
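For clarity, a link-level nofollow looks like this (hypothetical markup):

<!-- A nofollowed refinement link -->
<a href="/shoes?color=red&size=10" rel="nofollow">Red, size 10</a>

Doing this to a large share of your internal links is exactly the kind of signal I would not want to send.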