What is the best solution for printable product pages (duplicate content)?
-
What do you think is the best solution for preventing duplicate content issues on printable versions of product pages? The printable versions are identical in content.
Disallow in Robots.txt?
Meta Robots No Index, Follow?
Meta Robots No Index No Follow?
Rel Canonical?
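For reference, the two meta-robots options from the list look like this in the `<head>` of the printable page — a sketch, not tied to any particular platform:

```html
<!-- Option: noindex, follow — keeps the printable page out of the index,
     but links on it can still pass value -->
<meta name="robots" content="noindex, follow">

<!-- Option: noindex, nofollow — also tells crawlers not to follow
     the links on the page -->
<meta name="robots" content="noindex, nofollow">
```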
-
I think I will go with canonical.
Just a thought: why would I put all the files in a subdirectory instead of just blocking products.php?printable=Y?
I guess it would amount to the same thing?
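Either approach can be expressed in robots.txt — a sketch, with illustrative paths (note that wildcard support varies by crawler; Google and Bing honour `*` and `$`, but the original robots.txt spec does not include them):

```text
User-agent: *
# Block a dedicated print directory...
Disallow: /print/
# ...or block the query-string version directly (wildcard pattern)
Disallow: /*?printable=Y
```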
-
If you don't want to use the canonical method (which is probably the best), you could always put the printable versions in a /print directory blocked by robots.txt, nofollow the links to them, and noindex the pages. That should work as well, but it could waste link juice if people ever link to the printable version.
Cheers,
Vinnie
-
I second Marcus; canonical is the way to go.
-
Canonical - this way, if you pick up links to the print pages the value from the links will be passed to the full version of the page.
That's my tuppence (2 cents) at least!
Cheers
Marcus
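In practice that just means the printable page declares the full version as its canonical in the `<head>` — a sketch, with an illustrative URL:

```html
<!-- On products.php?printable=Y (URL is illustrative) -->
<link rel="canonical" href="http://www.example.com/products.php?id=123">
```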