Avoiding Duplicate Content - Same Product Under Different Categories
-
Hello, I am taking some past advice and cleaning up my content navigation to show only 7 tabs as opposed to the 14 currently showing at www.enchantingquotes.com. I am creating a "Shop by Room" and a "Shop by Category" link, which is what my main competitors do. My concern is the duplicate content that will result, since the same item will appear in both categories. Should I nofollow the "Shop by Room" page? I am confused as to when I should use nofollow as opposed to a canonical tag. Thank you so much for any advice!
-
It helps tremendously. Thank you so very much!
-
You need it in the <head> of each page that is a duplicate, pointing to the "original" page, i.e. the page you want indexed.
I'll just grab the same example from the help page -
Let's say you had some product URLs that were the same -
http://www.example.com/product.php?item=swedish-fish&trackingid=1234567&sort=alpha&sessionid=5678asfasdfasfd
http://www.example.com/product.php?item=swedish-fish&trackingid=1234567&sort=price&sessionid=5678asfasdfasfd
We would put the following on the duplicate pages -
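<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />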
So in your case you may want to put it on your "Shop by Room" pages, pointing towards the matching "Shop by Category" pages (or vice versa).
This would then tell Google to index only one of the tabs (either the room or the category tab) and treat the other as a duplicate. Hope that helps a bit. You can always take a look at the Google guide if you need a bit more help.
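For example, if the same product ended up at both of these URLs (made-up paths, just to illustrate):
http://www.enchantingquotes.com/shop-by-room/kitchen/gather-here-quote
http://www.enchantingquotes.com/shop-by-category/kitchen-quotes/gather-here-quote
then the "Shop by Room" version would carry this in its <head>:
<link rel="canonical" href="http://www.enchantingquotes.com/shop-by-category/kitchen-quotes/gather-here-quote" />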
-
I'm sorry to bother you again... but do I need to add the canonical tag to each product page, or will adding it just once to the main category page suffice? Thanks
-
Thank you so much for clearing this up for me. You've been a huge help and I appreciate it so much.
-
There used to be a school of thought that you could hoard all your link juice internally, but in reality it's not worth it. You could nofollow the link to the duplicate page and hope that the majority of any links go to the correct page, but with rel=canonical in place any link juice will flow to the correct place anyway.
In short, once you've set up rel=canonical, build the rest of the site for users and the rest will come. "Endorsing" may not have been the best choice of words. Any questions, let me know and I'll do my best to explain further. (You can also take a look at the "nofollow" link above for more info.)
-
I really appreciate this help - one last question, as I am a bit confused about the link juice. I thought I would not want to pass link juice to these duplicate pages, but it looks like that is actually okay because it shows I am endorsing the pages? Thank you
-
Awesome - thank you SOOOO much, Chris!
-
Hello Cindy,
To stop the duplicate content, use the rel=canonical tag. It tells Google which page is the original content, so Google will then only index one of the tabs.
Nofollow tells Google that you do not want to let "link juice" flow through that link and that you're not really endorsing it.
Rel=canonical tells Google you have duplicate content, but that one of the pages is the original - the single one you want indexed.
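In markup terms (placeholder URLs, just to show the difference): nofollow is an attribute on the link itself, while the canonical tag sits in the <head> of the duplicate page.
<a href="http://www.example.com/duplicate-page" rel="nofollow">Duplicate page</a>
<link rel="canonical" href="http://www.example.com/original-page" />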
Good luck.