Duplicate content question...
-
I have a high duplicate content issue on my website, and I'm not sure how to handle or fix it. I have two different URLs landing on the same page content: http://www.myfitstation.com/tag/vegan/ and http://www.myfitstation.com/tag/raw-food/. In this situation, I cannot redirect one URL to the other, since in the future I will probably be adding additional posts to either the "vegan" tag or the "raw food" tag. What is the solution in this case?
Thank you
-
Thank you SO much for your helpful tips and advice!
-
I think Chris had it right in the first response. Use canonical tags on any duplicate content posts, pointing to the one version you want to be the canonical version (so the canonical tag on both versions needs to have the same canonical URL). You don't have to use canonical tags on the unique pages, but you can if you want to; just set the canonical URL to the URL of that post.
The tag pages themselves don't matter. Don't worry about them; they won't cause you any issues. It's the posts you need to be thinking about. You also don't need to de-index anything. The canonical tags on the post pages will fix the issue, and you don't have to do anything about the tag pages.
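As a minimal sketch of what that looks like (the post URL below is just a placeholder, not an actual page on the site), every duplicate version of the post would carry the same tag in its <head>:
<!-- hypothetical example: identical canonical tag on each duplicate post -->
<!-- both duplicates point at the one version chosen as canonical -->
<link rel="canonical" href="http://www.myfitstation.com/sample-preferred-post/" />
A unique post that canonicalizes to itself would simply use its own URL in the href, as described above.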
-
Tag archives shouldn't be canonicalized, because the hope is that you will be adding more content to both over time, which would negate the relevance of the canonical. Most people seem to noindex their tag archives, which is a fine solution considering tag archives have negligible benefit in search. You could keep them indexed if you prefer, and the duplicate content issue will be solved once you add more content to each. Also, keep in mind that Matt Cutts said in a recent video that duplicate content will not penalize you if it is non-spammy; it just won't rank well (or at all) while duplicated.
-
Hi,
Since you are planning on adding other articles to each tag, I don't think a canonical tag is the best way to go, since the pages are not (and are not going to remain) true duplicate content. The main problem is the use of tags, which usually produce pages whose content is available elsewhere on the site and which therefore have nothing really unique about them (except for the tag). Noindexing is one solution; check out these discussions for more details and some related links:
http://moz.com/community/q/dupe-content-canonicalize-the-wordpress-tag-or-noindex
http://moz.com/community/q/noindex-tags-in-wordpress
-
Even better, just no-index your tag pages.
Depending on what CMS you're using (WordPress?), it won't be easy implementing a rel=canonical tag on a tag page.
It's general practice to noindex category and tag pages to avoid duplicate content issues.
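As a rough sketch (assuming you can edit the theme's header template or have an SEO plugin output this for you), noindexing a tag or category archive means its <head> contains a robots meta tag along these lines:
<!-- hypothetical example: keeps the archive out of the index but lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
In practice, most WordPress SEO plugins expose this as a per-archive-type setting, so you rarely need to hard-code it.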
-
Hello,
rel=canonical will fix this - https://support.google.com/webmasters/answer/139394?hl=en
What it does is tell Google "this page is a duplicate," so Google will only index one page. In the head of each duplicate, you would put the tag pointing to the page you want ranked.
Matt Cutts has said “I wouldn’t stress about this unless the content that you have duplicated is spammy or keyword stuffing.”
So once you've done the canonical, I wouldn't stress a great deal about the duplicate content (though I still recommend implementing the rel=canonical tag when possible).
Hope that helps a bit.
Good luck!
-
Related Questions
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content I keep getting on my Moz reports. The main area where this comes up is duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of duplicates occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated and to copy the text that followed the truncation and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question is whether anyone knows a more efficient way of identifying and editing duplicate pages and title tags?
Technical SEO | Prop650
-
Simple duplicate content query
Hello Community, One of my clients runs a job board website. They are having some new framework installed, which will lead to them having to delete all their jobs and re-add them. The same jobs will be re-posted but with a different reference number, which in turn will change each URL. I believe this will cause significant duplicate content issues; I just thought I would get a second opinion on best practice for approaching a situation like this. Would a possible solution be to delete jobs gradually and 301 redirect old URLs to new URLs? Many thanks in advance, Adam
Technical SEO | SO_UK0
-
Avoiding Cannibalism and Duplication with content
Hi, For the example I will use a computer e-commerce store... I'm working on creating guides for the store:
How to choose a laptop
How to choose a desktop
I believe that each guide will be great on its own and that it answers a specific question (meaning that someone looking for a laptop will search specifically for laptop info, and the same goes for desktop). This is why I didn't create a "How to choose a computer" guide. I also want each guide to have all the information and not to start sending the user to secondary pages in order to fill in missing info. However, even though there are several details that are different between laptops and desktops, like the importance of weight, screen size, etc., a lot of the things on the checklist (like deciding on how much memory is needed, graphics card, core, etc.) are the same. Please advise on how to pursue this. Should I just write two guides and make sure that the same duplicated content ideas are simply written in a different way?
Technical SEO | BeytzNet
-
Duplicate Content Question (E-Commerce Site)
Hi All, I have a page that ranks well for the keyword “refurbished Xbox 360”. The ranking page is an eCommerce product details page for a particular XBOX 360 system that we do not currently have in stock (currently, we do not remove a product details page from the website, even if it sells out – as we bring similar items into inventory, e.g. more XBOX 360s, new additional pages are created for them). Long story short, given this way of doing things, we have now accumulated 79 “refurbished XBOX 360” product details pages across the website that currently, or at some point in time, reflected some version of a refurbished XBOX 360 in our inventory. From an SEO standpoint, it’s clear that we have a serious duplicate content problem with all of these nearly identical XBOX 360 product pages. Management is beginning to question why our latest, in-stock, XBOX 360 product pages aren't ranking and why this stale, out-of-stock, XBOX 360 product page still is. We are in obvious need of a better process for retiring old, irrelevant (product) content and eliminating duplicate content, but the question remains, how exactly is Google choosing to rank this one versus the others since they are primarily duplicate pages? Has Google simply determined this one to be the original? What would be the best practice approach to solving a problem like this from an SEO standpoint – 301 redirect all out of stock pages to in stock pages, remove the irrelevant page? Any thoughts or recommendations would be greatly appreciated. Justin
Technical SEO | JustinGeeks0
-
Duplicate content - WordPress image attachment
I have run my SEOmoz campaign on my WordPress site and found duplicate content. However, all of this duplicate content was either my logo or images, not real page content, with addresses like /?attachment_id=4 for example. How should I resolve this? Thank you.
Technical SEO | htmanage0
-
Mod Rewrite question to prevent duplicate content
Hi, I'm having problems with a mod rewrite issue and duplicate content. On my website I have:
1. Website.com
2. Website.com/directory
3. Website.com/directory/Sub_directory_more_stuff_here
Both #1 and #2 are the same page (I can't change this). #3 is different pages. How can I use mod rewrite to make #2 redirect to #1 so I don't have duplicate content WHILE #3 still works?
Technical SEO | kat20
-
Aspx filters causing duplicate content issues
A client has a URL which is duplicated by filters on the page; for example, http://www.example.co.uk/Home/example.aspx is duplicated by http://www.example.co.uk/Home/example.aspx?filter=3. The client is moving to a new website later this year and is using an out-of-date Kentico CMS which would need some development work in order to enable implementation of rel=canonical tags in the header. I don't have access to the server, and they have to pay through the nose every time they want the slightest thing altered. I am trying to resolve this duplicate content issue in the meantime and am wondering what is the best way to resolve it in the short term. The client is happy to remove the filter links from the page, but that still leaves the filter URLs in Google. I am concerned that a 301 redirect will cause a loop, and I don't understand the behaviour of this type of code well enough. I hope this makes sense; any advice appreciated.
Technical SEO | travelinnovations0
-
What are some of the negative effects of having duplicate content from other sites?
This could include republishing several articles from another site with permission.
Technical SEO | Charlessipe0