Tags creating duplicate content issues?
-
Hello, I believe a lot of us use tags on our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) creates duplicate content.
For example, if one article has two tags like "SEO" & "Marketing", then this article will be visible and listed at two URLs inside the blog, e.g. domain.com/blog/seo and domain.com/blog/marketing.
For a blog with 300+ posts and dozens of different tags, this creates a huge issue.
My questions are: 1. Is this really bad? 2. If yes, how can I fix it without removing tags?
-
I have had different meta content for a long time, but the pages still show as duplicates, and looking at the body content alone it is identical. Is there a quick way I can manually add something to the robots file to take the duplicates away? Canonical is not working for me, as it just points to the same URL - not the MAIN one you want. There is nothing as good as Yoast for Joomla; they should make that and make a lot of money! Out of the box, Joomla is poor at SEO, and if you don't know how to build menus in Joomla your site can have massive issues. Without a tool like Moz you may never know why your quality content can't rank - gee, thanks, Joomla.
-
Hi
Ahhhh... gotcha, I thought it was WordPress.
Your best bet is to have a unique description generated in Joomla for each tag archive. Robots.txt won't necessarily remove the URLs from Google; if you want to deindex them, you need to use a meta robots tag.
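To spell that out, I mean something along these lines in the <head> of each tag archive page (just a sketch - "noindex, follow" keeps the page out of the index while still letting crawlers follow its links):
<!-- in the <head> of each tag archive page -->
<meta name="robots" content="noindex, follow">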
Anyhow, hope that got you in the right direction!
-Dan
-
Dear Dan,
Thank you so much for spending time on our issue and for the advice. I'm looking forward to reading your article.
Unfortunately, for technical reasons our blog is not in WordPress but in Joomla, so I will look for a similar solution there. The desperate solution, I guess, is to disallow tag URLs in robots.txt, but I would try to avoid that. On the other hand, since I also use categories to organize the content, I assume this will not create any issue of hiding content.
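If I did go that route, I assume the rule would be something like the lines below (assuming the tag archives lived under a common path such as /blog/tag/, which I would need to check against our actual URL structure):
# hypothetical robots.txt rule - adjust the path to the real tag URL structure
User-agent: *
Disallow: /blog/tag/
Though I understand a Disallow only blocks crawling and may not remove URLs that are already indexed.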
-
Hey Guys
Again, whether full posts or excerpts are being shown on tag archives is important (I would vote for excerpts), but see my answer above. The tag archives all have the same description. That's where Moz is likely getting the duplicate errors from - not likely because the tag pages are similar to post or category pages.
The quick fix on this is to use an SEO plugin like Yoast and create a description template for the tag archives.
But the BEST case scenario, in a perfect setup, would be to have tags totally unique from categories and not index the tag archives at all.
Canonicals should only be used sparingly and when no other measure can be taken.
It also seems this is not the best theme, so there are other issues at play as well, too many to go through in just a Q&A format.
-Dan
-
Hi
Just want to add two cents to this... a canonical should really be the last resort, used only when the issue can't be resolved with meta robots, URL structure, or content.
The issue here is that Moz is bringing back duplicate content errors because the tags all have the same description. This can be fixed (as noted in my full answer) by creating a description template for tag archives with a plugin like Yoast SEO.
The canonical may not resolve anything because the tag pages at best shouldn't be indexed to begin with - and if they are indexed, the descriptions should be unique.
-Dan
Edit - just realized they are using Joomla. The same can apply, but I'm not as familiar with Joomla, so if there's a way to create descriptions for the tags in Joomla, that's still the best bet.
-
Hi!
Just need to clear things up here - sorry I'm a little late in responding!
1. Quick Fix - Create a description template for tag archives
You're getting duplicate errors because your tag archives all have the same meta description. Use an SEO plugin (Yoast SEO for WordPress, or an equivalent extension for Joomla) and create a template for your tag descriptions. This will give each tag archive a unique description and eliminate the duplicate errors.
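As a rough sketch, a tag description template might look something like the line below (the %%...%% placeholders are Yoast-style replacement variables; in Joomla you'd need an equivalent extension or the per-tag description field):
Posts tagged "%%term_title%%" on %%sitename%% - our latest articles and guides about %%term_title%%.
Each tag archive then renders its own description, which clears the duplicate flags.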
2. Long Term Fix - Root of The Problem
The real ROOT of the issue is a combination of (maybe) a poor theme, no SEO plugin (that I can see), and tag pages being used incorrectly.
-
Tags should be completely different from any categories
-
And as standard practice I NOINDEX tags, because their content is so similar to other pages and it also may not be the best user experience. There may be exceptions to this, but it's a general rule I follow.
Now, with that said, don't just go deindexing your tag archives.
Tomorrow (May 8th 2012), I have an extensive article going up on the Moz blog about WordPress and duplicate content. I suggest reading that article to get a good understanding of how all the elements work. And perhaps in the long term you can work towards a more robust WordPress setup. But for now, no harm done the way it is.
Hope that helps!
-Dan
Edit - Realized they are using Joomla. The same concepts apply, but with a technical implementation that works with Joomla (which I am not as familiar with).
-
I think you should be good leaving it alone, then.
You could put rel=canonical on the post page only (don't put it on the tags or category pages) but that might be more trouble than it's worth, depending on the restrictions imposed by the CMS.
-
I don't believe the actual tag pages are the issue here. It's the fact that the same page can be accessed at three different URLs because of the tags it's under. Canonical links will take care of this.
-
I am not sure if it's possible with the publishing system you are working with, but there are CMS systems on the market that have solved this issue.
They take the following approach:
Create your main article, blog post, etc., tag it with your keywords, and on each keyword (tag) page show the article content only as a teaser with a "Read More" link to the full content page.
This is not considered duplicate content!
Hope this helps!
-
Hi Pantelis,
I think that whether or not this is a problem, and how it should be fixed, depends on how your blog is set up.
The guide Justin mentioned is a good resource. Before you jump in, I think you should consider these questions:
When you go to domain.com/blog/seo etc. are the posts excerpted, or are full posts being displayed?
When someone clicks on the title of a blog post having found it under a tag (e.g. going to domain.com/blog/marketing and clicking on one of the posts) what URL is being displayed for the individual post?
e.g. is it domain.com/blog/seo/great-post-1 or is it domain.com/blog/great-post-1 ?
What really matters for duplicate content and canonicalization is whether the URL for the individual blog post is unique.
If the blog post has one unique URL, no matter how you get to it, and if the tag pages are displaying excerpts, then the only place you should be using rel=canonical is on the blog post itself. I think putting rel=canonical on a tag page that's only displaying titles and excerpts is asking for trouble. I don't like the idea of the search engines potentially thinking that your tag page, which has partials of many posts, is the original source.
If you're displaying full blog posts on the tag pages, then the solution is probably to switch to excerpts and canonicalize only the individual blog posts.
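For illustration, using the hypothetical URL from above, the canonical would be a single line in the <head> of the individual post, pointing at its own unique URL:
<!-- on the individual blog post only -->
<link rel="canonical" href="http://domain.com/blog/great-post-1">
The tag and category pages would carry no canonical at all.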
Reference the SEOmoz blog: The SEOmoz.org/blog page doesn't use rel=canonical, and only displays excerpts, while seomoz.org/blog/post-title uses rel=canonical and displays the full post.
-
It's not really bad, but there is every chance it will affect your rankings, as Google will not know which page is dominant and in turn will not know which version it should show to searchers.
The best method of resolving the issue is to use the rel=canonical tag, as this allows you to tell Google which page is the dominant version.
See the article here for more details.