Are tags creating a duplicate content issue?
-
Hello, I believe a lot of us use tags on our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) causes duplicate content.
For example, if one article has two tags like "SEO" & "Marketing", then the article will be visible and listed at two URLs within the blog, like this
In the case of a blog with 300+ posts and dozens of different tags, this creates a huge issue.
My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
-
I have had different meta content for a long time, yet it still shows as a duplicate, and just looking at the body content, it is identical. Is there any quick way I can manually add something to the robots file to take the duplicates away? Canonical is not working for me, as it just points to the same URL, not the MAIN one you want. So there is nothing as good as Yoast for Joomla; they should make that and make a lot of money! Out of the box, Joomla is poor at SEO: if you don't know how to make menus in Joomla, your site can have massive issues. Without a tool like Moz you may never know why your quality content can't rank. Gee, thanks, Joomla.
-
Hi
Ahhhh... gotcha, I thought it was WordPress.
Your best bet is to have a unique description generated in Joomla for each tag archive. Robots.txt won't necessarily remove the URLs from Google; if you want to deindex them, you need to use the meta robots tag.
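For anyone implementing this, the meta robots tag goes in the `<head>` of each tag archive page you want deindexed (this is a generic HTML sketch, not Joomla-specific markup):

```html
<!-- In the <head> of each tag archive page you want out of the index. -->
<!-- "noindex" removes the page from search results; "follow" still lets -->
<!-- crawlers follow the links to the posts listed on the page. -->
<meta name="robots" content="noindex, follow">
```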
Anyhow, hope that got you in the right direction!
-Dan
-
Dear Dan,
Thank you so much for spending time on our issue and for the advice. I'm looking forward to reading your article.
Unfortunately, our blog, for technical reasons, is not in WordPress but in Joomla, so I will look for a similar solution there. The desperate solution, I guess, is to disallow tag URLs in robots.txt, but I would try to avoid that. On the other hand, since I also use categories to index the content, I assume this will not create any issue of hiding content.
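If I do end up going the robots.txt route, my understanding is the rule would look something like this (the /blog/tag/ path is just an illustration; the real Joomla tag URL structure may differ). Note that it only blocks crawling and won't remove URLs that are already indexed:

```
# robots.txt - stop crawlers from fetching tag archive URLs
# (the path below is hypothetical; match it to your actual tag URLs)
User-agent: *
Disallow: /blog/tag/
```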
-
Hey Guys
Again, whether full posts or excerpts are shown for tag archives is important (I would vote for excerpts), but see my answer above. The tag archives all have the same description. That's where Moz is likely getting the duplicate errors from, and not likely because tag pages are similar to post or category pages.
The quick fix on this is to use an SEO plugin like Yoast and create a description template for the tag archives.
But the BEST case scenario, in a perfect setup, would be to have tags totally unique from categories, and not to index tag archives at all.
Canonicals should only be used sparingly and when no other measure can be taken.
It also seems this is not the best theme, so there are other issues at play as well, too many to go through in a Q&A format.
-Dan
-
Hi
Just want to add my two cents to this: a canonical should really be the last resort, used only if the issue can't be resolved with meta robots, URL structure, or content.
The issue here is that Moz is bringing back duplicate content errors because the tags all have the same description. This can be fixed (as noted in my full answer) by creating a description template for tag archives with a plugin like Yoast SEO.
The canonical may not resolve anything because the tag pages at best shouldn't be indexed to begin with - and if they are indexed, the descriptions should be unique.
-Dan
Edit: just realized they are using Joomla. The same approach can apply, but I'm not as familiar with Joomla, so if there's a way to create descriptions for the tags within Joomla, that's still the best bet.
-
Hi!
Just need to clear things up here, sorry I'm a little late to responding!
1. Quick Fix - Create a description template for tag archives
You're getting duplicate errors because your tag archives all have the same meta description. Use an SEO plugin like Yoast SEO for WordPress (or an equivalent for Joomla) and create a template for your tag descriptions. This will give each tag archive a unique description and eliminate the duplicate errors.
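For example, in Yoast the tag-archive description template can be built from snippet variables, so each tag archive gets its own description automatically (the wording below is just a sample):

```
Posts tagged "%%term_title%%" on %%sitename%%: articles, tips and guides about %%term_title%%.
```

Yoast swaps %%term_title%% for the tag name and %%sitename%% for your site name when each archive page is rendered.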
2. Long Term Fix - Root of The Problem
The real ROOT of the issue is a combination of (possibly) a poor theme, no SEO plugin (that I can see), and tag pages being used incorrectly.
-
Tags should be completely different from any categories
-
And as standard practice, I NOINDEX tags, because their content is so similar to other pages, and it also may not be the best user experience. There may be exceptions to this, but it's a general rule I follow.
Now, with that said, don't just go deindexing your tag archives.
Tomorrow (May 8th, 2012), I have an extensive article going up on the Moz blog about WordPress and duplicate content. I suggest reading that article to get a good understanding of how all the elements work together. Perhaps in the long term you can work towards a more robust WordPress setup, but for now, no harm done the way it is.
Hope that helps!
-Dan
Edit - Realized they are using Joomla. The same concepts apply, but with a technical implementation that works with Joomla (which I am not as familiar with).
-
I think you should be good leaving it alone, then.
You could put rel=canonical on the post page only (don't put it on the tags or category pages) but that might be more trouble than it's worth, depending on the restrictions imposed by the CMS.
-
I don't believe the actual tag pages are the issue here. It's the fact that the same page can be accessed at 3 different URLs because of the tags it's under. Canonical links will take care of this.
-
I am not sure if it's possible with the publishing system you are working with, but there are CMS systems on the market that have solved this issue.
They take the following approach:
Create your main article, blog post, etc., tag it with your keywords, and on the keyword page show the article content as a teaser with a "Read More" link to the full content page.
This is not considered duplicate content!
Hope this helps!
-
Hi Pantelis,
I think that whether or not this is a problem, and how it should be fixed, depends on how your blog is set up.
The guide Justin mentioned is a good resource. Before you jump in, I think you should consider these questions:
When you go to domain.com/blog/seo etc., are the posts excerpted, or are full posts being displayed?
When someone clicks on the title of a blog post having found it under a tag (e.g. going to domain.com/blog/marketing and clicking on one of the posts) what URL is being displayed for the individual post?
e.g. is it domain.com/blog/seo/great-post-1 or is it domain.com/blog/great-post-1 ?
What really matters for duplicate content and canonicalization is whether the URL for the individual blog post is unique.
If the blog post has one unique URL, no matter how you get to it, and if the tag pages are displaying excerpts, then the only place you should be using rel=canonical is on the blog post itself. I think putting rel=canonical on a tag page that's only displaying titles and excerpts is asking for trouble. I don't like the idea of the search engines potentially thinking that your tag page, which has partials of many posts, is the original source.
If you're displaying full blog posts on the tags pages, then the solution is probably to switch it to excerpts and canonicalize only the individual blog posts.
For reference, see the SEOmoz blog: the SEOmoz.org/blog page doesn't use rel=canonical and only displays excerpts, while seomoz.org/blog/post-title uses rel=canonical and displays the full post.
-
It's not really bad, but there is every chance it will affect your rankings, as Google will not know which page is dominant and, in turn, will not know which version it should show to searchers.
The best method of resolving the issue is to use the rel=canonical tag, as it allows you to tell Google which page is the dominant version.
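To make that concrete, the canonical is a link element in the `<head>` of each duplicate URL, pointing at the one version you want ranked (the URLs below are made-up examples along the lines of the ones discussed above):

```html
<!-- On every URL where the post appears, e.g. -->
<!-- domain.com/blog/seo/great-post-1 and domain.com/blog/marketing/great-post-1, -->
<!-- point search engines at the single preferred URL: -->
<link rel="canonical" href="https://domain.com/blog/great-post-1">
```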
see article here for more details: