WordPress: Tags generate duplicate content - just delete the tags!?
-
When I ask people, they say tags are bad and spammy, and as far as I can see they generate all of my duplicate page content issues.
So the big question is: why does Google so often prefer to show these tag URLs in the SERPs... it can't be that bad, right? :)))
Then, after some research, I found the "Term Optimizer" on Yoast.com... it should help with exactly this problem, but it seems to no longer be available?
So maybe there is another plugin that can help... or should I just delete all tags from my blog and set up permanent redirects?
Is this the solution?
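(If the delete-and-redirect route is taken, here is a minimal sketch of what such a redirect could look like in WordPress, assuming the default /tag/ permalink base and that the homepage is an acceptable landing page; it is illustrative only, not a drop-in fix, and would live in a small plugin or the theme's functions.php.)

```php
<?php
// Sketch: after all tags are deleted, old /tag/... URLs will return 404.
// This hooks template_redirect and 301-redirects those requests to the
// homepage instead. Assumes the default "/tag/" permalink base.
add_action( 'template_redirect', function () {
	if ( is_404() && 0 === strpos( $_SERVER['REQUEST_URI'], '/tag/' ) ) {
		wp_safe_redirect( home_url( '/' ), 301 );
		exit;
	}
} );
```
-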
I don't understand the question. Perhaps submit a new question to create a new thread, and elaborate on the issue.
-
It should take care of the duplicate content issue as it relates to tag pages, but it will not show as a 404, because the page will still be there; it is just removed from Google's index.
If you adjusted the settings in Yoast, it should apply to all future posts and tags.
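(For what it's worth, here is one hypothetical way to confirm that a noindexed tag archive still resolves with a 200 rather than a 404, using WordPress's HTTP API; the tag slug below is just a placeholder.)

```php
<?php
// Hypothetical check: a noindexed tag page is only dropped from the index,
// not from the site, so it should still return HTTP 200 rather than 404.
$response = wp_remote_head( home_url( '/tag/example-tag/' ) ); // placeholder slug
if ( ! is_wp_error( $response ) ) {
	echo wp_remote_retrieve_response_code( $response ); // expect 200
}
```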
-
My blog provides articles for technical queries and repair options. Many keywords call for the same sort of steps, and that is causing problems for us now.
CASE 1: We have replaced the keywords and created the same content for all posts.
CASE 2: To solve the above SEO loophole, I have added tags for the relevant steps that follow the same troubleshooting.
Now the tags are bothering me again.
What should I do in this situation?
-
Hi,
I also have this problem with our site, with 50 duplicate pages recorded in Moz. If I set noindex, follow in Yoast, will it create 404s, or will I only get a 404 if I remove the tags completely?
Also, is there a way to set this up for all future blog posts in Yoast, or will I need to do this every time I submit a new post?
Thanks
Jeh
-
I said the same thing! I couldn't figure out why it wasn't the default either. We've had the blog up since before I worked for my current company, and so there were a bunch of duplicate pages I'm trying to fix now... frustrating!
Best of luck,
Tyler
-
Hi Tyler,
yes, I ended up simply checking the "noindex, follow" option for tags within the Yoast plugin (Taxonomies tab).
Now all the duplicate content issues are gone... so why is this option not set by default? Hmmm.
Thanks
Holger
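(For anyone not using Yoast, a rough equivalent of that checkbox can be wired up with WordPress core's wp_robots filter, available since WordPress 5.7; this is only a minimal sketch, not a replacement for the plugin.)

```php
<?php
// Sketch: mark tag archives as "noindex, follow" without an SEO plugin.
// Requires WordPress 5.7+ for the wp_robots filter.
add_filter( 'wp_robots', function ( $robots ) {
	if ( is_tag() ) {
		$robots['noindex'] = true;
		$robots['follow']  = true;
	}
	return $robots;
} );
```
-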
Hey!
I was just looking into this same issue for myself, and I figured I'd share the URL from Yoast that ended up answering most of my questions. It's in section 3, which gives you some solutions to the duplicate page issue.
Hope this helps,
Tyler
-
Does that validate?
I'm no expert, but I think the trailing slash shouldn't be used on meta elements, except maybe if the declared doctype is XML or XHTML. If you do use it, leave a space just before it, like in your rel="canonical" line, for compatibility's sake (old advice, but just in case; it won't hurt).
And shouldn't there be a space between the comma and "follow" in "noindex, follow"?
-
Yes, thanks,
I'm curious... and hope it works. And yes, it looks fine...
<title>Christo | INLINEAR Digital Marketing & Brasilien Blog</title>
-
Yes, that happens sometimes. Just make sure the plugin has implemented the tags correctly and that the head of your pages doesn't contain any other misplaced meta tags or invalid code, since that could cause the issue.
Then give it a try; it usually works. You'll have to wait until the site is re-crawled though, so give it some time. If it still doesn't work, you could take more drastic measures.
Good luck, and follow Alan's advice of using 'noindex, follow' in your meta tag if you want to make sure you keep the link flow.
-
Hi Branagan,
yes, I can set noindex, follow in the Yoast plugin, but it seems that Google does not always obey this meta tag? See this comment from a user on a blog:
Peter Hinson October 25, 2012 at 7:22 am
Hi, I’ve been using Yoast SEO for years now, but unfortunately it does not always prevent Google from indexing pagination.
Matt Cutts, the Google technical expert and representative, says in his video that the 'noindex' meta tag is a weak method of trying to prevent Google from indexing a page, since Googlebot will not always obey this tag.
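(If there is concern that the meta tag alone is not being respected, one complementary option, shown here only as a sketch, is to also send the directive as an X-Robots-Tag HTTP header on tag archives.)

```php
<?php
// Sketch: send the same directive as an HTTP response header on tag archives.
// template_redirect fires after the query is parsed but before any output,
// so conditional tags work and headers have not been sent yet.
add_action( 'template_redirect', function () {
	if ( is_tag() && ! headers_sent() ) {
		header( 'X-Robots-Tag: noindex, follow' );
	}
} );
```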
-
If you are using similar tags for almost every post, then yes, it will create duplicate content, but if each post uses its own distinct tags it won't create duplicate content.
For example, I had a movie site where I used tags for the movie cast. Since every movie has a different cast, it never created any duplicate content; that site was an SEO miracle.
-
The sites that have tag pages ranking usually have on-page problems or a penalty. Instead of the intended page, Google usually shows that tag page... or the privacy page... or maybe the keyword is just not competitive and Google has nothing better to put there. I've seen all of these cases.
It's up to you; you can noindex them or do whatever you want. As long as you don't go crazy, putting 1,000 tags on your posts and spamming links to those tag pages, they shouldn't be a problem. At least they haven't been in my case.
-
Make sure that it is a noindex, follow tag so that you get your link juice back.
-
I guess tags are there because they fulfill a function, but if they aren't useful for your users it shouldn't be a problem to erase them, as long as you take the necessary measures not to get tons of nasty 404s. If what worries you is duplicate content, you could just noindex the tag pages; I think most SEO plugins for WP have this option. Otherwise, it shouldn't be hard to include the noindex tag in tag.php, or whatever file generates those pages (see the sketch below). That way you wouldn't have to deprive your users of a useful function (if that's the case, of course).
As for your question about Google's preference, I'm not sure, but couldn't it be because there are many internal pages linking to the tag pages, and with the same anchor text?
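(A minimal sketch of that do-it-yourself route, hooking wp_head instead of editing tag.php directly so the change survives theme updates; it assumes nothing beyond a standard WordPress install.)

```php
<?php
// Sketch: print a robots meta tag on tag archives only.
// Hooking wp_head avoids editing the theme's tag.php template directly.
add_action( 'wp_head', function () {
	if ( is_tag() ) {
		echo '<meta name="robots" content="noindex, follow" />' . "\n";
	}
} );
```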
-
I would get rid of the tags. I don't think users use them; they are more of a gimmick.
Search engines will then be more likely to rank your primary pages, rather than the pages produced by the tags.
Related Questions
-
Does duplicate content not concern Rand?
Hello all, I'm a new SEOer and I'm currently trying to navigate the layman's minefield that is understanding duplicate content issues as best I can. I'm working on a website at the moment where there's a duplicate content issue with blog archives/categories/tags etc. I was planning to beat this by implementing a noindex meta tag on those pages where there are duplicate content issues. Before I go ahead with this I thought: "Hey, these Moz guys seem to know what they're doing! What would Rand do?" Blogs on the website in question appear in full and in date order relating to the tag/category/what-have-you creating the duplicate content problem. Much like Rand's blog here at Moz - I thought I'd have a look at the source code to see how it was dealt with. My amateur eyes could find nothing to help answer this question. E.g. both the following URLs appear in SERPs (using site:moz.com and very targeted keywords, but they're there):
https://moz.com/rand/does-making-a-website-mobile-friendly-have-a-universally-positive-impact-on-mobile-traffic/
https://moz.com/rand/category/moz/
Both pages have a rel="canonical" pointing to themselves. I can understand why he wouldn't be fussed about the category not ranking, but the blog? Is this not having a negative effect? I'm just a little confused, as there are so many conflicting "best practice" tips out there - and now, after digging around in the source code on Rand's blog, I'm more confused than ever! Any help much appreciated. Thanks
Technical SEO | sbridle
-
What could be the cause of this duplicate content error?
I only have one index.htm and I'm seeing a duplicate content error. What could be causing this?
Technical SEO | ScottMcPherson
-
Duplicate Content Issue
SEOmoz is giving me a number of duplicate content warnings related to pages that have an "email a friend" and/or "email me when back in stock" version of a page. I thought I had those blocked via my robots.txt file, which contains the following:
Disallow: /EmailaFriend.asp
Disallow: /Email_Me_When_Back_In_Stock.asp
I had thought that the robots.txt file would solve this issue. Anyone have any ideas?
Technical SEO | WaterSkis.com
-
Instance IDs on "Events" in WordPress causing duplicate content
Hi all. I use Yoast SEO on WordPress, which does a pretty good job of inserting rel=canonical into the header of pages where appropriate, including on my event pages. However, my crawl diagnostics have highlighted these event pages as duplicate content and duplicate titles because of the instance_id parameter being added to the URL. When I look at the page's head I see that rel=canonical is as it should be. Please see here for an example: http://solvencyiiwire.com/ai1ec_event/unintended-consequences-basel-ii-and-solvency-ii?instance_id= My question is: how come SEOmoz is highlighting these pages as duplicate content, and what can I do to remedy this? Is it because ?instance_id= is part of the string on the canonical link? How do I remove this? My client uses the following plugins: "All-in-One Event Calendar by Timely" and Google Calendar Events. Many thanks!
Technical SEO | wellsgp
-
URL query considered duplicate content?
I have a Magento site. In order to reduce duplicate content for products of the same style but with different colours, I have combined them onto one product page. I would like the pictures to be dynamic, i.e. allow a user to search for a colour and have all the products that offer that colour appear in the results, but I don't want the default product image shown; I want the product image for the colour in the query. To do this I have to append a query string to the end of the URL, which produces this result: www.website.com/category/product-name.html?=red My question is: will the query variations then be picked up as duplicate content?
www.website.com/category/product-name.html
www.website.com/category/product-name.html?=red
www.website.com/category/product-name.html?=yellow
Google suggests it has contingencies in its algorithm and I will not be penalised: http://googlewebmastercentral.blogspot.co.uk/2007/09/google-duplicate-content-caused-by-url.html But other sources suggest this is not accurate. Note the article was written in 2007.
Technical SEO | BlazeSunglass
-
Tags and Duplicate Content
Just wondering - for a lot of our sites we use tags as a way of re-grouping articles/news/blogs so that all of the info on, say, 'government grants' can be found on one page. These /tag pages often come up with duplicate content errors. Is it a big issue, and how can we minimise it?
Technical SEO | salemtas
-
Duplicate content error - same URL
Hi,
One of my sites is reporting a duplicate content and page title error. But it is the same page? And the home page at that. The only difference in the error report is a trailing slash.
www.{mysite}.co.uk
www.{mysite}.co.uk/
Is this an easy .htaccess fix? Many thanks TT
Technical SEO | TheTub
-
Aspx filters causing duplicate content issues
A client has a URL which is duplicated by filters on the page, for example: http://www.example.co.uk/Home/example.aspx is duplicated by http://www.example.co.uk/Home/example.aspx?filter=3 The client is moving to a new website later this year and is using an out-of-date Kentico CMS which would need some development work in order to implement rel canonical tags in the header. I don't have access to the server and they have to pay through the nose every time they want the slightest thing altering. I am trying to resolve this duplicate content issue, though, and am wondering what is the best way to handle it in the short term. The client is happy to remove the filter links from the page, but that still leaves the filter URLs in Google. I am concerned that a 301 redirect will cause a loop and I don't understand the behaviour of this type of code well enough. I hope this makes sense; any advice appreciated.
Technical SEO | travelinnovations