WordPress: Tags generate duplicate content - just delete the tags!?
-
Asking around, people say tags are bad and spammy, and as far as I can see they generate all of my duplicate page content issues.
So the big question is: why does Google so often prefer to show these tag URLs in the SERPs? It can't be all that bad, then! :)))
After some research I found the "Term Optimizer" on Yoast.com, which should help with exactly this problem, but it seems it is no longer available?
So maybe there is another plugin that can help... or should I just delete all tags from my blog and install permanent redirects?
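If you did go the delete-and-redirect route, a blanket rule could look something like this - just a sketch, assuming Apache with mod_rewrite and the default "/tag/" permalink base, with the blog home as a stand-in redirect target:

```apache
# Hypothetical blanket 301 for all tag archive URLs to the home page.
# Assumes Apache + mod_rewrite and the default "/tag/" permalink base.
RewriteEngine On
RewriteRule ^tag/.*$ / [R=301,L]
```

A more careful version would map each tag to its closest matching post or category instead of dumping everything on the home page.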
Is this the solution? -
I don't understand the question. Perhaps submit a new question to create a new thread, and elaborate on the issue.
-
It should take care of the duplicate content issue as it relates to tag pages, but it will not show as a 404, because the page will still be there; it is just removed from Google's index.
If you adjusted the settings in Yoast, they should apply to all future posts and tags.
-
My blog provides articles for technical queries and repair options. Many of the keywords share the same sort of troubleshooting steps, and that has gotten us into problems now.
CASE 1: We replaced the keywords and created the same content for all posts.
CASE 2: To solve the SEO loophole above, I added tags for the relevant steps that follow the same troubleshooting.
Now the tags are bothering me again.
What should I do?
-
Hi,
I also have this problem with our site, with 50 duplicate pages recorded in Moz. If I set "noindex, follow" in Yoast, will it create 404s, or will I only get a 404 if I remove the tags completely?
Also, is there a way to set this up for all future blog posts in Yoast, or will I need to do this every time I publish a new post?
Thanks
Jeh
-
I said the same thing! I couldn't figure out why it wasn't the default either. We've had the blog up since before I worked for my current company, and so there were a bunch of duplicate pages I'm trying to fix now... frustrating!
Best of luck,
Tyler
-
Hi Tyler,
yes, I ended up simply checking the "noindex, follow" option for tags within the Yoast plugin (Taxonomies tab).
Now all the duplicate content issues are gone... so why isn't this option set by default? Hmmm.
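For reference, with that option checked the tag archive pages should end up serving something like this in their head (the exact output may differ between Yoast versions):

```html
<!-- What a tag archive's head should contain once the option is on -->
<meta name="robots" content="noindex, follow" />
```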
Thanks
Holger -
Hey!
I was just looking into this same issue myself, and I figured I'd share the URL from Yoast that ended up answering most of my questions. Section 3 gives you some solutions to the duplicate page issue.
Hope this helps,
Tyler
-
Does that validate?
I'm no expert, but I think the trailing slash shouldn't be used with meta elements, except perhaps if the declared doctype is XML or XHTML. If you do use it, leave a space just before it, as in your rel="canonical" line, for compatibility's sake (old advice, but just in case; it won't hurt).
And shouldn't there be a space between the comma and "follow" in "noindex,follow"?
-
Yes, thanks.
I'm curious... and hope it works. And yes, it looks fine:
<title>Christo | INLINEAR Digital Marketing & Brasilien Blog</title>
-
Yes, that happens sometimes. Just make sure the plugin has implemented the tags correctly and that the head of your pages doesn't contain any other misplaced meta tags or invalid code, since that could cause the issue.
Then give it a try; it usually works. You'll have to wait until the site is re-crawled though, so give it some time. If it still doesn't work, you could take more drastic measures.
Good luck, and follow Alan's advice of using "noindex, follow" in your tag if you want to make sure the link flow is kept.
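To double-check that the plugin actually put the tag in place, here is a minimal sketch using only Python's standard library. The sample HTML is made up; in practice you would feed the parser the fetched source of one of your own tag pages:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag it sees."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots.append(a.get("content", ""))

# Made-up sample of a tag archive's head after enabling the noindex option
sample = '<head><title>Tag: widgets</title><meta name="robots" content="noindex, follow" /></head>'
finder = RobotsMetaFinder()
finder.feed(sample)
print(finder.robots)
```

If the list comes back empty, the plugin (or your theme) isn't outputting the tag on that page.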
-
Hi Branagan,
yes, I can set "noindex, follow" in the Yoast plugin, but it seems Google does not always obey this meta tag? See this comment from a user on a blog:
Peter Hinson October 25, 2012 at 7:22 am
Hi, I've been using Yoast SEO for years now, but unfortunately it does not always prevent Google from indexing pagination.
Matt Cutts, the Google technical expert and representative, says in his video that the 'noindex' meta tag is a weak method of trying to prevent Google from indexing a page, since Googlebot will not always obey the tag.
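For what it's worth, the same directive can also be delivered as an HTTP response header; Google documents the X-Robots-Tag header as equivalent to the meta tag, so it is no "stronger", but it is handy as belt-and-braces or when you can't edit the HTML. A sketch for .htaccess, assuming Apache 2.4 with mod_headers enabled:

```apache
# Send the noindex directive as an HTTP header for tag archive URLs.
# Assumes Apache 2.4 (<If> expressions) with mod_headers enabled.
<If "%{REQUEST_URI} =~ m#^/tag/#">
    Header set X-Robots-Tag "noindex, follow"
</If>
```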
-
If you are using similar tags on almost every post, then yes, it will create duplicate content, but if you use them distinctly it won't.
For example, I had a movie site where I used tags for the movie cast; since every movie has a different cast, it never created any duplicate content, and that site was an SEO miracle.
-
The sites that have tag pages ranking usually have on-page problems or a penalty. Instead of the intended page, Google shows the tag page... or the privacy page... or maybe the keyword just isn't competitive and Google has nothing better to put there. I've seen all of these cases.
It's up to you; you can noindex them or whatever you want. As long as you don't go crazy, put 1,000 tags on your posts, and spam links to those tag pages, they shouldn't be a problem. At least in my experience.
-
Make sure it is a "noindex, follow" tag so that you get your link juice back.
-
I guess tags are there because they fulfill a function, but if they aren't useful for your users it shouldn't be a problem to remove them, as long as you take the necessary measures not to end up with tons of nasty 404s.
If what worries you is duplicate content, you could just noindex the tag pages; I think most SEO plugins for WP have this option. Otherwise, it shouldn't be hard to include the noindex tag in tag.php (or whatever file generates those pages). That way you wouldn't have to deprive your users of a useful function (if that's the case, of course).
About your question on Google's preference, I'm not aware of it, but couldn't it be because many internal pages link to the tag pages, and with the same anchor text?
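If you'd rather not touch tag.php directly, a minimal sketch of the same idea via the theme's functions.php might look like this. is_tag() and the wp_head hook are standard WordPress, but the function name is made up and you should test on a staging copy first:

```php
// Minimal sketch: print a robots meta tag on tag archives only.
// Drop into the theme's functions.php; assumes standard WordPress.
function my_noindex_tag_archives() {
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex, follow" />' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_tag_archives' );
```

Note that if an SEO plugin is also printing a robots tag, you'd end up with two; pick one mechanism.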
-
I would get rid of the tags; I don't think users use them, and they are more of a gimmick.
Search engines are more likely to rank your primary page rather than the pages produced by tags.