WordPress: Tags generate duplicate content - just delete the tags!?
-
When I ask people, they say tags are bad and spammy, and as far as I can see they generate all of my duplicate page content issues.
So the big question is: why does Google so often prefer to show these tag URLs in the SERPs... so they can't be that bad! :)))?
Then, after some research, I found the "Term Optimizer" on Yoast.com... it should help with exactly this problem, but it no longer seems to be available?
So maybe there is another plugin that can help... or should I just delete all tags from my blog and set up permanent (301) redirects?
Is this the solution? -
I don't understand the question. Perhaps submit a new question to create a new thread, and elaborate on the issue.
-
It should take care of the duplicate content issue as it relates to tag pages, but it will not show as a 404, because the page will still be there; it is just removed from Google's index.
If you adjusted the setting in Yoast, it should apply to all future posts and tags.
-
My blog provides articles for technical queries and repair options. Many keywords require the same sort of steps to describe, and now we have run into problems.
CASE 1: We replaced the keywords and created essentially the same content for all posts.
CASE 2: To close the SEO loophole above, I added tags to the relevant steps that follow the same troubleshooting.
Now the tags are causing problems again.
What should I do?
-
Hi,
I also have this problem with our site: Moz is reporting 50 duplicate pages. If I set "noindex, follow" in Yoast, will it create 404s, or will I only get 404s if I remove the tags completely?
Also, is there a way to set this up for all future blog posts in Yoast, or will I need to do it every time I publish a new post?
Thanks
Jeh
-
I said the same thing! I couldn't figure out why it wasn't the default either. We've had the blog up since before I worked for my current company, and so there were a bunch of duplicate pages I'm trying to fix now... frustrating!
Best of luck,
Tyler
-
Hi Tyler,
yes, I simply ended up checking the "noindex, follow" option for Tags within the Yoast plugin (Taxonomies tab).
Now all the duplicate content issues are gone... so why isn't this option set by default? hmmm
Thanks
Holger -
Hey!
I was just looking into this same issue for myself, and I figured I'd share the URL from Yoast that ended up answering most of my questions. It's in section 3, which gives you some solutions to the duplicate page issue.
Hope this helps,
Tyler
-
Does that validate?
I'm no expert, but I think the trailing slash shouldn't be used with meta elements, except maybe if the declared DocType is XML or XHTML? If you do, leave a space just before it, like in your rel canonical line, for compatibility's sake (old stuff but just in case, it won't hurt).
And shouldn't there be a space between the comma and "follow" in "noindex, follow"?
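For reference, this is roughly what I'd expect those elements to look like (example.com is just a placeholder):

<meta name="robots" content="noindex, follow">   <!-- plain HTML: no trailing slash needed -->
<meta name="robots" content="noindex, follow" /> <!-- XHTML-style: self-closing, with a space before the slash -->
<link rel="canonical" href="http://www.example.com/" />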
-
Yes, thanks,
I'm curious... and hope it works. And yes, it looks fine...
<title>Christo | INLINEAR Digital Marketing & Brasilien Blog</title>
-
Yes, that happens sometimes. Just make sure the plugin has implemented the tags correctly and that the head of your pages doesn't contain any other misplaced meta tags or invalid code, since that could cause the issue.
Then give it a try; it usually works, but you'll have to wait until the site is re-crawled, so give it some time. If it still doesn't work, you could take more drastic measures.
Good luck, and follow Alan's advice of using "noindex, follow" in your meta tag if you want to make sure you keep the link flow.
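If the meta tag alone ever seems to be ignored, one of those more drastic measures could be sending the same directive as an HTTP response header (X-Robots-Tag). Just a rough sketch for a WordPress theme's functions.php, assuming no plugin is already doing this; the function name is only an example:

function example_tag_xrobots_header() {
    // Send the noindex directive as a response header for tag archives,
    // in addition to (or instead of) the meta tag in the page head.
    if ( is_tag() && ! headers_sent() ) {
        header( 'X-Robots-Tag: noindex, follow' );
    }
}
add_action( 'template_redirect', 'example_tag_xrobots_header' );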
-
Hi Branagan,
yes, I can set noindex, follow in the Yoast plugin, but it seems that Google does not always obey this meta tag? See this comment from a user on a blog:
Peter Hinson October 25, 2012 at 7:22 am
Hi, I've been using Yoast SEO for years now, but unfortunately it does not always prevent Google from indexing pagination.
Matt Cutts, the Google technical expert and representative, says in his video that the "noindex" meta tag is a weak method of trying to prevent Google from indexing a page, since Googlebot will not always obey this tag.
-
If you are using similar tags for almost every post, yes, it will create duplicate content, but if each tag is used for genuinely different posts, it won't create duplicate content.
For example, I had a movie site where I used tags for the movie cast; since every movie has a different cast, the tags never created any duplicate content, and that site was an SEO miracle.
-
The sites that have tag pages ranking usually have on-page problems or a penalty. Instead of the intended page, Google shows the tag page... or the privacy page... or maybe the keyword is just not competitive and Google has nothing better to put there. I've seen all of these cases.
It's up to you; you can noindex them or do whatever you want. As long as you don't go crazy, put 1,000 tags on your posts, and spam links to those tag pages, they shouldn't be a problem. At least that's been my experience.
-
Make sure it is a "noindex, follow" tag so that you keep the link juice flowing.
-
I guess tags are there because they fulfill a function, but if they aren't useful for your users it shouldn't be a problem to delete them, as long as you take the necessary measures not to end up with tons of nasty 404s. If what worries you is duplicate content, though, you could just noindex the tag pages. I think most SEO plugins for WP have this option; otherwise it shouldn't be hard to add the noindex tag in tag.php, or whatever template generates those pages. That way you wouldn't have to deprive your users of a useful function (if that's the case, of course).
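If you go the template route, a minimal sketch of what that could look like (instead of editing tag.php directly, this hooks into wp_head from your theme's functions.php; the function name is just an example, and it assumes no SEO plugin is already handling it):

function example_noindex_tag_archives() {
    // Mark tag archive pages as noindex, follow: they drop out of the
    // index but still pass link equity to the posts they link to.
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
}
add_action( 'wp_head', 'example_noindex_tag_archives' );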
As for your question about Google's preference, I'm not sure why that happens, but couldn't it be because many internal pages link to the tag pages, all with the same anchor text?
-
I would get rid of the tags. I don't think users use them; they are more of a gimmick.
Search engines will then be more likely to rank your primary pages rather than the pages produced by tags.
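If you do remove them, as the original question suggests, it's worth 301-redirecting the old tag URLs so visitors and crawlers don't hit 404s. A rough sketch, assuming the tags still exist in WordPress and you just want to retire their archive pages; the function name is only an example, and a redirect plugin or a server rule could do the same job:

function example_redirect_tag_archives() {
    // Permanently redirect requests for tag archives to the homepage
    // instead of serving the duplicate tag pages.
    if ( is_tag() ) {
        wp_redirect( home_url( '/' ), 301 );
        exit;
    }
}
add_action( 'template_redirect', 'example_redirect_tag_archives' );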
Related Questions
-
Canonical Tags for Legacy Duplicate Content
I've got a lot of duplicate pages, especially products; some are new, but most have been like this for a long time, up to several years. Does it make sense to use a canonical tag pointing to one master page for each product? Each page is slightly different, with a different feature and maybe a sentence or two that is unique, but everything else is the same.
Technical SEO | AmberHanson -
Subdomain Severe Duplicate Content Issue
Hi, a subdomain for our admin site has been indexed, and it has caused over 2,000 instances of duplicate content. To fix this issue, is a 301 redirect or a canonical tag the best option? http://www.example.com/services http://admin.example.com/services Really appreciate your advice. J
Technical SEO | Metricly-Marketing -
Duplicate Content - Products
When running a report, it says we have lots of duplicate content. We are an e-commerce site with about 45,000 SKUs. Products can be in multiple departments on the site, so the same products can show up on different pages. Because of this, the reports show multiple products with duplicate content. Is this an issue with Google and site ranking? Is there a way to get around this issue?
Technical SEO | shoedog -
How different does content need to be to avoid a duplicate content penalty?
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as another page (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate penalty?
Technical SEO | WayneBlankenbeckler -
Duplicate Content on Product Pages
Hello, I'm currently working on two sites and I had some general questions about duplicate content. On the first one, each page is for a different location, but the wording is identical on each; i.e., it says Instant Remote Support for Critical Issues, Same Day Onsite Support with a 3-4 hour response time, etc. Would I get penalized for this? Another question I have is, we offer antivirus support for providers, i.e. Norton, AVG, Bit Defender, etc. I was wondering if we will get penalized for having the same first paragraph with only the name of the antivirus provider changing on each page? My last question is, we provide services for multiple cities and towns in various states. Will I get penalized for having the same content on each page, such as the towns and the products and services we provide? Thanks.
Technical SEO | ilyaelbert -
Does turning website content into PDFs for document sharing sites cause duplicate content?
Website content is 9 tutorials published at unique URLs, with a contents page linking to each lesson. If I make a PDF version for distribution on document sharing websites, will it create a duplicate content issue? The objective is to get a half-decent link and traffic to supplementary opt-in downloads.
Technical SEO | designquotes -
I have a ton of "duplicate content" and "duplicate titles" on my website; any solutions?
Hi, and thanks in advance. I have a JomSocial site with 1,000 users; it is highly customized, and as a result of the customization some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already, and the crawl report shows a lot of duplicate content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots.txt file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links: http://anxietysocialnet.com/profile/edit-profile/salocharly http://anxietysocialnet.com/salocharly/profile http://anxietysocialnet.com/profile/preferences/salocharly http://anxietysocialnet.com/profile/salocharly http://anxietysocialnet.com/profile/privacy/salocharly http://anxietysocialnet.com/profile/edit-details/salocharly http://anxietysocialnet.com/profile/change-profile-picture/salocharly So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots.txt so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | Salocharly -
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight the verse 5 and verses 5-10 respectively. This is a good user experience because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another bible site has separate html pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea realizing that we can't revert back to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO