WordPress: Tags generate duplicate content - just delete the tags!?
-
When I ask people, they say tags are bad and spammy, and as far as I can see they generate all of my duplicate page content issues.
So the big question is: why does Google so often prefer to show these tag URLs in the SERPs? It can't be that bad, can it? :)))
Then, after some research, I found the "Term Optimizer" on Yoast.com, which should help with exactly this problem, but it no longer seems to be available.
So maybe there is another plugin that can help... or should I just delete all the tags from my blog and set up permanent redirects?
Is this the solution? -
I don't understand the question. Perhaps submit a new question to create a new thread, and elaborate on the issue.
-
It should take care of the duplicate content issue as it relates to tag pages, but it will not show as a 404, because the page will still be there; it is just removed from Google's index.
If you adjusted the settings in Yoast, they should apply to all future posts and tags.
-
My blog provides articles for technical queries and repair options. Many keywords call for the same sort of steps, and we are running into problems now.
CASE 1: We replaced the keywords and created the same content for all posts.
CASE 2: To close the SEO loophole above, I added tags for related steps that follow the same troubleshooting.
Now the tags are bothering me again.
What should I do?
-
Hi,
I also have this problem: our site has 50 duplicate pages recorded in Moz. If I set noindex, follow in Yoast, will it create 404s, or will I only get a 404 if I remove the tags completely?
Also, is there a way to set this up in Yoast for all future blog posts, or will I need to do it every time I publish a new post?
Thanks
Jeh
-
I said the same thing! I couldn't figure out why it wasn't the default either. We've had the blog up since before I joined my current company, so there are a bunch of duplicate pages I'm trying to fix now... frustrating!
Best of luck,
Tyler
-
Hi Tyler,
yes, I ended up simply checking the "noindex, follow" option for tags within the Yoast plugin (Taxonomies tab).
Now all the duplicate content issues are gone... so why is this option not set by default? Hmmm.
Thanks
Holger -
Hey!
I was just looking into this same issue myself, and I figured I'd share the URL from Yoast that ended up answering most of my questions. Section 3 gives you some solutions to the duplicate page issue.
Hope this helps,
Tyler
-
Does that validate?
I'm no expert, but I think the trailing slash shouldn't be used on meta elements unless the declared DocType is XML or XHTML. If you do use it, leave a space just before it, as in your rel="canonical" line, for compatibility's sake (an old concern, but it won't hurt).
And shouldn't there be a space between the comma and "follow" in "noindex, follow"?
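For what it's worth, crawlers treat the robots meta content value as a comma-separated list of directives and ignore whitespace around the tokens, so the space after the comma is purely cosmetic. A minimal sketch of that parsing rule:

```python
def parse_robots(content):
    """Split a robots meta content attribute into its directives.
    Whitespace around the comma-separated tokens is ignored, so
    "noindex,follow" and "noindex, follow" are equivalent."""
    return [token.strip().lower() for token in content.split(",") if token.strip()]

# Both spellings yield the same directives.
assert parse_robots("noindex,follow") == parse_robots("noindex, follow")
print(parse_robots("noindex, follow"))  # ['noindex', 'follow']
```

So either spelling works; the spaced version is just easier to read.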
-
Yes, thanks,
I'm curious... and I hope it works. And yes, it looks fine...
<title>Christo | INLINEAR Digital Marketing & Brasilien Blog</title>
-
Yes, that happens sometimes. Just make sure the plugin has implemented the tags correctly and that the head of your pages doesn't contain any other misplaced meta tags or invalid code, since that could cause the issue.
Then give it a try; it usually works. You'll have to wait until the site is re-crawled, though, so give it some time. If it still doesn't work, you could take more drastic measures.
Good luck, and follow Alan's advice of using 'noindex, follow' on your tag pages if you want to make sure of keeping the link flow.
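One way to double-check that the plugin actually emitted the robots tag into your page head is to parse the HTML yourself. A minimal sketch with Python's standard library (the sample HTML below is made up for illustration, not taken from any site in this thread):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

# Hypothetical tag-archive page head for demonstration.
sample = """<html><head>
<title>Tag archive</title>
<meta name="robots" content="noindex, follow" />
</head><body></body></html>"""

finder = RobotsMetaFinder()
finder.feed(sample)
print(finder.directives)  # ['noindex, follow']
```

In practice you would fetch the live tag page's HTML and feed it in the same way; an empty result means the plugin never wrote the tag.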
-
Hi Branagan,
yes, I can set noindex, follow in the Yoast plugin, but it seems that Google does not always obey this meta tag? See this comment from a user on a blog:
Peter Hinson October 25, 2012 at 7:22 am
Hi, I've been using Yoast's SEO plugin for years now, but unfortunately it does not always prevent Google from indexing pagination.
Matt Cutts, the Google technical expert and representative, says in his video that the 'noindex' meta tag is a weak method of preventing Google from indexing a page, since Googlebot will not always obey it.
-
If you are using similar tags on almost every post, then yes, it will create duplicate content; but if each tag is used distinctly, it won't.
For example, I had a movie site where I used tags for the movie cast. Since every movie has a different cast, it never created any duplicate content; that site was an SEO miracle.
-
The sites that have tag pages ranking usually have on-page problems or a penalty. Instead of the intended page, Google shows the tag page... or the privacy page... or maybe the keyword is just not competitive and Google has nothing better to put there. I've seen all of these cases.
It's up to you: you can noindex them or do whatever you want. As long as you don't go crazy, put 1,000 tags on a post, and spam links to those tag pages, they shouldn't be a problem. At least they haven't been in my case.
-
Make sure it is a noindex, follow tag so that you get your link juice back.
-
I guess tags are there because they fulfill a function, but if they aren't useful for your users, it shouldn't be a problem to remove them, as long as you take the necessary measures to avoid tons of nasty 404s. If what worries you is duplicate content, though, you could just noindex the tag pages; I think most SEO plugins for WP have this option. Otherwise, it shouldn't be hard to include the noindex tag in tag.php (or whatever file generates those pages), and then you wouldn't have to deprive your users of a useful function (if that is indeed the case, of course).
As for your question about Google's preference, I'm not aware of it, but couldn't it be because many internal pages link to the tag pages, and with the same anchor text?
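If you do erase the tag pages, the old URLs can be 301-redirected rather than left as 404s. A minimal sketch of generating Apache redirect rules for removed tag archives (the `/tag/` path and the `/blog/` target below are hypothetical examples, not taken from your site):

```python
def htaccess_redirects(tag_slugs, target="/blog/"):
    """Build Apache .htaccess 301 rules for removed WordPress tag pages.
    Each old tag URL is permanently redirected to a sensible target,
    e.g. the blog home or a related category."""
    lines = []
    for slug in tag_slugs:
        lines.append(f"Redirect 301 /tag/{slug}/ {target}")
    return "\n".join(lines)

print(htaccess_redirects(["seo", "wordpress"]))
# Redirect 301 /tag/seo/ /blog/
# Redirect 301 /tag/wordpress/ /blog/
```

Redirecting each tag to its most closely related post or category, rather than a single catch-all target, preserves more relevance for visitors and crawlers alike.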
-
I would get rid of the tags; I don't think users use them. They are more of a gimmick.
Search engines are more likely to rank your primary page rather than the pages produced by tags.