Duplicate content + WordPress tags
-
According to the SEOmoz platform, one of my WordPress websites has duplicate content issues because of the tags I use. How should I fix this? Is it a legitimate fix to remove tag links from the post pages?
-
William
I wouldn't do that - just make sure to noindex tags, and also noindex the subpages of archives. Categories can stay indexed; it's usually the subpages that cause issues (i.e. /page/2/ etc.).
-Dan
-
Hey There
I wrote a post that walks through how to decide what to do with tags. Essentially, here's what I would do:
(This assumes you have Yoast SEO for your SEO plugin).
- Noindex tags by default (if you're not on Yoast, there's a rough code sketch after this list).
- Then set only the tags that actually receive traffic to index (this is the part you need Yoast for).
- Don't overlap your categories and tags (keep them distinct).
- Minimize tag usage in general: a few per post, but not many more.
- You can use them for navigation (like a tag cloud) if you think your readers will find them useful.
- Don't include them in your XML sitemap.
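If you're not running Yoast (or you'd rather handle it in code), here's a rough sketch of how the noindex part could be done with WordPress core's wp_robots filter (available since WordPress 5.7). This isn't from the original setup - just one possible way to noindex tag archives and paginated archive pages from a theme's functions.php or a small plugin:

<?php
// Rough sketch, not the poster's setup: noindex tag archives and paginated
// archive pages via the core wp_robots filter (WordPress 5.7+).
// Yoast handles this for you; this is only for DIY configurations.
add_filter( 'wp_robots', function ( array $robots ) {
    // Tag archives (/tag/...) and any paginated archive page (/page/2/ etc.)
    if ( is_tag() || ( is_archive() && is_paged() ) ) {
        $robots['noindex'] = true;
        $robots['follow']  = true; // still let crawlers follow links on the page
    }
    return $robots;
} );

With the booleans set this way, WordPress outputs a "noindex, follow" meta robots tag on those templates, which matches the "noindex the subpages of archives" advice above.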
Hope that helps!
I also did this post on setting up WordPress for the Moz blog which you may find helpful.
-Dan
-
I also have the same problem with categories and tags.
Should I add www.site.com/categories/ to the robots.txt or is that a bad idea?
-
I also wanted to mention that you might want to read this post. Cyrus Shepard suggested it when I was asking a similar question, and I think it really helps:
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
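For what it's worth, the robots.txt rule being asked about would look something like the snippet below (purely illustrative, reusing the /categories/ path from the question above):

# Blocks crawling of /categories/ for all bots - it does NOT guarantee de-indexing
User-agent: *
Disallow: /categories/

The catch, and the point of the article above, is that robots.txt only blocks crawling: a blocked URL can still end up indexed if other pages link to it, and crawlers can't see a noindex tag on a page they're not allowed to fetch. A meta robots noindex on the category/tag templates is usually the safer fix.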
-
If those tag pages are bringing in a good amount of traffic, then noindexing them would be a bad idea - you'll lose that traffic, because noindex means search engines will drop your tag pages from their index.
For any site there are two main types of taxonomy navigation:
- through categories
- through tags
Most people recommend using only one as the primary navigation and de-indexing the other, since indexing both can cause duplicate content. Some experts, like Yoast.com, recommend de-indexing both, which is the rule I'm following now.
So that's the trade-off: if you're getting more traffic through tags, de-index categories, or vice versa.
If you need to keep both indexed, I'd wait for suggestions from other Pro members.
-
I'm dealing with similar issues on another platform (as are many others, I'm sure). I would think twice before deleting them, especially if you're getting traffic from them. You have to weigh the advantages and disadvantages and accept that it will probably never be "perfect"... something I personally have a hard time coming to terms with!
If you aren't using excerpts (and are instead showing entire articles on index pages), switching to excerpts has helped immensely in reducing duplicate content on a couple of websites I've worked on.
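As a rough illustration (not the poster's actual code), in a WordPress theme the difference usually comes down to which template tag the archive/index loop calls: the_excerpt() prints a short summary, while the_content() prints the full article and ends up duplicating the single-post page.

<?php
// Hypothetical archive/index template loop: excerpts keep listing pages
// from repeating the full text of every post they link to.
if ( have_posts() ) :
    while ( have_posts() ) : the_post();
        the_title( '<h2>', '</h2>' );
        the_excerpt(); // summary only; the_content() would print the whole article
    endwhile;
endif;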
-
Thank you for your reply. Currently I'm receiving a good amount of traffic to some www.domain.com/tag/blah-blah pages. Will those pages be harmed if I set tags to noindex?
-
Hi Giankar,
If tags are not the primary way of navigating your blog, then you can remove them entirely (I mean delete them). Otherwise, you can noindex your tags so they don't cause any duplicate content issues.
I hope this helps!
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this coding to the pages <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer "So as far as I can see we've added robots to prevent the issue but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how its seeing this content or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
-
Image centric site and duplicate content issues
We have a site that has very little text; the main purpose of the site is to allow users to find inspiration through images. Thousands of images come to us each week to be processed by our editorial team, so as part of our process we select a subset of the best images and process those with titles, alt text, tags, etc. We still host the other images, and users can find them through galleries that link to the processed and unprocessed image pages. Due to the lack of information on the unprocessed images, we are having lots of duplicate content issues (the layout of all the image pages is the same, and there isn't any unique text to differentiate the pages; the only changing factor is the image itself on each page). Any suggestions on how to resolve this issue will be greatly appreciated.
-
Duplicate Content?
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-feb2000, archive3-mar2000, etc.) But, I see they are being marked as "similar content." Even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it? Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
-
Localized domains and duplicate content
Hey guys, In my company we are launching a new website and there's an issue that has been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same, with the difference of the homepage. They have a slightly different proposition. Using GeoIP I will redirect the UK traffic to ABC.co.uk and the rest of the traffic will still visit the .com website. May Google penalize this? The site itself will be almost the same except for the homepage. This may count as duplicate content even if I'm geo-targeting different regions so they will never overlap. Thanks in advance for your advice
-
Avoiding Cannibalism and Duplication with content
Hi, For the example I will use a computers e-commerce store... I'm working on creating guides for the store:
- How to choose a laptop
- How to choose a desktop
I believe that each guide will be great on its own and that it answers a specific question (meaning that someone looking for a laptop will search specifically for laptop info, and the same goes for desktop). This is why I didn't create a "How to choose a computer" guide. I also want each guide to have all the information and not start sending the user to secondary pages to fill in missing info. However, even though several details differ between laptops and desktops, like the importance of weight, screen size, etc., a lot of the checklist (like deciding on how much memory is needed, graphics card, core, etc.) is the same. Please advise on how to pursue this. Should I just write two guides and make sure that the same duplicated content ideas are simply written in a different way?
-
Duplicate Content in Wordpress.com
Hi Mozers! I have a client with a blog on wordpress.com. http://newsfromtshirts.wordpress.com/ It just had a ranking drop because of a new Panda update, and I know it's a duplicate content problem. There are 3,900 duplicate pages, basically because there is no use of noindex or canonical tags, so archive and category pages are totally indexed by Google. If I could install my usual SEO plugin, that would be a piece of cake, but since Wordpress.com is a closed environment I can't. How can I put a noindex on all category, archive and author pages in wordpress.com? I think this could be done by writing a nice robots.txt, but I am not sure about the syntax I should use to achieve that. Thank you very much, DoMiSol Rossini
-
Question about duplicate content in crawl reports
Okay, this one's a doozie: My crawl report is listing all of these as separate URLs with identical duplicate content issues, even though they are all the home page and the one that is http://www.ccisolutions.com (the preferred URL) has a canonical tag of rel= http://www.ccisolutions.com: http://www.ccisolutions.com http://ccisolutions.com http://www.ccisolutions.com/StoreFront/IAFDispatcher?iafAction=showMain I will add that OSE is recognizing that there is a 301-redirect on http://ccisolutions.com, but the duplicate content report doesn't seem to recognize the redirect. Also, every single one of our 404-error pages (we have set up a custom 404 page) is being identified as having duplicate content. The duplicate content on all of them is identical. Where do I even begin sorting this out? Any suggestions on how/why this is happening? Thanks!
-
Duplicate Content Caused By Blog Filters
We are getting some duplicate content warnings based on our blog. Canonical URL's can work for some of the pages, but most of the duplicate content is caused by blog posts appearing on more than 1 URL. What is the best way to fix this?