Duplicate content + WordPress tags
-
According to the SEOmoz platform, one of my WordPress websites has duplicate content issues because of the tags I use. How should I fix it? Is it advisable to remove tag links from the post pages?
-
William
I wouldn't do that - just make sure to noindex tags, and also noindex subpages of archives. Categories can stay indexed; it's usually the subpages that cause issues (i.e. /page/2/, etc.).
-Dan
-
Hey there,
I wrote a post which walks through how to decide what to do with tags. Essentially, here's what I would do:
(This assumes you use Yoast SEO as your SEO plugin.)
- Noindex tags by default.
- Set only the ones receiving traffic to index (this is the part you need Yoast for).
- Don't overlap your categories and tags (keep them distinct).
- Minimize tag usage in general: use a few on each post, but not many more.
- Use them for navigation (like a tag cloud) if you think your readers will find them useful.
- Do not include them in your XML sitemap.
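For reference, "noindex" here just means the tag archive's `<head>` carries a robots meta directive. With a setup like the above, a noindexed tag archive should emit something along these lines (the URL is made up, and the exact output varies by plugin version):

```html
<!-- On a noindexed tag archive, e.g. example.com/tag/widgets/ -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```

The "follow" part keeps crawlers following the links on the page even though the page itself stays out of the index.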
Hope that helps!
I also wrote a post on setting up WordPress for the Moz blog, which you may find helpful.
-Dan
-
I have the same problem with categories and tags.
Should I add www.site.com/categories/ to robots.txt, or is that a bad idea?
-
I also wanted to mention that you might want to read this post. Cyrus Shepard suggested it when I was asking a similar question, and I think it really helps:
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
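As a quick illustration of why that post warns against this approach (the path below is hypothetical): a robots.txt Disallow only blocks crawling, it does not remove pages from the index. Google can still index a blocked URL from external links, and it can never see a noindex tag on a page it isn't allowed to crawl.

```text
# robots.txt approach (risky for duplicate content):
# crawling is blocked, but the URLs can still be indexed from
# external links, and Google can never see a noindex directive
# on pages it cannot crawl.
User-agent: *
Disallow: /categories/
```

A noindex meta robots tag (as the other answers suggest) lets Google crawl the page, see the directive, and drop the page from the index.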
-
If you're receiving a good amount of traffic through your tags, then noindexing them would be a bad idea, because you would lose that traffic. I think you understand what noindex means: search engines will not index your tag pages.
For any site there are two types of navigation:
- through categories
- through tags
Most people recommend using only one as the primary navigation and de-indexing the other, since indexing both can cause duplicate content. Some experts, like Yoast.com, recommend de-indexing both, which is the rule I'm following now.
So, that's what happens. If you feel you're getting more traffic through tags, then you should de-index categories, or vice versa.
If you need to index both, then I recommend waiting for suggestions from other pro members.
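One way to ground that decision in data: export your landing-page traffic report from analytics and total visits by the first URL path segment. A minimal Python sketch (the data format, function name, and URLs are my own, hypothetical choices):

```python
from urllib.parse import urlparse

def taxonomy_traffic(rows):
    """Total visits by first URL path segment (e.g. 'tag' vs 'category').

    rows: iterable of (landing_page_url, visits) pairs, as you might
    export from an analytics landing-page report.
    """
    totals = {}
    for url, visits in rows:
        # '/tag/widgets/' -> 'tag', '/category/news/' -> 'category'
        segment = urlparse(url).path.strip("/").split("/")[0]
        totals[segment] = totals.get(segment, 0) + visits
    return totals
```

If "tag" dominates the totals, noindexing tags wholesale would be the riskier move; if "category" dominates, the opposite.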
-
I'm dealing with similar issues on another platform (as are many others, I'm sure). I would think twice before deleting them, especially if you are getting traffic from them. You have to weigh the advantages and disadvantages and realize it will probably never be "perfect"... something I personally have a hard time coming to terms with!
If you aren't using excerpts (and are instead showing entire articles on index pages), switching to excerpts has helped immensely in reducing duplicate content on a couple of websites I've worked on.
-
Thank you for your reply. Currently I'm receiving a good amount of traffic to some www.domain.com/tag/blah-blah pages. Will those pages be harmed if I make tags noindex?
-
Hi Giankar,
If tags are not the primary means of navigation for your blog, then you can remove them entirely (I mean delete them). Otherwise, you can noindex your tags so that they don't cause duplicate content issues.
I hope this helps!
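Whichever route you take, it's worth spot-checking that the change actually took effect on your tag archives. A minimal, standard-library Python sketch (the helper names are my own) that detects a robots noindex directive in a page's HTML:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.directives.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page declares a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Fetch a tag archive (e.g. with urllib) and pass its HTML in; if `has_noindex` returns False after you've set tags to noindex, the setting hasn't taken effect.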