WordPress Duplicate Content Issues
-
Everyone knows that WordPress has some duplicate content issues with tags, archive pages, category pages etc...
My question is, how do you handle these issues?
Is the smart strategy to use the robots meta tag and noindex/nofollow category pages, archive pages, tag pages, etc.?
By doing this, are you missing out on the additional internal links to your important pages from your category pages and tag pages?
I hope this makes sense.
Regards,
Bill
-
Hey Bill
I like to start with this standard setup (image/chart from my WordPress post on Moz):
Pages, Posts, Categories - Index
Tags, Dated Archives, Subpages, Author Archives - noindex
You can check out the full post - I will be updating the Yoast screenshots very soon!
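For illustration, that setup boils down to a page-type-to-directive lookup. A minimal Python sketch follows; the page-type names and the helper function are hypothetical illustrations, not part of Yoast's or any other plugin's API (in Yoast these are per-type toggles in the settings screen):

```python
# Hypothetical mapping of WordPress page types to robots directives,
# following the "index Pages/Posts/Categories, noindex the rest" setup.
ROBOTS_SETTINGS = {
    "page": "index, follow",
    "post": "index, follow",
    "category": "index, follow",
    "tag": "noindex, follow",
    "dated_archive": "noindex, follow",
    "subpage": "noindex, follow",
    "author_archive": "noindex, follow",
}

def robots_meta_tag(page_type: str) -> str:
    """Return the robots meta tag for a WordPress page type,
    defaulting to indexable when the type is unknown."""
    content = ROBOTS_SETTINGS.get(page_type, "index, follow")
    return f'<meta name="robots" content="{content}">'
```

Note the noindexed types still use `follow`, so link equity keeps flowing through those pages even though they stay out of the index.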
-Dan
-
Thanks for the article.
Now, two years on, are there any important updates for preventing duplicate content/titles?
-
Most of the SEO plugins for WordPress use canonical URLs.
-
Unless I'm missing something here, wouldn't it be easier to set the canonical tag for the main post? There are also plugins like SEO Ultimate that handle this automatically.
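For reference, a canonical tag is just a link element in the page head pointing at the preferred URL, which plugins like Yoast and SEO Ultimate output automatically. Here is a rough sketch of the idea (the helper is illustrative, not any plugin's actual code); it strips the query string and fragment so parameter variants of a page collapse to one address:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link(url: str) -> str:
    """Build a rel=canonical link element for the preferred version
    of a URL, stripping the query string and fragment so parameter
    variants of the same page all point at one address."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}">'
```

With this approach the duplicate variants can stay crawlable; search engines are simply told which URL should receive the credit.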
-
I posted an article I wrote the other day for someone asking a similar question.
With the Yoast SEO plugin I noindex everything except categories. You can see how I set mine up under section 3, Indexation.
Here is the original question that Sha submitted:
http://www.seomoz.org/q/what-is-with-wordpress-dupe-issues
-
Bill,
There are several SEO plugins available for WordPress that will handle these issues. Yes, you are right that adding "noindex" will be beneficial on tag, category, and archive pages. The idea here is avoiding duplicate content issues. BTW, check out Yoast SEO for WordPress.
Here is how the values for the robots meta tag work:
- noindex will keep a page out of the search index (the page can still be crawled)
- nofollow will tell search engines not to follow the links on that page
I agree with noindexing these pages, though I would leave nofollow out: if these pages have any link equity, you want it to flow through to the other links on the page.
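As a rough illustration of how these two directives combine, here is a simplified sketch of parsing a robots meta content string; real crawlers support more directives (noarchive, nosnippet, etc.), and this helper is only an illustration of the two discussed above:

```python
def parse_robots(content: str) -> tuple:
    """Split a robots meta content string into (indexable, followable).
    Both default to True; 'noindex' and 'nofollow' switch them off."""
    tokens = {token.strip().lower() for token in content.split(",")}
    return ("noindex" not in tokens, "nofollow" not in tokens)
```

The recommended "noindex, follow" setting thus keeps a page out of the index while still letting its links pass value.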
-
The WP on my blog is set up as follows (this is a blog that gets between four and ten short posts per day - about two to four sentences, each post linking to an article or other content on a topic-related website):
Homepage: Full text of the most recent 25 posts are displayed. Pagination pages are not indexed (blocked by robots.txt).
Post Pages: Full text is displayed and the title plus a few words of 20 related posts are displayed.
Category Pages: I have over 100 categories and each post is placed into at least two categories (one by location and one by topic). Some posts go into three or four categories - sometimes more. Each category page displays the full text of the most recent 25 posts. Categories do not have pagination pages (blocked by robots.txt).
All of the above pages are fully indexed and a long list of category pages appears in the left-side navigation. I don't use tag pages or archive pages. There is a lot of dupe content in this system but so far I am lucky that it does not cause a problem. The category pages pull a lot of organic search traffic.
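For anyone wanting to replicate this, the robots.txt blocking described above might look roughly like the following. The paths are illustrative and depend on your permalink settings, and note that Google honors the `*` wildcard but not every crawler does:

```
User-agent: *
# block homepage pagination, e.g. /page/2/
Disallow: /page/
# block category pagination, e.g. /category/news/page/3/
Disallow: /category/*/page/
```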
In January of each year I delete all of the posts that are over a year old. Before doing that, I identify those that are pulling reasonable traffic and either redirect them to a permanent page about the same topic, write an article about that topic and redirect, or recycle the post. All the rest are redirected to the homepage of the blog.
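There are several ways to implement that yearly redirect cleanup; one sketch, using Apache's mod_alias in .htaccess (all paths here are hypothetical examples, not the blog's real URLs):

```
# Old post that pulled traffic -> permanent page on the same topic
Redirect 301 /2012/03/some-old-post/ /topics/some-topic/
# A deleted post with no traffic -> blog homepage
Redirect 301 /2012/05/another-old-post/ /
```

Using 301s rather than deleting outright preserves any link equity the old posts earned.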
Related Questions
-
How to stop /tag creating duplicate content - WordPress
Hi, I keep getting alerts for duplicate content. It seems WordPress is creating it through /tag pages: https://www.curveball-media.co.uk/tag/cipr/ https://www.curveball-media.co.uk/tag/pr-agencies/ Is it something in the way we've got WordPress set up?
-
Duplicate content warning for a hierarchy structure?
I have a series of pages on my website organized in a hierarchy, let's simplify it to say parent pages and child pages. Each of the child pages has product listings, and an introduction at the top (along with an image) explaining their importance, why they're grouped together, providing related information, etc.
The parent page has a list of all of its child pages and a copy of their introductions next to the child page's title and image thumbnail. Moz is throwing up duplicate content warnings for all of these pages. Is this an actual SEO issue, or is the warning being overzealous?
Each child page has tons of its own content, and each parent page has the introductions from a bunch of child pages, so any single introduction is never the only content on the page. Thanks in advance!
-
Tired of finding solutions for duplicate content.
My site was just scanned by SEOmoz and lots of duplicate content and titles were found. Well, I am tired of finding solutions for duplicate content on a shopping site's product category pages. You can see the screenshot below. http://i.imgur.com/TXPretv.png You can see below that every link is showing "items_per_page=64, 128 etc.". This happens in every category I created. I am already using a canonical add-on to avoid this problem, but it's still there. You can check my domain here - http://www.plugnbuy.com/computer-software/pc-security/antivirus-internet-security/ - and see if the add-on is working correctly. I recently submitted my sitemap to GWT, so that's why it's not showing me any report regarding duplicate issues. Please help me!
-
Content and url duplication?
One of the campaign tools flags one of my client's sites as having lots of duplicates. This is true in the sense that the content is sort of boilerplate, but with the wording changed for different countries. The same goes for the URLs: they differ only in that a couple of words have changed. So it's not a case of a CMS or server issue, as SEOmoz advises; it doesn't need 301s! The thing is, in this niche (freight, transport operators, shipping) I can see many other sites doing the same thing, and those sites have lots of similar pages ranking very well. In fact one site has over 300 keywords ranked on pages 1-2, but it is a large site with a 12-year-old domain, which clearly helps. Of course having unique content on every page is important; still, I suppose it is better than copy-and-paste from other sites, so it's unique in that sense. I'm hoping to convince the site owner to change the content over time for every country - a long process. My biggest problem with understanding duplication issues is that every tabloid or broadsheet media website would be canned from Google, as quite often they scrape Reuters or re-publish standard press releases on their sites as newsworthy content. So I have great doubt that there is a penalty for it. You only have to look and you can see media sites' duplication everywhere, every day, but they get ranked. I just think that Google doesn't rank the worst cases of spammy duplication (they still get indexed, though, I notice). So considering that sites in this business niche replicate very much the same content layout and still rank well, is this duplicate flag such a great worry? Many businesses sell the same service to many locations, and it's virtually impossible to rewrite the services in a dozen or so different ways.
-
Duplicate content due to csref
Hi, When I go through my pages, I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis of my pages. Of course I get important knowledge through my csref codes, but I'm quite uncertain how much this affects my SEO results. Does anyone have any insights on this? Should I be more cautious about using csref codes, or doesn't it create problems big enough for me to worry about?
-
Standard Responses Causing Duplication Issues
Hi guys, We have a Q&A section on our site where we reply to customers using standard responses which have already been approved. This is causing a lot of duplication errors; however, due to the nature of our business we need to use these responses. Is there anything that we can do to stop this? Matthew
-
What are some of the negative effects of having duplicate content from other sites?
This could include republishing several articles from another site with permission.
-
Why are my pages getting duplicate content errors?
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page: http://www.mapsalive.com/Features/audio.aspx http://www.mapsalive.com/Features/Audio.aspx The only difference is the capitalization. We don't have two versions of the page, so I don't understand what I'm missing or how to correct this. Anyone have any thoughts on what to look for?