What's the best way to eliminate duplicate page content caused by blog archives?
-
I (obviously) can't delete the archived pages regardless of how much traffic they do/don't receive.
Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct?
Any other suggestions to alleviate this pesky duplicate page content issue?
-
I think I understand better now.
Use a noindex,follow meta robots tag on the pages you don't want included in the search index.
If you are using WordPress, then you should check out http://yoast.com/wordpress/seo/
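As a sketch, that tag goes in the `<head>` of each archive page template (this is the standard robots meta tag, not specific to any particular blog platform):

```html
<!-- Place inside the <head> of each archive page template. -->
<!-- "noindex" keeps the page out of search results; -->
<!-- "follow" still lets crawlers follow links to the individual posts. -->
<meta name="robots" content="noindex,follow">
```

Unlike robots.txt, this doesn't require access to the root directory, since it lives in the page markup itself.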
-
The hypothetical blog posting I want to have indexed is...
www.example.com/blog/2011/10/19
The first sentence of this blog posting is: "Jim and Janice jumped joyfully to Jackson."
I go out to google and search "Jim and Janice jumped joyfully to Jackson." There are 7 results. The first result is the blog posting I want indexed. The 2nd - 7th results are archive pages from my blog. Let's call one of those archive pages...
So, residing on this archive page are all of my postings from October 2011 including Jim and Janice's. Thus, there appears to be a ton of duplicate content on my site.
If I implement a canonical tag on the archive page, won't this archive page defer to the blog posting I want indexed?
If so, that won't work. I need the blog posting and all the archive pages to remain as is but I don't want the archive pages to be indexed or show up as duplicate content.
Thoughts?
-
-
I agree with James; it's best to implement canonical tags.
-
The best way would be to implement canonical tags on these pages.
Example from Google:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
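As a sketch using the hypothetical URLs from the question above, the archive page would declare the individual post as its canonical version:

```html
<!-- In the <head> of the archive page that duplicates the post's content. -->
<!-- Tells search engines the preferred URL for this content is the individual post. -->
<link rel="canonical" href="http://www.example.com/blog/2011/10/19">
```

Note that this consolidates the archive page's signals into the post rather than merely hiding the archive, so it addresses which URL ranks, not whether the archive remains crawlable.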