What's the best way to eliminate duplicate page content caused by blog archives?
-
I (obviously) can't delete the archived pages regardless of how much traffic they do/don't receive.
Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct?
Any other suggestions to alleviate this pesky duplicate page content issue?
-
I think I understand better now.
Use a noindex, follow meta robots tag on the content you don't want included in the search index.
If you are using WordPress, then you should check out http://yoast.com/wordpress/seo/
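For reference, this is just a meta robots tag placed in the head of the archive templates. A minimal sketch, assuming a monthly archive page (the title shown is made up for illustration):

<head>
  <title>Blog Archive: October 2011</title>
  <!-- Ask search engines not to index this archive page, but still crawl and follow its links -->
  <meta name="robots" content="noindex, follow">
</head>

With this in place the archive pages stay live for visitors but should drop out of the index over time; plugins like the Yoast one linked above can typically add this tag to archive pages without editing templates by hand.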
-
The hypothetical blog posting I want to have indexed is...
www.example.com/blog/2011/10/19
The first sentence of this blog posting is: "Jim and Janice jumped joyfully to Jackson."
I go out to Google and search "Jim and Janice jumped joyfully to Jackson." There are 7 results. The first result is the blog posting I want indexed. The 2nd-7th results are archive pages from my blog. Let's call one of those archive pages...
So, residing on this archive page are all of my postings from October 2011, including Jim and Janice's. Thus, there appears to be a ton of duplicate content on my site.
If I implement a canonical tag on the archive page, won't that just point search engines from the archive page to the blog posting I want indexed?
If so, that won't work. I need the blog posting and all the archive pages to remain as is, but I don't want the archive pages to be indexed or to show up as duplicate content.
Thoughts?
-
I agree with James; it's best to implement canonical tags.
-
The best way would be to implement canonical tags on these pages.
Example from Google:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
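For illustration, a canonical tag is a single link element in the head of the page you consider the secondary version, pointing at the URL you want treated as the primary one. A rough sketch using the hypothetical URLs from this thread:

<head>
  <!-- On the duplicate/secondary page: tell search engines which URL is the preferred version -->
  <link rel="canonical" href="http://www.example.com/blog/2011/10/19">
</head>

The tag goes on the page you don't want ranked as the primary copy, and the href points at the page you do.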
-
Related Questions
-
Best way to handle URLs of the to-be-translated pages on a multilingual site
Dear Moz community, I have a multilingual site, and there are pages with content that is supposed to be translated but for now is English only. The structure of the site is such that different languages have their own virtual subdirectories: domain.com/en/page1.html for English, domain.com/fr/page1.html for French, and so on. Obviously, if page1.html is not translated, the URLs point to the same content and I get warnings about duplicate content. I see two ways to handle this situation: 1) break the naming scheme and link to the original English pages, i.e. instead of domain.com/fr/index.html linking to domain.com/fr/page1.html, have it link to domain.com/en/page1.html; or 2) leave the naming scheme intact and set up a 301 redirect so that /fr/page1.html redirects to /en/page1.html. Is there any difference between the two methods from an SEO standpoint? Thanks.
Technical SEO | | Lomar0 -
WordPress Duplicate Content Caused By Categories
Hello, We have a WordPress blog that has around 250 categories. Due to our platform we have a hierarchy structure for 3 separate stores, for example iPhone > Apps > Books. Placing a blog post in the Books category automatically places it into the iPhone and iPhone/Apps categories, causing 3 instances of any blog post in this category. Is this an issue? I have seen 2 schools of thought on categories: 1) index, follow and 2) noindex, follow. I know some of our categories get indexed, but with so many, maybe it is better to noindex them. We also considered reducing our categories to 10 to 12 and using tags to provide the indexed site navigation, as follows: Reviews (category); iPhone Book App, iPhone App Store (tags). But this seems a little redundant? Anyone want to take this on? Thank you, Mike
Technical SEO | | crazymikesapps10 -
The word 'shop' in a page title
I'm reworking most of the page titles on our site and I'm considering the use of the word 'Shop' before a product category, e.g. Shop 'keyword' | Brand Name, as opposed to just using the keyword sans 'Shop.' Some of the keywords are very generic, especially for a top-level category page. Question: Is the word 'Shop' damaging my SEO efforts in any way?
Technical SEO | | rhoadesjohn0 -
Would Google Call These Pages Duplicate Content?
Our Web store, http://www.audiobooksonline.com/index.html, has struggled with duplicate content issues for some time. One aspect of duplicate content is a page like this: http://www.audiobooksonline.com/out-of-publication-audio-books-book-audiobook-audiobooks.html. When an audio book title goes out of publication, we keep its page at our store and display this out-of-publication page whenever a visitor attempts to visit a specific title that is OOP. There are several thousand OOP pages. Would Google consider these OOP pages duplicate content?
Technical SEO | | lbohen0 -
Found a Typo in URL, what's the best practice to fix it?
WordPress 3.4, Yoast, Multisite. The URL is supposed to be "www.myexample.com/great-site" but I just found that it's "www.myexample.com/gre-atsite". It is a relatively new site, but we already pointed several internal links to "www.myexample.com/gre-atsite". What's the best practice to correct this? Which option is more desirable?
1. Creating a new page: I found that Yoast has a "301 redirect" option in the Advanced tab. Can I just create a new page (exact same page), put noindex, nofollow on the old URL, and redirect it to http://www.myexample.com/great-site?
2. An .htaccess redirect rule: simply change the URL to http://www.myexample.com/great-site, update it, and add
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^gre-atsite$ http://www.myexample.com/great-site [R=301,L]
Technical SEO | | joony2008 -
What is the best practice to handle duplicate content?
I have several large sections that SEOmoz is indicating have duplicate content, even though the content is not identical. For example, the Leather Passport section has: Leather Passports - Black, Leather Passports - Blue, Leather Passports - Tan, etc. Each of the items has good content, but it is identical, since they are the same products. What is the best practice here? 1. Have only one product with a drop-down (the fear is that this is not best for the customer). 2. Make up content to have them sound different? 3. Put a nofollow on the passport section? 4. Use a rel canonical even though the sections are technically not identical? Thanks!
Technical SEO | | trophycentraltrophiesandawards0 -
Duplicate Content - Home Page even with Mod Rewrite 301
Hi, It looks like SEOmoz (and Screaming Frog) is showing my home page as duplicate content:
http://www.mydomain.com - Page Authority 61, Linking Root Domains 321
http://www.mydomain.com/ - Page Authority 61, Linking Root Domains 321
Screaming Frog shows the duplicates as:
www.mydomain.com/
www.mydomain.com/index.html
Years ago I hired someone to write the code for a rewrite so that the non-www version is 301 redirected to the www version. I was surprised to find out that I still have a problem. Here is the code on my .htaccess page:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www.mydomain.com [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [L,R=301]
</IfModule>
Was this code not properly written? One more question: we were hit hard by Panda and Penguin; would something like this be that much of a factor? Thanks in advance, Force7
Technical SEO | | Force7 -
How can I resolve Duplicate Page Content?
Hello, I have created one campaign in the SEOmoz tools for my website AutoDreams.it and I have found 159 pages with duplicate content. My problem is that this web site is about car ads, so it is easy to create pages with duplicate content, and car ads are placed by registered users. How can I resolve this problem? Regards, Francesco
Technical SEO | | francesco870