How to prevent duplicate content in archives?
-
My news site has a number of excerpts in the form of category-based archives, and these are causing duplicate content problems.
Here's an example with the nutrition archive. The articles there already exist as individual posts, so the archive creates the duplicate content.
Should I nofollow/noindex this category page, along with the rest of the category pages and the 2011, 2012, etc. date archives (see archives here)?
Thanks so much for any input!
-
Thanks, Bob. I've begun the switchover -- appreciate the follow-up.
-
Yeah, the Yoast plugin has more advanced options. It also has far fewer annoying ads in the admin panel.
-
Thanks a lot for the answer, Bob. I'm currently using All in One SEO, do you find that Yoast's is much better overall as well?
Appreciated.
-
Install Yoast's WordPress SEO plugin. It has all the options you're looking for in one easy tab.
I'd leave the archives crawlable since you use them in your nav bar. Just make sure you have canonical attributes on your posts. Also, use the "strip category base from URL" option - it makes everything look a bit cleaner and shortens your URLs.
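For reference, here's a minimal sketch of the two tags in play (the URL is a placeholder, and a plugin like Yoast outputs both for you once configured): the robots meta you'd get if you did decide to noindex the archives, and the canonical tag for posts.

```html
<!-- On a category/date archive page, if you choose to noindex it:
     crawlers may still follow the links but won't index the page. -->
<meta name="robots" content="noindex, follow">

<!-- In each single post's <head>: declares the post's preferred URL
     so the archive excerpts don't compete with it. -->
<link rel="canonical" href="https://www.example.com/nutrition/some-article/">
```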
Related Questions
-
Duplicate Content with ADN, DNS and F5 URLs
In my duplicate content report, there are URLs showing as duplicate content. All of the pages work, they do not redirect, and they are used for either IT debugging or as part of a legacy system using a split DNS, QAing the site, etc... They aren't linked (or at least, shouldn't be) on any pages, and I am not seeing them in Search Results, but Moz is picking them up. Should I be worried about duplicate content here and how should I handle them? They are replicates of the current live site, but have different subdomains. We are doing clean up before migrating to a new CMS, so I'm not sure it's worth fixing at this point, or if it is even an issue at all. But should I make sure they are in robots or take any action to address these? Thanks!
Technical SEO | QAD_ERP
-
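If you do want belt-and-braces before the migration, one common approach (a sketch, assuming the duplicate hosts run Apache with mod_headers enabled, applied only in the config for those subdomains) is a host-wide noindex header. Note that a robots.txt Disallow alone stops crawling but doesn't remove URLs that Moz or Google already know about.

```apache
# In the vhost/.htaccess of the QA / legacy subdomains ONLY - never on
# the live site. Tells crawlers not to index anything served from
# these hosts, even if a stray link points at them.
Header set X-Robots-Tag "noindex, nofollow"
```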
Duplicate homepage content across multiple websites
Hi, I work for a company that runs 30+ membership based websites on separate domains and across multiple markets. The homepage for each site contains a section of content that highlights the site benefits and features. While each website serves a different market/niche, this section of content is essentially the same as each website offers the same benefits and features. What is the best way to avoid duplicate content issues while still being able to show the same section of content across 30+ sites? This particular section of content isn't valuable from an SEO perspective, but the rest of the content on that page is. Any ideas or suggestions would be much appreciated. Thanks
Technical SEO | CupidTeam
-
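A boilerplate benefits blurb repeated across your own homepages is rarely a problem by itself, but if you want the safest option, one hedged approach is to keep that block out of the HTML entirely and inject it client-side (the host and path below are placeholders; note Google can render JavaScript, so this reduces rather than eliminates the chance the text is seen as shared).

```html
<!-- Shared benefits block, pulled from one central host at runtime
     instead of being duplicated in 30+ sites' HTML. -->
<div id="site-benefits"></div>
<script>
  fetch('https://assets.example.com/benefits.html')
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('site-benefits').innerHTML = html;
    });
</script>
```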
HTTPS Duplicate Content
My previous host used shared SSL, so my site was also reachable over https, which I hadn't noticed before. Now I've moved to a new server with no SSL, so the https version of my site no longer works. The problem is that Google has indexed my blog http://www.codefear.com under https too, and my traffic keeps dropping, I think because of this duplicate content. There are now two results: one for the http version and another for the https version. I searched the internet and found 3 possible solutions:
1. Noindex the https version
2. Use rel=canonical
3. 301-redirect the https version to http
Now I don't know which solution is best for me, given that the https version no longer works, and I'm not sure how to implement any of them. My blog runs on WordPress. Please help me solve this duplicate issue. Once it's resolved, do I need to send Google a reconsideration request? Thank you
Technical SEO | RaviAhuja
-
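For option 3, a minimal .htaccess sketch (assuming Apache with mod_rewrite). One caveat: this only fires if the server still answers https requests at all; with no certificate installed, browsers will show an error before the rule ever runs.

```apache
# 301 every https request to the http version of the same URL.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.codefear.com/$1 [R=301,L]
```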
Is duplicate content OK if it's on LinkedIn?
Hey everyone, I am doing a duplicate content check using Copyscape, and realized we have used a ton of the same content on LinkedIn as on our website. Should we change the LinkedIn company page to be original? Or does it matter? Thank you!
Technical SEO | jhinchcliffe
-
WordPress Tags Duplicate Content
I just fixed a tags duplicate content issue by noindexing the tag pages. Has anyone else fixed this issue, and how long did it take you to recover from it? Just want to know for peace of mind.
Technical SEO | deaddogdesign
-
How can I prevent duplicate content between www.page.com/ and www.page.com
SEOMoz's recent crawl flagged an error for duplicate content and duplicate page titles. It found the same page twice because of a trailing '/' on one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my .htaccess file to prevent it from happening? Thanks!
Karl
Technical SEO | onlineexpression
-
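A rel=canonical on the page (pointing at whichever version you prefer) is usually enough, but if you want to force a single form server-side, here's a mod_rewrite sketch for .htaccess, assuming Apache. Pick one version (slash or no slash) and use it consistently in your internal links too.

```apache
# 301 any URL missing a trailing slash to the slashed version,
# skipping requests for real files (images, CSS, etc.).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```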
Large Scale Ecommerce: How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-in add-ons, which has left duplicate pages and weird dynamic parameters in the index. The inbound link profile is spread across duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters that need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these - is it that simple? (Do I also need to ensure they are deindexed and 301'd?) Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
Technical SEO | LukeyJamo
-
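On the parameter question: the GWT URL-parameter setting only influences crawling; it doesn't consolidate link signals, so you'd still want a 301 or canonical for the parameter duplicates. A hedged .htaccess sketch (assuming Apache with mod_rewrite; "sessionid" is a made-up parameter name for illustration, and note this variant drops the whole query string, so it only suits URLs where that parameter appears alone):

```apache
# 301 any URL carrying a "sessionid" parameter to the bare path,
# discarding the query string (the trailing "?" does that).
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)sessionid= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```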
Duplicate content across multiple domains
I have come across a situation where we have discovered duplicate content between multiple domains. We have access to each domain, and within the past 2 weeks we added a 301 redirect that dynamically sends each page to the proper page on the desired domain. My question relates to the removal of these pages: there are thousands of these duplicates. I have looked at a number of the cached pages in Google and found that the cached copies are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so the pages are redirected to the proper domain and page? If so, when will that happen? Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait 3 months for the content to be removed, since my assumption is that this content is competing with the main site. I suppose another option would be to include a no-cache meta tag on these pages. Any thoughts or comments would be appreciated.
Technical SEO | jmsobe