Duplicated content in news portal: should we use noindex?
-
Hello,
We have a news portal, and like other newspapers we publish both our own content and content from other contributors. Both our content and our contributors' content can also be found on other websites (we sell our content, and contributors give us theirs). In this regard, everything seems to work fine from a business and user perspective.
The problem is that this means duplicated content... so my question is: should we add the "noindex,nofollow" tag to these articles? Note that there can be hundreds of such articles every day, roughly a third of the website.
I checked one newspaper that uses news from agencies, but it does not seem to use any noindex tag. I am not sure what others do.
I would appreciate any opinion on that.
-
For a news portal, duplicate content is unavoidable (unless you make up your own news, which has actually been known to do well...).
If you are selling articles, the buyers will tag them for their own websites. If they leave them set to "index, follow" and point their own canonical at their copy (common, in my experience), be aware that they can outrank you for your own content if their site has more authority. And having the same content on many sites with conflicting canonicals is probably not going to be worth much SEO-wise for any of them.
For articles that are given to you, use a canonical pointing to the originating site to give it credit for creating the material. This won't get you search traffic for those pages, but readers on your site still have the content right at their fingertips and don't have to go to another site to read it. I also tend to think that noindex-nofollowing a substantial fraction of your site could raise red flags.
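For reference, a cross-domain canonical is just a link element in the head of the syndicated article pointing at the contributor's original URL. A minimal sketch, with made-up URLs for illustration:

<head>
  <title>Syndicated article headline</title>
  <!-- Cross-domain canonical: tells search engines the contributor's page is the original -->
  <link rel="canonical" href="https://contributor-site.example/original-article/">
</head>

Each syndicated article would point at its own source URL, while your original articles keep a self-referencing canonical (or none).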
The assumption here is that the duplication is done simply as a convenience to your readers. If you are doing it to increase your rankings, it probably won't work. Excellent original content should stay on your own site and not be sold.
-
My advice is the following:
1. Check how much traffic is coming to this section; you can do this with a landing page analysis in Google Analytics or whatever tracking you use.
If you are getting a decent amount of traffic from these articles, even if it's long tail, I would think of another strategy before slapping on a noindex, because when you do, that traffic will go.
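For clarity, the tag in question is a robots meta tag in the head of each article page. A minimal sketch of what it looks like (shown only so the trade-off is concrete, not as a recommendation):

<head>
  <title>Syndicated article headline</title>
  <!-- Tells crawlers not to index this page and not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>

The same directive can also be sent as an X-Robots-Tag HTTP header, which can be easier to apply to a whole section of a site at once.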
I have dealt with a similar situation for a news website in the past. What many of the big syndication players do is let duplicated content rank in Google News for 30-60 days and then 404 the page. I have seen this numerous times, though I do not know how viable the strategy is overall.
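If the goal is time-limited indexing rather than serving 404s, Google also documents an unavailable_after robots directive, which asks for a page to be dropped from results after a given date. A sketch, assuming an ISO 8601 date is acceptable (worth checking Google's current documentation before relying on it):

<head>
  <title>Syndicated article headline</title>
  <!-- Ask Google to stop showing this page in search results after the given date -->
  <meta name="robots" content="unavailable_after: 2025-12-31">
</head>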
I've also noticed some news websites play around with canonical tags with various partners on duplicated content, and yes, they also do some noindexing.
Really research this before you implement it. I have done a bit of news SEO for Australian sites; it's an interesting area with limited information online.
Related Questions
-
Javascript tabbed navigation and duplicate content
I'm working on a site that has four primary navigation links, and under each is a tabbed navigation system for second-tier items. The primary link page loads content for all tabs, which are JavaScript-controlled. Users click the primary navigation item "Our Difference" (http://www.holidaytreefarm.com/content.cfm/Our-Difference) and have several options, with each tab's content in a separate section. Each second-tier tab is also available via sitemap/direct link (i.e. http://www.holidaytreefarm.com/content.cfm/Our-Difference/Tree-Logistics) without the JS navigation, so the content on that page is specific to the tab, not all tabs. In this scenario, will there be duplicate content issues? And what is the best way to remedy this? Thanks for your help!
Technical SEO | Total-Design-Shop
-
Duplicate Content on WordPress.com
Hi Mozers! I have a client with a blog on wordpress.com: http://newsfromtshirts.wordpress.com/ It just had a ranking drop because of a new Panda update, and I know it's a duplicate content problem. There are 3,900 duplicate pages, basically because there is no use of noindex or canonical tags, so archive and category pages are fully indexed by Google. If I could install my usual SEO plugin, this would be a piece of cake, but since WordPress.com is a closed environment, I can't. How can I put a noindex on all category, archive, and author pages on WordPress.com? I think this could be done by writing a nice robots.txt, but I am not sure about the syntax I should use to achieve that. Thank you very much, DoMiSol Rossini
Technical SEO | DoMiSoL
-
Multiple Sites Duplicate Content Best Practice
Hi there, I have one client (atlantawidgets.com) who has a main site but also has duplicate sites on different URLs targeting specific geo areas, e.g. widgetmakersinmarietta.com. 1) Would it be best to create a static home page on these additional sites and make the rest of each site noindexed? 2) Or should I allow more pages to be indexed and change the content? If so, how many: 3, 5, 8? I don't have tons of time at this point. 3) If I change content within the duplicate sites, what percentage do I need to change? Does switching the order of the sentences count, or does it need to be 100% fresh? Thanks everyone.
Technical SEO | greenhornet77
-
Duplicate Content Issue
SEOmoz is giving me a number of duplicate content warnings related to pages that have an "email a friend" and/or "email me when back in stock" version of the page. I thought I had those blocked via my robots.txt file, which contains the following:
Disallow: /EmailaFriend.asp
Disallow: /Email_Me_When_Back_In_Stock.asp
I had thought that the robots.txt file would solve this issue. Anyone have any ideas?
Technical SEO | WaterSkis.com
-
Duplicate content due to csref
Hi, when I go through my pages, I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis of my site. Of course I get important insight through my csref codes, but I'm quite uncertain how much this affects my SEO results. Does anyone have any insight on this? Should I be more cautious about using csref codes, or doesn't it create problems big enough for me to worry about?
Technical SEO | Petersen11
-
Duplicate Content on Navigation Structures
Hello SEOMoz Team, My organization is making a push to have a seamless navigation across all of its domains. Each of the domains publishes distinctly different content about various subjects. We want each of the domains to have its own separate identity as viewed by Google. It has been suggested internally that we keep the exact same navigation structure (40-50 links in the header) across the header of each of our 15 domains to ensure "unity" among all of the sites. Will this create a problem with duplicate content in the form of the menu structure, and will this cause Google to not consider the domains as being separate from each other? Thanks, Richard Robbins
Technical SEO | LDS-SEO
-
How to resolve this Duplicate content?
Hi, there is a page I reach through normal menu navigation, Caratlane.com > jewellery > rings > casual rings: http://www.caratlane.com/jewellery/rings/casual-rings/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html. When I do a site search in my search box for my product code "JR00219", the same page appears with a different URL: http://www.caratlane.com/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html. So there is duplicate content. How can we resolve it? Regards, kathir, caratlane.com
Technical SEO | kathiravan
-
Tags causing Duplicate page content?
I was looking through the 'Duplicate Page Content' and 'Too Many On-Page Links' errors, and they all seem to be linked to the 'Tags' on my blog pages. Is this really a problem, and if so, how should I be using tags properly to get the best SEO rewards?
Technical SEO | zapprabbit