Is Syndicated (Duplicate) Content considered Fresh Content?
-
Hi all,
I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in many cases producing duplicate content) - would this be considered fresh content on an individual domain?
An example may clearly show what I'm after:
domain1.com is a lawyer in Seattle.
domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor), so fresh content is needed on each domain. But if that content is duplicated, does it still hold the same value?
Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicated across multiple domains?
Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com. And both sets of customers would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as authorities, and their websites would be better positioned to rank well.
We aren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just using it as a way to really understand content marketing outside of SEO.
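For anyone unfamiliar with the canonical URLs mentioned above: a cross-domain canonical is how a site reusing an article can point Google at the original copy. A minimal sketch (the domains and path here are hypothetical, borrowed from the example in the question):

```html
<!-- In the <head> of the syndicated copy on domain2.com,
     pointing Google at the original article on domain1.com -->
<link rel="canonical" href="https://domain1.com/blog/hiring-a-lawyer-guide" />
```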
Conclusion: IF duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (which obviously won't rank) still help SEO across a domain? This may sound controversial, and I'd like an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain.
TL;DR version: Is duplicate content (the same article across multiple domains) considered fresh content on an individual domain?
Thanks so much,
Cole
-
Hi all,
Thanks for the responses & feedback.
Alan, in this example, the fresh content would be relevant. Of course there are search queries that don't need freshness or updates, but I would argue most do (even the ones we think we know the answer to can change over time). Once again, the conversation is not about RANKING that page but about HELPING the domain achieve "freshness & relevance" around a topic with that duplicate content.
Would love to see others chime in.
Thanks,
Cole
-
Well, that could mean that some queries don't need any freshness.
For example:
Q: Who discovered Australia? A: Captain Cook.
This does not need freshness. Also consider original content - in that case, an older timestamp would be better.
I like to think that I own Google and ask myself, "Would I rank it?" Of course some things may rank that were not intended to, but I think it's quite safe to think that way.
-
This was the part that prompted my question:
"Google Fellow Amit Singhal explains that "Different searches have different freshness needs."
The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query."
-
I had a quick look at that page and did not see that it affects all pages. Anyhow, Google said 35% of queries, so it could not be all pages.
Some points:
- Why would fresh data be excluded from duplicate content rules?
- Is it likely that syndicated data is fresh?
- What is Google trying to do here - rank syndicated duplicate data?
I can't see it working.
-
Thanks a lot! You've made me realize I really should read more about this update. This might be off topic, but what's your view on freshness applied to **all** pages? In this Whiteboard Friday it's stated that it only impacts the terms you describe:
http://moz.com/blog/googles-freshness-update-whiteboard-friday
But in this blog post from that time (before the summary), it's stated that it's applied to all pages but affects search queries in different ways:
-
Yes - the freshness update was not for all queries. It was for certain queries that need fresh content, such as football scores or who's on the team this week. Obviously we don't want the score from last year or who was playing last year; we want the current data. That is where the freshness update may give you a boost while your content is fresh. I can't see syndicated content falling into this category, and even if it did, being duplicate content would mean that only one source is going to rank.
Also, you have to look at indexing: will the duplicate content even be indexed? If so, how often?
That's why I say the short answer is no.
-
Hi Alan,
Is there any source, or research of your own, that can back up this answer?
Would love to read more about this subject!
-
Short answer, NO
-
Thanks for your feedback Mike - definitely helpful!
In this hypothetical, we're looking at research or comprehensive articles for specific niches that could serve multiple businesses well as an authority.
Thanks,
Cole
-
Hi Cole,
Fresh by Google (if not noindexed) in this case would be kind of like the freshness value of a "fresh" error.
Maybe that's extreme, but the point is that the content is not needed by the web, since it already exists. If there were absolutely nothing else being added to or changed about the site and my one option was adding duplicate content, I'd noindex/follow it and figure I might have gotten some small, small, small benefit from updating the site a little - maybe an improved user signal. I'd keep it out of the index for sure. That's how I'd do it, if it had some value for visitors. If its only value was adding something fresh and it wasn't that great for visitors, I'd find the extra hour necessary to re-write it into something fresh, unique, and valued by visitors.
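For reference, the noindex/follow setup described here is just a robots meta tag in the page's head:

```html
<!-- Keep this page out of Google's index while still
     letting crawlers follow the links on it -->
<meta name="robots" content="noindex, follow" />
```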
The other thing about syndicated content is that even after you check where else it appears on the web via an exact-phrase search in Google, you may not have seen every instance of it, as it may keep spreading. Having duplicate content indexed alongside other, possibly low-quality sites may put you in a bad neighborhood of sites with common content. If I had a ten-foot pole, I wouldn't touch it with it.
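As an aside, the kind of duplication check behind that exact-phrase search can be roughed out programmatically. Here's a minimal sketch using Python's difflib to score how similar two article texts are - an illustration only (this is not how Google measures duplication, and the sample strings are made up):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a rough 0..1 similarity ratio between two texts,
    compared word by word, ignoring case."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

original = "Five things every business owner should know about hiring a lawyer in Seattle."
syndicated = "Five things every business owner should know about hiring a lawyer in New York."

print(f"{similarity(original, syndicated):.2f}")  # prints 0.89
```

A score near 1.0 flags a near-duplicate; lightly localized syndicated copies like the pair above still score very high, which is exactly the problem being discussed.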
I hope that helps. Best... Mike
-
Hi Mike,
Thanks for the feedback. That was one potential point I was making.
I'm still curious whether duplicate content would be considered "fresh" within a website. Good point about the duplicate content overriding the benefit of the fresh content.
Thanks,
Cole
-
In phrasing the question as "is it considered fresh/unique," I'm going to assume you mean by Google, for the site's organic benefit. So I guess the reasoning would be: is the fact that it's fresh to the site a bigger positive than the negative of duplicate content? Is that what you're getting at? Personally, knowingly on-boarding duplicate content would be too big a potential negative for me to consider doing it. I've done it as a noindex/follow for reasons other than Google, but not for some mystery freshness bump.
Not that you can't find examples of duplicate content ranking in more than one place. But to me, on-boarding indexed duplicate content seems like asking for trouble.
Hope that helps. Best... Mike
-
I'm curious to see what others have to say on this, but I've always assumed that "fresh" and "unique" go hand in hand when it comes to website content. Therefore, duplicate content would not be fresh content.