Is Syndicated (Duplicate) Content considered Fresh Content?
-
Hi all,
I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers & I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in many cases producing duplicate content) - would this be considered fresh content on an individual domain?
An example may clearly show what I'm after:
domain1.com is a lawyer in Seattle.
domain2.com is a lawyer in New York.
Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor). Therefore, fresh content is needed on their domains. But if that content is duplicate, does it still hold the same value?
Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicated across multiple domains?
Purpose: domain1.com may benefit from a resource for his/her local clientele, just as domain2.com would. And both customer bases would be reading the "duplicate content" for the first time. Therefore, both lawyers would be seen as authorities & improve their websites' ability to rank well.
We aren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just using it as a means to really understand content marketing outside of SEO.
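For anyone unfamiliar, here's a minimal sketch of the cross-domain canonical we're referring to (the article path is hypothetical, just for illustration):

```html
<!-- Placed in the <head> of the syndicated copy on domain2.com,
     pointing search engines at the hypothetical original article -->
<link rel="canonical" href="https://domain1.com/blog/hiring-a-lawyer/" />
```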
Conclusion: IF duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial & I'd welcome an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain.
TLDR version: Is duplicate content (same article across multiple domains) considered fresh content on an individual domain?
Thanks so much,
Cole
-
Hi all,
Thanks for the responses & feedback.
Alan, in this example, the fresh content would be relevant. Of course there are search queries that don't need freshness or updates, but I would argue most do (even the ones we think we know the answer to can change over time).
Once again, the conversation is not about RANKING that page but about HELPING the domain achieve "freshness & relevance" around a topic with that duplicate content.
Would love to see others chime in.
Thanks,
Cole
-
Well, that could mean that some queries don't need any freshness.
For example:
Q: Who discovered Australia? A: Captain Cook.
This does not need freshness. Also, consider original content; in that case, an older timestamp may actually be better.
I like to think that I own Google and ask myself, "Would I rank it?" Of course, some things may rank that weren't intended to, but I think it's quite safe to think that way.
-
This was the part that caught my attention:
"Google Fellow Amit Singhal explains that “Dif__ferent searches have different freshness needs.”
The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query."
-
I had a quick look at that page and didn't see that it affects all pages. Anyhow, Google said 35% of queries, so it could not be all pages.
Some points:
- Why would fresh data be excluded from duplicate content treatment?
- Is it likely that syndicated data is fresh?
- What is Google trying to do here, rank syndicated duplicate data?
I can't see it working.
-
Thanks a lot! This kinda made me realize I really should read some more about this update. Might be off topic, but what's your view on freshness applied to **all** pages? In this Whiteboard Friday it's stated that it only impacts the kinds of terms you describe:
http://moz.com/blog/googles-freshness-update-whiteboard-friday
But in this blog post from that time (before the summary at the end), it's stated that it's applied to all pages but affects search queries in different ways:
-
Yes, the freshness update was not for all queries; it was for certain queries that need fresh content, such as football scores or who's on the team this week. Obviously we don't want the score from last year or who was playing last year; we want the current data. That is where the freshness update may give you a boost while your content is fresh. I can't see syndicated content falling into this category, and even if it did, being duplicate content would mean that only one source is going to rank.
Also, you have to look at indexing: will the duplicate content even be indexed? And if so, how often?
That's why I say the short answer is no.
-
Hi Alan,
Is there any source, or research of your own, that can back up this answer?
Would love to read more about this subject!
-
Short answer: NO.
-
Thanks for your feedback Mike - definitely helpful!
In this hypothetical, we're looking at research or comprehensive articles for specific niches that could serve multiple businesses well as authority resources.
Thanks,
Cole
-
Hi Cole,
Fresh in Google's eyes (if not noindexed) in this case would be kind of like the freshness value of a "fresh" error.
Maybe that's extreme, but the point being, the content is not needed by the web, since it already exists. If there were absolutely nothing else being added to or changed about the site and my one option were adding duplicate content, I'd noindex/follow it and figure I might have gotten some small, small, small benefit from updating the site a little, maybe an improved user signal. I'd for sure keep it out of the index. I guess that's how I'd do it, if it had some value for visitors. If its only value was adding something fresh and it wasn't that great for visitors, I'd find the extra hour necessary to re-write it into something fresh, unique, and valued by visitors.
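To be concrete, here's a minimal sketch of the noindex/follow I'm describing; it goes in the page's `<head>`, and this is purely illustrative:

```html
<!-- Ask crawlers to keep this page out of the index
     but still follow the links on it -->
<meta name="robots" content="noindex,follow">
```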
The other thing about syndicated content is that even after you check where else it appears on the web via an exact-phrase search in Google, you may not have seen every instance of it, as its footprint may evolve over time. Having duplicate content indexed alongside other sites of possibly low quality may put you in a bad neighborhood of sites with common content. If I had a ten-foot pole, I wouldn't touch it with it.
I hope that helps. Best... Mike
-
Hi Mike,
Thanks for the feedback. That was one potential point I was making.
I'm still curious whether duplicate content would be considered "fresh" within a website. Good point about the duplicate content issue overriding the benefit of fresh content.
Thanks,
Cole
-
In phrasing the question as "is it considered fresh/unique," I'm going to assume you mean by Google, for the site's organic benefit. So I guess the reasoning would be: is the fact that it's fresh to the site a bigger positive than the negative of duplicate content? Is that what you're getting at? Personally, knowingly on-boarding duplicate content would be too big of a potential negative for me to consider doing it. I've done it as a noindex/follow for reasons other than Google, but not for some mystery freshness bump.
Not that you can't find examples of duplicate content ranking in more than one place. To me, on-boarding indexed duplicate content seems like just asking for trouble.
Hope that helps. Best... Mike
-
I'm curious to see what others have to say on this, but I've always assumed that "fresh" and "unique" go hand in hand when it comes to website content. Therefore, duplicate content would not be fresh content.