Matt Cutts and Curated Content -- something is confusing here...
-
Okay, I read an interview somewhere this week where Matt Cutts said he didn't care much for curated content. Today I searched on that subject and came up with the following video of his:
So, in the video he is going along and saying not to just grab content and repost it. And then at around minute 3:15 he says that, on the other hand, you can have a blog like DaringFireball.net and that's a good thing, because the blogger takes the time to pick and choose what he is posting.
I went to Daring Fireball to take a look, and I saw that he writes maybe one line of commentary, and then pastes in a big chunk of the curated content along with a link to the source.
This shocked me. How could Matt like that blog -- he keeps saying he likes original content, not duplicated, curated content. So, the difference is that a blog can get away with this if it exercises discretion in what it chooses to copy and paste? How the hell would the Google algorithm know what the blogger's intention is?
And here I've been wasting my time writing up paragraphs and paragraphs to precede any excerpts I paste in, in fear of getting hit by Google.
I'd like to hear your comments on this.
-
DaringFireball.net must have very low time-on-site numbers, though. There's nothing to read on that blog, and the links take users away from the site. It's just a weird example for him to choose.
-
@ Bizzer - I don't disagree with what you're saying. The issue is more complex, and isolating one factor (even a major factor such as duplicate content issues) is often very difficult to do if you are comparing small sites with very large ones such as Mashable or HuffPo. Mr. Cutts has avoided answering whether non-analytics data about time on site is a ranking factor. I believe it is. Many other factors favor larger "high authority" sites. Even if you select better material and make more useful editorial comments about it (as evidenced by better time on site), G is going to favor the larger sites.
-
Yes, that's the difference, but how does Google know unless they take the time to read each site?
In a blog of mine, I take care to write my own take on various stories that appear on other sites, only sometimes even pasting in an excerpt. I discriminate in my selections. So is Matt Cutts saying that I could save a ton of time by just writing a sentence of commentary and then pasting in an excerpt? I guess so. (Yes, I know my blog is better for readers if I do it as I am now; that's not the issue.)
Still, he's contradicting himself on duplicate content. And I still think it's hilarious (and confusing) that Cutts used DaringFireball as an example of how to curate. I'd at least have expected him to pick, as an example, a site that writes a couple of paragraphs, then an excerpt, and then maybe a closing paragraph.
-
I think the difference is between a site that just regurgitates content indiscriminately vs. one that has staff or editors who pick through the best of the content and then re-post it.
Yahoo reposts Mashable stuff all day long, and the Huffington Post built their world off of regurgitated content. What's more, most news sites pull content from the Associated Press; each organization then decides what to tell its viewers about, or spins it to fit their political or company agendas.
In this sense, a news corporation or website is actively discriminating in what it posts and wishes to tell its viewers about.
That's about the only thing I can make of it.