Matt Cutts and Curated Content -- something is confusing here...
-
Okay, I read an interview somewhere this week where Matt Cutts said he didn't care much for curated content. Today I searched on that subject and came up with a video of his on the topic.
So, in the video he is going along and saying not to just grab content and repost it. And then at around minute 3:15 he says that, on the other hand, you can have a blog like DaringFireball.net and that's a good thing, because the blogger takes the time to pick and choose what he is posting.
I went to Daring Fireball to take a look, and I saw that he writes maybe one line of commentary, and then pastes in a big chunk of the curated content along with a link to the source.
This shocked me. How could Matt like that blog -- he keeps saying that he likes original content, not duplicated, curated content. So, the difference is that a blog can get away with this if it exercises discretion in what it chooses to copy and paste? How the hell would the Google algorithm know what the blogger's intention is?
And here I've been wasting my time writing up paragraphs and paragraphs to precede any excerpts I paste in, in fear of getting hit by Google.
I'd like to hear your comments on this.
-
DaringFireball.net must have very low time on site numbers though. There's nothing to read on that blog, and links take the user away from the site. It's just a weird example that he chose.
-
@ Bizzer - I don't disagree with what you're saying. The issue is more complex, and isolating one factor (even a major factor such as duplicate content) is often very difficult to do if you are comparing small sites with very large ones such as Mashable or HuffPo. Mr. Cutts has avoided answering whether non-analytics data about time on site is a ranking factor. I believe it is. Many other factors favor larger "high authority" sites. Even if you select better material and make more useful editorial comments about it (as evidenced by better time on site), Google is going to favor the larger sites.
-
Yes, that's the difference, but how does Google know unless they take the time to read each site?
In a blog of mine, I'm taking care to write my own take on various stories that appear on other sites, only sometimes even pasting in an excerpt. I'm discriminating in my selections. So, instead, is Matt Cutts saying that I could save a ton of time by just writing a sentence of commentary and then pasting in an excerpt? I guess so. (Yes, I know my blog is better for readers if I do it as I am now; that's not the issue.)
Still, he's contradicting himself on duplicate content. And I still think it's hilarious (and confusing) that Cutts used Daring Fireball as an example of how to curate. I'd have thought he would at least pick as an example a site that writes a couple of paragraphs, then an excerpt, and then maybe a closing paragraph.
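For what it's worth, Google doesn't need to "read" each site like a human to spot copy-and-paste. Their actual methods aren't public, but a textbook way to flag verbatim duplication mechanically is w-shingling with Jaccard similarity -- compare overlapping word windows between two texts. This is purely an illustrative sketch (the example strings and the 4-word window size are my own assumptions, not anything Google has confirmed):

```python
# Illustrative only: Google's duplicate-detection methods are not public.
# This sketches w-shingling + Jaccard similarity, a classic way to score
# how much of one text is copied verbatim from another.

def shingles(text, w=4):
    """Return the set of w-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

source = "Matt Cutts said curation works when editors pick and choose what to post."
copy = "Matt Cutts said curation works when editors pick and choose what to post."
rewrite = "According to Matt Cutts, curation is fine if an editor chooses posts carefully."

print(jaccard(source, copy))     # 1.0 -- verbatim duplicate
print(jaccard(source, rewrite))  # 0.0 -- same idea, different wording
```

So a pasted excerpt scores as a duplicate no matter how carefully it was selected -- which is exactly why the "discretion" distinction is confusing. Whatever signal rewards Daring Fireball, it can't just be text overlap.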
-
I think the difference is between a site that just regurgitates content indiscriminately and one that has staff or editors who pick through the content for the best of it and then re-post it.
Yahoo reposts Mashable stuff all day long, and the Huffington Post built its empire on regurgitated content. What's more, most news sites pull content from the Associated Press; each organization then decides what to tell its viewers about, or spins it to fit its political or company agenda.
In this sense, a news organization or website is actively discriminating in what it posts and chooses to tell its viewers about.
That's about the only thing I can make of it.