Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is content aggregation good SEO?
-
I didn't see this topic specifically addressed here: what's the current thinking on using content aggregation for SEO purposes?
I'll use flavors.me as an example. Flavors.me lets you set up a domain that pulls in content from a variety of services (Twitter, YouTube, Flickr, RSS, etc.). There's also a limited ability to publish unique content as well.
So let's say that we've got MyDomain.com set up, and most of the content is being drawn in from other services. So there's blog posts from WordPress.com, videos from YouTube, a photo gallery from Flickr, etc.
How would Google look at this scenario? Is MyDomain.com simply scraped content from the other (more authoritative) sources? Is the aggregated content perceived to "belong" to MyDomain.com or not? And most importantly, if you're aggregating a lot of content related to Topic X, will this content aggregation help MyDomain.com rank for Topic X?
Looking forward to the community's thoughts. Thanks!
-
Thank you both.
-
To answer your main question, content aggregation is not good SEO.
First, search engines will see scraped, duplicate content. Second, if you add any ads at all, you increase the odds of being viewed as a profit-seeking content farm, which is effectively treated as spam.
Google's Panda update was released specifically to deal with sites offering no-value aggregated content (Penguin, for its part, targets manipulative linking).
If you aren't adding value, don't do it.
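If you do republish material from other sources, you can at least tell search engines where the original lives. A minimal sketch (the URL is a placeholder, not from the thread): a cross-domain canonical on the aggregated copy pointing at the source article.

```html
<!-- In the <head> of the aggregated copy on MyDomain.com:
     point search engines at the original article -->
<link rel="canonical" href="https://originalsite.example/original-post/" />
```

Note that Google treats a cross-domain canonical as a hint rather than a directive, so it doesn't guarantee the aggregated copy won't be indexed.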
OK, all that being said, I do find Flavors.me interesting.
I think it might work as a personal website used to drive traffic through to other channels. Like About.me, but with a bit more oomph!
Provide a solid profile (unique content that offers value) that points to the places you spend your time on the web.
-
Ask the thousands of content-aggregating sites that got hammered by Panda's content updates. Aggregated content is inherently duplicate content. It isn't unique or original, and it has a negative impact on a site's SEO. Now, if the goal isn't SEO but traffic that can be monetized in another way, go for it... but if you want to build solid SEO, STAY AWAY.
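One way to chase the traffic-not-SEO route described above without exposing duplicate pages to search engines is to keep the aggregated sections out of the index entirely. A sketch, assuming the aggregated pages sit on their own URLs (the path structure is hypothetical):

```html
<!-- In the <head> of each aggregated page: keep the duplicate page
     out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

Visitors still see and share the pages; search engines simply never count them as your site's content.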
Hope this helps.
Mark