Hi Carl,
Several large publications already do this sort of thing, but they have a lot of content of their own to back up the duplicated/blocked content. The largest-scale example is newspapers that syndicate content from other papers, often internationally. I was the SEO on a project like this for a large UK paper, and we blocked the duplicated content's subfolder via robots.txt so that the paper was not re-publishing indexable content from its international sister.
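As a sketch (assuming the syndicated articles live under a /syndicated/ subfolder — adjust the path to match your own site structure), the robots.txt rule would look something like:

```
# Block all crawlers from the syndicated-content subfolder
User-agent: *
Disallow: /syndicated/
```

One caveat: Disallow stops crawling, but it doesn't guarantee de-indexing of URLs Google already knows about, so it's best to have the block in place from day one.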
Your other option is to use the canonical tag to point back to the original version of the content.
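If you go the canonical route instead, each syndicated page would carry a tag in its head pointing at the original article's URL (the domain below is a made-up example):

```html
<link rel="canonical" href="https://www.example-sister-paper.com/original-article/" />
```

Bear in mind that a cross-domain canonical is treated as a hint rather than a directive, so Google may still choose to index your copy.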
Syndication shouldn't be harmful, and if you were doing this on a site that already had plenty of content of its own, it would be normal and fine. What worries me is Google seeing a new site with literally no content to begin with and a large, blocked section. After the Panda update, it's important to present a resource-rich website, even if the site can serve its purpose without that content. For instance, a property search engine I worked on took a huge Panda penalty because all of its articles lived on an articles subdomain rather than on the same subdomain as the "money" part of the site. We had to move the articles over to the main site.
It's not possible for me to say exactly what will happen if you go ahead with this, but I'd strongly advise building out your unique content both before launch and quickly post-launch. It's vital that unique, indexable content be live on the site for it to perform well, even for commercial queries that don't rely on a site having articles.
Cheers,
Jane