OK, this is just what I've done, and it might not work for everyone.
As far as I can tell, the duplicate content warnings do not hurt my rankings. When I first signed up for SEOMoz they really alarmed me. If they are hurting my rankings, it's not by much - we perform well for many competitive keywords in our industry, and our website traffic has been growing ~20% year over year for many years now.
The fix for the auto-generated duplicate content on our site (a problem I inherited when I started at my company) would be very expensive. It's something I plan to do eventually along with some other overhauls, but right now it's not in the budget, because it would basically mean re-architecting how the site and databases work on the back end (ugh).
So, to help mitigate any issues and keep Google from indexing all the duplicate content our system can generate, I use the "URL Parameters" setting in Google Webmaster Tools (under Site Configuration). I've set up a few parameters for Google to specifically NOT INDEX, which keeps the duplicate content out of the search engine. I've also set some parameters to specifically reinforce content I do want indexed, and I include the original content in sitemaps I've curated myself, rather than relying on auto-generated sitemaps that could be polluted with duplicate content.
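To give a concrete picture of the curated-sitemap part: say the system spits out URLs like /products?id=42&sessionid=abc123&sort=price, where sessionid and sort just re-serve the same page. A small script along these lines is roughly the idea - strip the parameters that don't change content, de-duplicate, and write only the clean URLs to the sitemap. This is just a sketch; the parameter names (sessionid, sort, ref) and example URLs are made up for illustration, not our actual setup.

    # Rough sketch: build a hand-curated sitemap from canonical URLs only.
    # The parameter names and URLs below are hypothetical examples.
    from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

    # Parameters that don't change page content (example values)
    DUPLICATE_PARAMS = {"sessionid", "sort", "ref"}

    def canonicalize(url):
        """Strip duplicate-content parameters, keeping meaningful ones."""
        parts = urlsplit(url)
        query = [(k, v) for k, v in parse_qsl(parts.query)
                 if k not in DUPLICATE_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc, parts.path,
                           urlencode(query), ""))

    def write_sitemap(urls, path="sitemap.xml"):
        """Write only the de-duplicated canonical URLs to a sitemap file."""
        canonical = sorted({canonicalize(u) for u in urls})
        with open(path, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for u in canonical:
                f.write("  <url><loc>%s</loc></url>\n" % u)
            f.write("</urlset>\n")

    crawled = [
        "http://www.example.com/products?id=42",
        "http://www.example.com/products?id=42&sessionid=abc123",
        "http://www.example.com/products?id=42&sort=price",
    ]
    write_sitemap(crawled)  # sitemap ends up with one entry for id=42

All three of those example URLs collapse to a single sitemap entry, which is the same effect you'd want from the URL Parameters settings on Google's end.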
My thinking is that even though Roger (the SEOMoz bot) is still finding this stuff and generating warnings, Googlebot is not.
I don't work at an agency - I'm in-house, and I've had to learn everything by trial and error, often flying by the seat of my pants with this sort of thing. So my conclusions/solutions may be wrong or may not work for you, but this seems to work for me.
It's a band-aid fix at best, but it seems to be better than nothing!
Hope this helps,
-Adam