First there was the page-level robots meta tag with the NOFOLLOW value, and then Google adopted the more granular rel=nofollow attribute for individual links on a page. I find that too many SEOs overuse the rel=nofollow attribute when a much more elegant solution is available. The reason for this overuse is the now-debunked tactic known as PageRank sculpting. I had a well-known culture/nightlife site in NYC as a client that had placed literally thousands of rel=nofollow attributes on links throughout the site... granted, this does not seem to be your problem, but I digress...
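To make the distinction concrete, here is a minimal sketch of the two forms (the URL and link text are just placeholders):

```html
<!-- Page-level: tells crawlers not to follow ANY link on this page -->
<meta name="robots" content="nofollow">

<!-- Link-level: applies only to this one link -->
<a href="https://example.com/login" rel="nofollow">Log in</a>
```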
To illustrate my point, Matt Cutts discusses how rel=nofollow attributes affect the way Google passes PageRank to other parts of your site (or, more precisely, how nofollows decay the amount of link juice passed). In the case of a few pages or even large directories, I would do the following:
- Disallow crawling of less valuable pages via Robots.txt
- Use the meta exclusion NOINDEX, NOFOLLOW tag at the page level; if these pages pass valuable link juice/anchor text to other parts of the site, use NOINDEX, FOLLOW instead (the page is not indexed, but its important links are followed)
- Also, leave these pages out of your XML sitemap(s), although you may want to leave them in the HTML sitemap. Use a granular link-level rel=nofollow for cases like a link to a 404 error page kept for usability, or a required privacy statement linked from landing pages
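The steps above can be sketched as follows; the paths here are hypothetical examples, not recommendations for any specific site:

```
# robots.txt -- disallow crawling of less valuable sections
User-agent: *
Disallow: /search/
Disallow: /print/
```

And on a low-value page whose outbound links still matter:

```html
<!-- Page stays out of the index, but its links still pass value -->
<meta name="robots" content="noindex, follow">
```

One caveat worth noting: a page blocked in robots.txt cannot be crawled, so Google will never see a meta tag placed on it. Use the robots.txt disallow and the meta robots tag on different sets of pages, not the same ones.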
Saving your Googlebot crawl budget for high-value pages is a great way to get more of those pages into the Google index, giving you more opportunities to promote your products, services, etc. Limiting the number of rel=nofollows you use and allowing link juice (or PageRank) to flow more freely throughout your site will also prove beneficial.