It should be a noindex. You don't want the pages to appear in search results, right? In that case, noindex is the right directive. However, since you have already added a robots.txt disallow, most bots can no longer crawl those pages at all — which also means they would never see a noindex tag placed on them. So your current setup will work; it's more of a brute-force method, but it will work.
The proper way is a noindex (and if you go that route, the pages must remain crawlable so bots can actually read the directive).
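For reference, a minimal sketch of the two common ways to declare noindex — a meta tag in the page's HTML, or an X-Robots-Tag HTTP header (the header form also covers non-HTML files like PDFs):

```html
<!-- In the <head> of each page you want excluded from search results -->
<meta name="robots" content="noindex">
```

```
# Alternative: send it as an HTTP response header (example Apache config,
# matching PDF files — adjust the pattern to your own URLs)
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Either way, remember: these only take effect if crawlers are allowed to fetch the page, so the robots.txt disallow has to come off first.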