In theory, a URL blocked by robots.txt should not appear in the index regardless of whether it is in the sitemap, but I have seen URLs get indexed that are blocked by robots.txt yet are listed in the sitemap and have good links pointing to them. If you want to block pages that have good links pointing to them, my advice is to also remove them from the sitemap. #justathought.
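To illustrate the combination I'd avoid (the paths and domain here are made up), this is a robots.txt rule blocking a section while the sitemap still lists a page from it:

```
# robots.txt (hypothetical example)
User-agent: *
Disallow: /private/

# If sitemap.xml still contains
#   https://example.com/private/page.html
# and that page has strong inbound links, it can still end up
# indexed (URL only, with no crawled content) despite the Disallow.
```

Removing such URLs from the sitemap at least stops you from sending the search engine mixed signals about whether you want them found.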
Regarding URLs from multiple domains: I personally create separate sitemaps for the different subdomains, link them from a sitemap index on the main domain, and I see better indexing that way.
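As a sketch of that setup (example.com and its subdomains are placeholders), the main domain can host a sitemap index file that points at each subdomain's sitemap:

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Sitemap for the main site -->
  <sitemap>
    <loc>https://www.example.com/sitemap.xml</loc>
  </sitemap>
  <!-- Sitemap for a subdomain -->
  <sitemap>
    <loc>https://blog.example.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

One caveat as far as I know: cross-host sitemap references like this generally only work if all the hosts are verified under the same Search Console account, so check that before relying on it.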
Again, these are my personal experiences and not rules, so please keep that in mind, as things can be different for you.