Hi Jay,
I definitely sympathize, and I'm sorry you're dealing with this. I'm aware that a small subset of hosts feel our crawlers are too aggressive, and that yours isn't the only one. (As it happens, though, both of those posts concern Dotbot, the Mozscape Index crawler, and the first response to the Q&A post indicates that the host has no issues with Rogerbot.)
Our challenge in this area centers on our need to accommodate a vast and diverse customer base. We have customers with sites spanning millions of pages, and we're obligated to meet our service-level agreement with them to provide data in a timely manner. As it is, many of our larger customers receive crawl updates only once per month in order to prevent Rogerbot from having to crawl too aggressively. We've found that the rate at which Rogerbot crawls is acceptable to the vast majority of hosts, and that the few who would prefer a less aggressive crawl are almost always willing to apply a crawl limit.
This is especially true given that Rogerbot only crawls sites on demand, either as part of an ongoing Moz campaign or in a Crawl Test. Since having a site crawled by Rogerbot is voluntary, it generally falls on those few hosts to adjust their crawl limits accordingly; we simply can't tune our crawl rate to suit their individual requirements. The same is true of all of our competitors, as well.
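For reference, a crawl limit is usually applied via a `Crawl-Delay` directive in the site's robots.txt file. A minimal sketch might look like the following (the 10-second value is just an illustration; the right delay depends on your server's capacity, and you'd want to confirm against our current documentation that Rogerbot honors this directive on your setup):

```
# Hypothetical example: ask Rogerbot to wait ~10 seconds between requests
User-agent: rogerbot
Crawl-delay: 10
```

Other crawlers can be limited the same way by adding a separate `User-agent` block for each.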
That said, I'd still love to show a server log from one of those old crawls to our engineering team. If something _is_ amiss with our crawler, we absolutely want to make sure it's addressed. I understand you've been in touch with our Help team, so you can go ahead and send it over to them.