Hey Ture,
I'll go ahead and address your questions one by one.
I took a look at your campaign, and it is by no means returning a 1-page crawl every second time. You do, however, seem to have an issue with your server that causes it to return this response (you can find it in your CSV file) every 3-5 months:
Connection was refused by other side: 111: Connection refused.
Unfortunately, I can't tell you what causes it, as I don't do web or server support. You'll likely need to speak with your server admin to see what can be done to avoid it.
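If it helps your admin narrow things down: curl reports that same condition as exit code 7, so a quick probe can confirm whether the refusals are reproducible from outside. This is just a sketch — the localhost port below is only there to force a refused connection for demonstration, so swap in your own site's URL when checking for real:

```shell
# Probe a URL and interpret curl's exit code.
# Exit code 7 = "Connection refused", the same condition the crawler hit.
# 127.0.0.1:59999 is assumed to have no listener, which forces a refusal;
# replace it with your own site's URL for a real check.
curl -s -o /dev/null http://127.0.0.1:59999/
code=$?
if [ "$code" -eq 7 ]; then
  echo "connection refused"
else
  echo "exit code $code"
fi
```

Running that on a schedule (e.g. from cron) during the window when the crawls fail would tell your admin whether the server, a firewall, or a rate limiter is dropping connections.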
As far as robots.txt goes, RogerBot definitely does obey properly formatted robots.txt directives. If you feel he's doing otherwise, please email help [at] seomoz.org with specific details.
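For reference, RogerBot responds to his own user-agent token, rogerbot, in robots.txt. A minimal example — the /private/ path here is just a placeholder, not anything from your site:

```txt
# Block only RogerBot from one directory
User-agent: rogerbot
Disallow: /private/

# All other crawlers: no restrictions
User-agent: *
Disallow:
```

If your file has directives like these and they still seem to be ignored, that's exactly the kind of specific detail to include in your email to us.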
The re-auth issue with Facebook is them expiring your auth token and requiring you to renew it. It isn't us; we'd obviously much rather they not expire the token. You'll also see that occasionally with Google, especially if you have a large number of sites.
I remember we discussed this issue with your Twitter handles before. It was resolved at the time: Twitter's API was reporting that account as invalid, so we reached out to Twitter and had them fix it for you. If you're seeing this again, email us the details so we can take a look.
Remember, Q&A is meant for community-based questions and isn't really a forum for technical support, since we can't discuss any details of your account here. In the future, please email technical support questions directly to help [at] seomoz.org.
Cheers,
Joel.