I'm not sure what logic Moz is using for its reporting of Site Crawl issues, but it appears to be pretty flawed (unless I'm missing something, which is possible).
I've got a client site that has been in Moz for about 6 months now. Every time the crawler runs, the same number of pages is reported as having been crawled. However, I'm consistently getting "New Issues" that should have been reported during previous crawls.
Example: A redirect chain was reported several months ago. The referring URL was the homepage of the website, and we tracked it down to an old link in the header. This was fixed, marked as resolved, and the issue did not appear on the next crawl. Several weeks later, the same issue was reported for a different page on the website - a page which has existed since 2014 and had already been crawled many times. Again, we fixed it. Fast-forward to the report that just ran on 12/1, and the same issue was reported yet again, for another page that has also existed for years and been crawled previously.
It's very hard to explain to a client "this item you are seeing has been resolved," only to have it continually crop back up in future reports. Note that this is not limited to redirect chains - that's just one example. I'm seeing the same behavior with other issue types, such as missing canonicals and duplicate titles.