These tools are designed to report very different things, Eric. SF and Moz can only tell you what they discover within your own site, since that's all they have access to (Moz's link index isn't tied into their site crawl reports). So they'll report your own pages that link to other pages which no longer exist.
Search Console, on the other hand, also shows what's happening on the rest of the web outside your website. So it picks up and reports on links from other websites that it tries to follow but that end in a 404 on your site.
So, for example, Search Console is the only one of the three that can find and report on URLs you used to have but that weren't correctly redirected after the URLs changed, like in a redesign. Assuming you cleaned up your internal linking after the redesign, the only way to know that those old URLs still haven't been properly redirected is via GSC's external visibility.
If GSC is finding a lot of these, and you want to proactively find as many more as possible without waiting for GSC to surface them, there are a couple of methods for tackling this in bulk. One way is to run a Screaming Frog crawl of the previous version(s) of the site in the Wayback Machine, then rerun those URLs in list mode against the live site to discover which ones now 404. Or, if you have an old sitemap, you can upload it into SF in list mode and check for 404s the same way.
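If you'd rather not push the Wayback URLs through Screaming Frog, a quick script can do the same bulk check. This is just a rough sketch in Python using the Wayback Machine's public CDX API -- example.com, the timeouts, and the HEAD-request approach are assumptions to adapt to your own setup (some servers answer HEAD differently than GET).

```python
import requests

DOMAIN = "example.com"  # placeholder -- swap in your own domain

# Pull every unique URL the Wayback Machine has archived for the domain.
cdx = requests.get(
    "http://web.archive.org/cdx/search/cdx",
    params={
        "url": f"{DOMAIN}/*",
        "output": "json",
        "fl": "original",
        "collapse": "urlkey",
        "filter": "statuscode:200",
    },
    timeout=60,
)
rows = cdx.json()
urls = [row[0] for row in rows[1:]]  # first row of the JSON output is a header

# Re-check each historical URL against the live site and flag anything
# that now 404s, i.e. old URLs that were never redirected.
for url in urls:
    try:
        r = requests.head(url, allow_redirects=True, timeout=10)
        if r.status_code == 404:
            print("404:", url)
    except requests.RequestException as e:
        print("error:", url, e)
```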
A partial alternative is to run a custom report in GA that shows the referrer to the 404 page. But that will only capture actual clickthroughs, whereas GSC reports on external broken links whether or not they've been clicked recently, so GSC is much broader and a bit more proactive. Eoghan at Rebelytics has a great post on assembling these reports, including links to import them directly from the Solutions Gallery. Do note, though, that this data can often be pretty "dirty", with direct/(not set) referrers or very vague ones like "Facebook".
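If you'd rather pull that referrer data programmatically than build the report by hand, and your property is on GA4 rather than Universal Analytics, a rough sketch against the GA4 Data API might look like the following. The property ID and the "Page not found" title filter are assumptions -- adjust them to however your own 404 template identifies itself.

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
)

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account
# with read access to the GA4 property.
client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",  # hypothetical property ID
    dimensions=[
        Dimension(name="pageTitle"),
        Dimension(name="pagePath"),
        Dimension(name="pageReferrer"),
    ],
    metrics=[Metric(name="screenPageViews")],
    date_ranges=[DateRange(start_date="90daysAgo", end_date="today")],
    # Assumes the 404 template renders a title containing "Page not found" --
    # change the filter to match however your error page identifies itself.
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="pageTitle",
            string_filter=Filter.StringFilter(
                value="Page not found",
                match_type=Filter.StringFilter.MatchType.CONTAINS,
            ),
        )
    ),
)

# Each row is a broken URL (pagePath) plus the referrer that sent the visit.
for row in client.run_report(request).rows:
    print(row.dimension_values[1].value, "<-", row.dimension_values[2].value)
```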
But bottom line - use your Screaming Frog crawl to find and fix the broken links within your own site. Use the downloaded GSC list to write appropriate redirects for the URLs it finds that haven't been covered by your internal fixes. Since you can't do anything about the broken links on third-party sites, the only control you have over them is writing redirects inside your own system. Put another way - it's not worth the effort to track down where on the external web the broken inbound links come from, since you can't change them anyway.
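As a quick way to work through that GSC download, a small script can re-check each exported URL and tell you which ones still need a redirect after your internal fixes. Again just a sketch, assuming a CSV export named gsc-404-export.csv with the URLs in the first column -- adjust the filename and column to match what GSC actually gives you.

```python
import csv
import requests

# Assumes the "Not found (404)" report was exported from GSC as a CSV
# with the affected URLs in the first column.
still_broken = []
with open("gsc-404-export.csv", newline="") as f:
    for row in csv.reader(f):
        url = row[0].strip()
        if not url.startswith("http"):
            continue  # skip the header row and any notes
        r = requests.head(url, allow_redirects=True, timeout=10)
        if r.status_code == 404:
            still_broken.append(url)

# These are the URLs that still need a 301 written in your redirect
# system (server config, CMS plugin, etc.).
for url in still_broken:
    print(url)
```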
Hope that all makes sense?
Paul
P.S. Marking the 404s as fixed in GSC will suppress them for about 90 days, but if the broken links still exist out on the web or in the index and get re-crawled, they'll start showing up in your GSC list again until they've been crawled often enough that G decides to trust the 404 and eventually stops crawling them. Once you apply redirects, G will stop reporting them as 404s. Do note, though, that if the equivalent content really is gone, there's nothing wrong with letting the broken links 404. Trying to artificially redirect them to unrelated pages (like the home page) will just result in G treating them as soft 404s anyway.