Oops, I missed that part. Have you checked Google Search Console to see whether anyone has configured URL parameter settings there?
The first thing I would do is determine how many pages should actually be indexed, to see whether there's a large discrepancy between that figure and the number Google reports. A crawler like Screaming Frog can help with this: if you export the crawl to Excel, you can easily remove duplicates in the canonical URL column and filter out the noindexed pages.
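If you'd rather do the dedupe/filter step outside Excel, here's a rough Python sketch using pandas. The filename and column names ("Indexability", "Canonical Link Element 1") are assumptions based on a typical Screaming Frog CSV export and may differ by version, so adjust them to whatever your export actually contains:

```python
import pandas as pd

# Load the Screaming Frog crawl export (hypothetical filename; export as CSV
# so pandas can read it directly).
crawl = pd.read_csv("internal_html.csv")

# Keep only indexable pages, dropping noindexed URLs.
# Assumes an "Indexability" column with "Indexable" / "Non-Indexable" values.
indexable = crawl[crawl["Indexability"] == "Indexable"]

# Collapse duplicates on the canonical URL column so each canonical counts once.
# Assumes the column is named "Canonical Link Element 1".
unique_canonicals = indexable.drop_duplicates(subset="Canonical Link Element 1")

print(f"Approximate count of pages that should be indexed: {len(unique_canonicals)}")
```

Comparing that count against the indexed-page number Google reports will tell you whether the drop is a real problem or just noise.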
If you find there's no real discrepancy, Google may simply have been cleaning out some very old URLs from the index that hadn't been crawled in a while.
Beyond that, if you can pinpoint any specific URLs that have been deindexed, run them through the "Fetch as Google" tool to help diagnose the issue, or post them here so the community can take a look.