I don't know that he was being harsh; every site can be improved. It's the nature of the changing beast.
Large chunks of your content appearing on other sites is a big problem, and it could explain the drop if Google has recently crawled/valued those sites.
I had already checked your KW density for "facebook login" and it is reasonable (not over the top). That page is in the index so whatever your XML sitemap problem is, at least it is not keeping the page from the index. If all other pages on your domain are okay and the XML sitemap problem was not related only to this page (which I am assuming), then that is obviously not the culprit.
If your content is going to be syndicated/"borrowed", you might want to add the rel="author" etc. info to your site.
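As a minimal sketch of what that markup could look like (the domain, profile URL, and author name here are made up for illustration; at the time, Google verified authorship via a link to the author's profile page):

```html
<!-- Option 1: in the <head> of each article page, point rel="author"
     at the author's profile page (hypothetical URL). -->
<link rel="author" href="https://example.com/about/jane-doe">

<!-- Option 2: inline in the article byline. -->
<p>Written by <a rel="author" href="https://example.com/about/jane-doe">Jane Doe</a></p>
```

Either form tells crawlers who the original author is, which can help when scrapers republish your pages.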
Q: Do you see any of the other websites' pages (with the copied content) ranking for your term?
Q: You didn't answer about link tracking. Here's what I am getting at here...
Hypothetical: let's suppose a few weeks ago you had 40 external links to that page. Let's say that of those 40, Google may value 12. Of those 12, perhaps 2 have great authority, relevance, PR, etc. according to Google. And then, for reasons unknown, those two sites removed the links to your page.
So if you have before/after tracking of inbound links, you might see a change that waves a red flag. It's just one more thing to check.