Hello Or,
I just checked the most recent cache, and it looks like Google does NOT see the content on the first URL (ending in /71232/) but does see it on the second one (ending in /69811/).
This is the opposite of the situation you described above.
Yes, Google "can" execute JavaScript, but just because they can doesn't mean they will every time. It is also possible that not all of their bots execute JavaScript on every crawl. For instance, the bot they use for pure discovery may not, while the one they use to render previews may.
Or they may allot the JavaScript only a limited time to execute before giving up.
I also notice that the page that is currently not fully indexed has an embedded YouTube video. While that would not typically cause any problems with getting other content indexed, in your case it may be worth looking into. For example, it could add to the script execution time mentioned above.
When it comes to executing scripts, submitting forms, etc., Google is very much at the stage of just "trying stuff out" to "see what happens." It's like a hyperactive baby in a spaceship pushing buttons at random, which is why we run into issues with "spider traps" and with unintentionally getting dynamic pages indexed from form submissions, internal searches, and other oddities in site architecture. It is also one of the reasons structured markup like the Schema.org vocabulary and JSON-LD is important: it lets us label the buttons so the bot "understands" what it is pressing (or not).
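To make the "labeling" idea concrete, here is a minimal sketch of JSON-LD markup for an embedded video. All the values (title, embed URL, date) are placeholders, not taken from your pages; the resulting string is what would sit inside a `<script type="application/ld+json">` tag.

```python
import json

# Hypothetical example: a schema.org VideoObject describing an
# embedded YouTube video, so a crawler knows what the embed is
# without having to execute any player JavaScript.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example product walkthrough",                  # placeholder title
    "description": "Short demo of the product in action.",  # placeholder text
    "embedUrl": "https://www.youtube.com/embed/VIDEO_ID",   # placeholder URL
    "uploadDate": "2015-01-01",                             # placeholder date
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
json_ld = json.dumps(video_markup, indent=2)
print(json_ld)
```

The point is simply that the label travels with the page source itself, so even a bot that never runs your scripts can read it.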
I apologize that there is no definitive answer for your problem at the moment, but given that the behavior has completely reversed, I'm not sure how to go about investigating further. This is why it is still very much a best practice to ensure all of your content is indexable without relying on JavaScript rendering. If you can't see the textual content in the page source (as is the case here), then you are at risk of Google not seeing it either.
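If it helps, the "can you see it in the source" test can be automated with a few lines of Python. This is a sketch, not a crawler: the helper names and the two toy pages below are mine, and the check is a literal substring match on the raw markup, which is roughly what a non-rendering bot sees.

```python
import urllib.request

def fetch_raw_source(url):
    """Fetch a page the way a non-rendering crawler would:
    raw bytes only, no JavaScript execution."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def phrase_in_source(html, phrase):
    """True if the phrase appears literally in the raw markup.
    Text that only exists after scripts run will not be found."""
    return phrase in html

# Two toy pages (hypothetical): one with the text in plain HTML,
# one where a script generates the same text at load time.
static_page = "<div id='post'>Welcome to our store</div>"
scripted_page = (
    "<div id='post'></div>"
    "<script>render('post', data.welcomeMessage);</script>"
)

print(phrase_in_source(static_page, "Welcome to our store"))    # True
print(phrase_in_source(scripted_page, "Welcome to our store"))  # False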