the apparent paradox presented by the Monty Hall problem can be difficult to resolve naively because it lies at the intersection of probability & epistemology, two fields that are already complex enough on their own
it requires carefully thinking through "what you know", which is nontrivial
---
RT @Jerbivore
not gonna pretend to truly understand Bayes' theorem, but the relevant idea is that to get the probabilities right, you need to correctly "update your priors" (your a priori estimates of each door's contents) in response to the door(s) being opened; we're often bad at this
the Monty Hall problem functions as a sort of Bayesian "mu", an epistemic shock that attempts to knock you out of intuitive complacency, ideally motivating you to re-evaluate your reasoning in other domains; if you got this one wrong, what else might you be wrong about?
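(a minimal sketch of the update in question, in python, for the standard three-door setup; the convention that you pick door 0 and the host opens door 2 is arbitrary, and the helper names simulate / switch_posterior are just for illustration:)

```python
import random

def simulate(trials: int = 100_000) -> tuple[float, float]:
    """Estimate win rates for the 'stay' and 'switch' strategies by brute force."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's initial pick
        # Monty opens a goat door that isn't the contestant's pick
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        # switching means taking the one remaining closed door
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

def switch_posterior() -> float:
    """Exact Bayes update: P(car behind door 1 | we picked door 0, Monty opened door 2)."""
    prior = {0: 1/3, 1: 1/3, 2: 1/3}   # a priori estimate for each door
    # likelihood that Monty opens door 2, given where the car actually is
    likelihood = {
        0: 1/2,  # car behind our pick: Monty chooses door 1 or 2 at random
        1: 1.0,  # car behind door 1: Monty is forced to open door 2
        2: 0.0,  # Monty never reveals the car
    }
    evidence = sum(prior[d] * likelihood[d] for d in prior)
    return prior[1] * likelihood[1] / evidence   # = 2/3

if __name__ == "__main__":
    stay, switch = simulate()
    print(f"stay ≈ {stay:.3f}, switch ≈ {switch:.3f}")                  # ~0.333 vs ~0.667
    print(f"posterior for the switch door: {switch_posterior():.3f}")   # 0.667
```

the likelihood table is where intuition usually slips: the host's choice is constrained, and that constraint is exactly the information you're supposed to update on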
---
RT @pee_zombie
it feels bad, being hit with a {mu, categorical error, not-even-wrong}, but digging into that discomfort is the path to a small enlightenment, a realization that yo…
https://twitter.com/pee_zombie/status/1328950758755151879
in this way, it's comparable to Newcomb's paradox, which functions similarly, but for decision theory instead. when trying to understand a system, you learn the most from studying the ways it fails, its boundary conditions; your mind is no exception here.
---
RT @pee_zombie
the problem is effectively designed to capture intuitive reasoning, which typically fails to arrive at the presumably "correct" answer of 1box; it attempts to demon…
https://twitter.com/pee_zombie/status/1391406993211015171
our intuitions rely on many heuristics to arrive at an answer quickly, which is critical when you're time-constrained, such as when being chased by a lion
in the modern environment, however, decisions are rarely that urgent, and so these heuristics instead tend to bias our thinking