the apparent paradox presented by the Monty Hall problem can be difficult to resolve naively because it lies at the intersection of probability & epistemology, two fields which are already complex enough on their own

it requires carefully thinking through "what you know", which is nontrivial
---
RT @Jerbivore

twitter.com/Jerbivore/status/1

compared to typical probability-based problems where you already have all the info available to you, this one involves probabilities that change as new info is introduced, ie the domain of Bayesian analysis. a naive interpretation makes it easy to get confused
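
to make that concrete, here's a minimal simulation sketch (assuming the standard rules: the host knows where the car is & always opens an unchosen goat door):

```python
# sketch: estimate win rates for "stay" vs "switch" under the standard rules
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)   # car is hidden uniformly at random
    pick = random.choice(doors)  # contestant's initial pick
    # host opens a door that is neither the pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # ~1/3
print("switch:", sum(play(True) for _ in range(trials)) / trials)   # ~2/3
```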

not gonna pretend to truly understand Bayes' theorem, but the relevant idea is that to get the probabilities right, you need to correctly "update your priors", ie your a priori estimates of each door's contents, in response to the door(s) being opened; we're often bad at this
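
a sketch of that update under the same standard-rules assumption (say you pick door 1 and the host opens door 3):

```python
# sketch of the Bayesian update: you pick door 1, the host opens door 3
prior = {1: 1/3, 2: 1/3, 3: 1/3}  # a priori, the car is equally likely anywhere

# likelihood that the host opens door 3, given each hypothesis:
#   car behind 1 -> host picks door 2 or 3 at random -> 1/2
#   car behind 2 -> host is forced to open door 3    -> 1
#   car behind 3 -> host never reveals the car       -> 0
likelihood = {1: 1/2, 2: 1, 3: 0}

evidence = sum(prior[d] * likelihood[d] for d in prior)
posterior = {d: prior[d] * likelihood[d] / evidence for d in prior}
print(posterior)  # {1: 0.33..., 2: 0.66..., 3: 0.0}
```

the asymmetry in those likelihoods is exactly the info the host's choice leaks, and exactly what intuition drops when it insists the two remaining doors are 50/50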

our intuitions utilize many heuristics to quickly arrive at an answer, which is critical when you're time-constrained, such as when being chased by a lion

in the modern environment however, decisions are rarely that urgent, and as such, these heuristics tend to bias our thinking

this problem's real utility is in showing just how faulty our typical reasoning can be; far from being a weird edge case, the intuitive failure mode highlighted here is representative of many modern situations where we must update our beliefs in response to confusing evidence

the Monty Hall problem functions as a sort of Bayesian "mu", an epistemic shock, attempting to knock you out of intuitive complacency, ideally motivating you to re-evaluate your reasoning in other domains; if you got this one wrong, what else might you be wrong about?

---
RT @pee_zombie
it feels bad, being hit with a {mu, categorical error, not-even-wrong}, but digging into that discomfort is the path to a small enlightenment, a realization that yo…
twitter.com/pee_zombie/status/

in this way, it's comparable to Newcomb's paradox, which functions similarly, but for decision theory instead. when trying to understand a system, you learn the most from studying the ways it fails, its boundary conditions; your mind is no exception here.

---
RT @pee_zombie
the problem is effectively designed to capture intuitive reasoning, which typically fails to arrive at the presumably "correct" answer of 1box; it attempts to demon…
twitter.com/pee_zombie/status/
