So, how *do* you engage in a conflict where one side is trying to avoid apocalyptic but unobservable behavior, while everyone else finds their arguments unconvincing?
We might try to resolve this with money, but that feels insufficient. Assume that evaluating the object-level arguments is really, really difficult here.
And occasionally, the doomers could be right.
So how do you navigate this dilemma? People can't just agree to disagree and avoid each other; the setup implies large externalities.