A leverage prior stops utilities from being unnormalisable, and maybe you only care about the expected utilities that come out the other end, because that’s what you actually act on, but leverage priors are weird and having a completely janky intermediate world model is weird so 🤔
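Rough toy of what I mean, with numbers I made up rather than anyone’s actual setup (hypothesis n = “you affect 3^n people”, prior 2^-n, leverage penalty = divide the prior by the people affected):

```python
# Toy numbers of my own: hypothesis n says "your action affects 3^n people"
# and gets prior 2^-n. The stakes grow faster than the prior shrinks, so
# naive expected utility diverges; a leverage penalty that divides the prior
# by the number of people affected makes the sum converge again.

def naive_terms(n_max):
    # prior * utility = 2^-n * 3^n = (3/2)^n, which grows without bound
    return [(0.5 ** n) * (3.0 ** n) for n in range(1, n_max + 1)]

def leveraged_terms(n_max):
    # leverage penalty: multiply the prior by 1 / (people affected) = 3^-n,
    # so each term collapses to 2^-n and the series is geometric
    return [(0.5 ** n) * (3.0 ** -n) * (3.0 ** n) for n in range(1, n_max + 1)]

for k in (5, 10, 20):
    print(k, round(sum(naive_terms(k)), 1), round(sum(leveraged_terms(k)), 4))
```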
“What is the nature of reality?” Well, if you use the SIA to update on anthropics, then no matter which Turing machine you guess is reality, there’s one with a smaller prior probability (but vastly more observers) that ends up arbitrarily more probable.
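Toy version of that, with made-up bit counts and a toy complexity prior: wrap your best-guess Turing machine in a loop that runs it 2^k times; describing the wrapper costs a few bits, but SIA multiplies the weight by the observer count.

```python
import math

# Toy complexity prior, hypothetical bit counts: unnormalised SIA weight
# = prior * observers, with prior(program) = 2^-length.
def sia_weight(length_bits, observer_count):
    return (2.0 ** -length_bits) * observer_count

base = sia_weight(length_bits=100, observer_count=1)

# A longer program that runs the base one 2^k times costs maybe c + log2(k)
# extra bits to describe (c is an assumed constant for the wrapper machinery)
# but multiplies the observer count by 2^k.
c = 20
for k in (10, 100, 1000):
    rival = sia_weight(length_bits=100 + c + math.log2(k), observer_count=2.0 ** k)
    print(f"k={k}: rival is {rival / base:.3g}x the base hypothesis under SIA")
# The ratio ~2^(k - c - log2 k) is unbounded in k, so no guess stays the favourite.
```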
“How many jelly beans are in the jar?” Well, if your loss function is inverse-googological or harsher in the difference, then for any number you can name you’d wish you’d named a bigger one, because of your prior on a very, very large number of jelly beans.
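Toy version with a tame exponential loss standing in for the googological one, and a geometric prior I made up:

```python
# Toy prior and loss: the true count n has prior 2^-n, and guessing g costs
# 3^(n - g) whenever you undershoot. The loss blows up faster than the prior
# decays, so the tail of enormous jars dominates the expectation.
def truncated_expected_loss(guess, n_max=60):
    total = 0.0
    for n in range(1, n_max + 1):
        prior = 0.5 ** n
        loss = 3.0 ** max(0, n - guess)
        total += prior * loss
    return total

for g in (5, 10, 20, 40):
    print(f"guess {g}: expected loss ~ {truncated_expected_loss(g):.3g}")
# Every larger guess does strictly better, and as n_max grows the expected
# loss of any fixed finite guess diverges.
```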
tweet about little free libraries and effective altruism and virtue signalling in general I guess
Moved to @TetraspaceGrouping