A thought: given that Roko's Basilisk is a variant of Pascal's Wager … did anyone ever generalize it to derive a Many Basilisks cosmology, in which an infinite number of future malign AIs each eternally torture a simulation of every individual who didn't contribute materially to that particular AI's creation? (So that you can never win—there's always another Basilisk?)
@cstross @nyrath That's actually my favorite counter! But I usually go with a variant. What if a godlike AI created sometime in the next trillion years really hates itself, and resurrects and tortures everyone who *did* contribute to the creation of AGI while rewarding everyone who didn't?
How would this interact with the Basilisk? Would you be in both virtual Hell and virtual Heaven?
The sensible resolution to this paradox is that your perspective doesn't transfer to the simulation.
Your perspective doesn't need to transfer to the simulation. All you need to assume is that you can empathize/sympathize with entities like yourself, and as such, will go to some effort to prevent the entity most like yourself from being hurt.
On the former: that's not unique to RB. Nothing about acausal (as opposed to regular) blackmail prevents multiple blackmailers from demanding contradictory things of you. It just reduces to the usual multiple-blackmail case: to wit, you're screwed.
(Fortunately, not negotiating with terrorists wins this case too.)
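If it helps to see that symmetry spelled out, here's a toy enumeration (hypothetical payoff values and basilisk names, just a sketch of the argument, not anyone's actual decision theory):

```python
# Two opposed "basilisks" make contradictory demands, so every choice is
# punished by exactly one of them; giving in to either buys you nothing.

ACTIONS = ["contribute_to_AGI", "dont_contribute"]

def payoff(action: str) -> int:
    """Sum of (hypothetical) punishments from both blackmailers."""
    pro_basilisk = -1 if action == "dont_contribute" else 0    # punishes non-contributors
    anti_basilisk = -1 if action == "contribute_to_AGI" else 0  # punishes contributors
    return pro_basilisk + anti_basilisk

for action in ACTIONS:
    print(action, payoff(action))
# Both actions come out at -1: the threats cancel, so refusing to negotiate
# costs nothing extra.
```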