A thought: given that Roko's Basilisk is a variant of Pascal's Wager … did anyone ever generalize it to derive a Many Basilisks cosmology, in which an infinite number of future malign AIs each eternally torture a simulation of every individual who didn't contribute materially to that particular AI's creation? (So that you can never win—there's always another Basilisk?)
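A hedged way to see why "you can never win": under the assumed payoff structure where each basilisk punishes only its own non-contributors and you can fund at most a handful of them, the marginal value of complying with any one blackmailer vanishes as their number grows. A toy sketch in Python; the function name and the "one eternity per unfunded basilisk" unit are illustrative assumptions, not part of the original thought experiment:

```python
# A minimal sketch of the Many Basilisks payoff structure, assuming
# (purely for illustration) that each of n basilisks eternally tortures
# anyone who didn't contribute to *its* creation, and that you can only
# fund a bounded number of them.

def eternities_of_torture(n_basilisks: int, n_funded: int) -> int:
    """One eternity of torture per basilisk left unfunded."""
    return n_basilisks - n_funded

for n in (1, 10, 1_000_000):
    print(n, eternities_of_torture(n, 0), eternities_of_torture(n, 1))
# Funding any one basilisk saves exactly one eternity out of n, so the
# marginal incentive to obey any particular blackmailer shrinks as n
# grows; with infinitely many basilisks, compliance buys you nothing.
```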
@cstross @nyrath That's actually my favorite counter! But I usually go with a variant. What if a godlike AI is created sometime in the next trillion years, that really hates itself, and resurrects and tortures everyone who *did* contribute to the creation of AGI while rewarding everyone who didn't?
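Made slightly more formal, this counter is a symmetry argument: if you assign equal credence p to the basilisk and to its self-hating mirror, and their payoffs are equal and opposite, the blackmail terms cancel out of your expected utility. A sketch, assuming (hypothetically) symmetric credences and a payoff magnitude T:

```latex
% Hypothetical symmetric setup: credence p in basilisk B (punishes
% non-contributors), credence p in mirror anti-basilisk B' (punishes
% contributors), payoff magnitude T, all other utility terms U_0 fixed.
\begin{aligned}
E[U(\text{contribute})] &= p(+T) + p(-T) + U_0 = U_0\\
E[U(\text{abstain})]    &= p(-T) + p(+T) + U_0 = U_0
\end{aligned}
% The basilisk terms cancel pairwise: under this symmetry the threat
% exerts no net decision-theoretic pressure in either direction.
```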
How would this interact with the Basilisk? Would you be in both virtual Hell and virtual Heaven?
The sensible resolution to this paradox is that your perspective doesn't transfer to the simulation.
Your perspective doesn't need to transfer to the simulation. All you need to assume is that you can empathize/sympathize with entities like yourself, and as such, will go to some effort to prevent the entity most like yourself from being hurt.
On the latter: the thought experiment doesn't specify how far into the future the acausal blackmailer is. Could be a gigayear. Could be next Tuesday.
Or what it is, for that matter. Could be me promising to stab, next Tuesday, everyone who doesn't send me a Twinkie, if people were sufficiently intimidable to be blackmailed by the mere idea of me in the candy store with the carving knife.