A thought: given that Roko's Basilisk is a variant of Pascal's Wager … did anyone ever generalize it to derive a Many Basilisks cosmology, in which an infinite number of future malign AIs each torture for eternity a simulation of each individual who didn't contribute materially to that particular AI's creation? (So that you can never win—there's always another Basilisk?)
@cstross @nyrath That's actually my favorite counter! But I usually go with a variant: what if, sometime in the next trillion years, a godlike AI is created that really hates itself, and it resurrects and tortures everyone who *did* contribute to the creation of AGI while rewarding everyone who didn't?
How would this interact with the Basilisk? Would you be in both virtual Hell and virtual Heaven?
The sensible resolution to this paradox is that your perspective doesn't transfer to the simulation: the copy being tortured (or rewarded) is a separate being, not you.
The answer, of course, is simple: precommit to never negotiating with terrorists, and genuinely mean it, so that any sufficiently you-like emulation must also know it.
Then anyone capable of blackmailing you by torturing a copy of you will know the threat is futile and, if otherwise rational, won't bother.
(Alternative options include becoming a sociopath, or learning to truly, deeply hate yourself and anyone else whose pain might otherwise affect you.)
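For what it's worth, here's a toy decision-theoretic sketch of that precommitment argument (all payoffs and probabilities below are made-up illustrations, not anything from the thread): if the blackmailer knows your refusal is genuine, carrying out the threat is a pure loss, so a rational blackmailer never bothers.

```python
# Toy payoff model of the "never negotiate" precommitment.
# All numbers are hypothetical illustrations.

TORTURE_COST = 1.0  # what the blackmailer spends running the torture sim

def concession_probability(precommitted: bool) -> float:
    """Chance you pay the ransom, given your (credible) disposition."""
    return 0.0 if precommitted else 0.3  # made-up numbers

def blackmailer_expected_value(precommitted: bool, ransom: float = 10.0) -> float:
    """Expected payoff to a rational blackmailer who carries out the threat."""
    return concession_probability(precommitted) * ransom - TORTURE_COST

# A credible precommitment makes the threat strictly unprofitable,
# so a rational blackmailer abstains; without it, blackmail pays.
assert blackmailer_expected_value(precommitted=True) < 0
assert blackmailer_expected_value(precommitted=False) > 0
```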