Content note: discussion of things that are worse than death
Over the past few years, a few people have cited, as a reason for rejecting cryonics, the concern that they might be revived into a world they prefer less than being dead or never having existed. For example, lukeprog pointed this out in a LW comment here, and Julia Galef expressed similar sentiments in a comment on her blog here.
I say "brain preservation" rather than "cryonics" here because these concerns seem agnostic to the particular preservation technology.
One possible solution is an "out-clause": a set of circumstances under which you'd prefer to have your preservation/suspension terminated.
Here's how it would work: prior to entering biostasis, you specify the circumstances in which you'd prefer to have your brain/body taken out of stasis. Then, if those circumstances are realized, the organization carries out your request.
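To make the idea concrete, here is a minimal sketch in Python of how such a directive might be represented and evaluated. Everything in it is hypothetical (the class names, the conditions, the idea that the organization periodically checks a "world state" against each member's clauses); no real preservation organization works this way, and it exists only to illustrate the enumerate-then-check structure.

```python
# Illustrative sketch only: hypothetical names, not any real organization's API.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class WorldState:
    """The organization's current best assessment of relevant facts."""
    unfriendly_agi_imminent: bool = False
    totalitarian_takeover_likely: bool = False


@dataclass
class OutClause:
    """One circumstance under which the member wants suspension ended."""
    description: str
    triggered: Callable[[WorldState], bool]


@dataclass
class PreservationDirective:
    member: str
    out_clauses: List[OutClause] = field(default_factory=list)

    def triggered_clauses(self, world: WorldState) -> List[str]:
        """Return descriptions of any out-clauses the current world satisfies."""
        return [c.description for c in self.out_clauses if c.triggered(world)]


# Example: the kinds of circumstances mentioned later in this post, encoded as clauses.
directive = PreservationDirective(
    member="example member",
    out_clauses=[
        OutClause("Unfriendly AGI appears imminent",
                  lambda w: w.unfriendly_agi_imminent),
        OutClause("A malevolent regime is about to take control",
                  lambda w: w.totalitarian_takeover_likely),
    ],
)

hits = directive.triggered_clauses(WorldState(unfriendly_agi_imminent=True))
if hits:
    print("Terminate suspension; triggered clauses:", hits)
```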
This almost certainly wouldn't prevent all of the potential bad outcomes, but it ought to help with some. It does require that you enumerate, in advance, at least some of the circumstances in which you'd prefer to have your suspension terminated.
While obvious, it seems worth pointing out that there's no way to reduce the probability of worse-than-death outcomes to 0%. The same is true for currently-living people: those whose brains are never preserved could also experience worse-than-death outcomes and/or have their lifespans extended against their wishes.
For people who are concerned about this, I have three main questions:
1) Do you think that an out-clause is a useful-in-principle way to address your concerns?
2) If no to #1, is there some other mechanism that you could imagine which would work?
3) Can you enumerate some specific world-states that you think could lead to revival in a worse-than-death state? (Examples: UFAI is imminent, or a malevolent dictator's army is about to take over the world.)
One caveat: even with an out-clause in place, there would remain many cases where a bad future arrives too quickly for the clause to be acted on (e.g., if an AGI takes a treacherous turn all of a sudden).