
Comment author: Yvain2 18 March 2009 01:43:43AM 7 points

"There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities."

That doesn't seem at all obvious to me. First, our current society doesn't allow people to die, although law enforcement today is spotty enough that it can't really prevent suicide. I assume far-future societies will have excellent law enforcement, including mind reading and total surveillance (unless libertarians seriously get their act together in the next hundred years). I don't see any reason why the taboo on suicide *must* disappear. And any society advanced enough to revive me has by definition conquered death, so I can't just wait it out and die of old age. I place about 50% odds on not being able to die again after I get out.

I'm also less confident that the future wouldn't be a dystopia. Even in the best-case scenario the future is going to be scary through sheer cultural drift (see: legalized rape in Three Worlds Collide). I don't have to tell *you* that it's easier to get a Singularity that goes horribly wrong than one that goes just right, and even if we restrict the possibilities to those where I get revived instead of turned into paperclips, they could still be pretty grim. What about some well-intentioned person hard-coding "Promote and protect human life" into an otherwise poorly designed AI, and ending up with something that resurrects the cryopreserved...and then locks them in little boxes for all eternity so they don't consume unnecessary resources? And then there are the standard fears of some dictator or fundamentalist theocracy, only this time armed with mind control and total surveillance, so there's no chance of overthrowing them.

The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever. You could change my mind if you had a utopian post-Singularity society that had completely mastered Fun Theory. But when I compare the horrible possibility of being forced to live forever, either in a dystopia or in a world no better or worse than our own, to the good possibility of getting to live between a thousand years and forever in a Fun Theory utopia that can keep me occupied...well, the former seems both more probable and more extreme.

Comment author: Ulysses 22 January 2011 04:35:29AM 1 point

The threat of dystopia underscores the importance of finding or creating a trustworthy, durable institution that will relocate or destroy your body if the political system starts turning grim.

Of course there is no such thing. Boards can become infiltrated. Missions can drift. Hostile (or even well-intentioned) outside agents can act suddenly before your guardian institution can respond.

But there may be measures you can take to reduce this risk to acceptable levels (i.e., levels comparable to the current risk of exposure to, as Yudkowsky mentioned, a secret singularity-in-a-basement):

  1. You could make contracts with (multiple) members of the younger generation of cryonicists, on the condition that they contract with their own younger generation, and so on, to guard your body throughout the ages.

  2. You can hide a very small bomb in your body that continues to count down slowly even while frozen (I don't know if we have the technology yet, but it doesn't sound too sophisticated), so as to limit the amount of divergence from the present that you are willing to expose yourself to [an explosion small enough to destroy your brain, but not the brain next to you].

  3. You can have your body hidden and known only to cryonicist leaders.

  4. You can have your body's destruction forged.

I don't think any combination of THESE suggestions will suffice. But if you are considering freezing yourself, it is worth a great deal of effort to invent more (and not necessarily share them all online), and to make them feasible.

Comment author: DanielLC 17 October 2010 07:29:07PM *  1 point

That just means you have to change the experiment. Suppose he just said he'll cause a certain amount of net disutility, without specifying how.

This works unless you assume a maximum possible disutility.
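To make this point precise (my own formalization, not from the thread): let $p(D)$ be your credence that the mugger can and will cause disutility $D$. Paying a small cost $c$ to comply is warranted whenever the expected loss from refusing exceeds it:

```latex
p(D) \cdot D > c
```

If disutility is unbounded, the mugger can keep naming larger $D$; unless your credence $p(D)$ falls off faster than $1/D$ (which is hard to guarantee, since astronomically large $D$ can be specified in very few bits), some threatened $D$ eventually satisfies the inequality. If disutility is capped at some maximum $M$, then $p(D) \cdot D \le p(D) \cdot M$, which shrinks to zero as $p(D)$ does, and the mugging fails.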

Comment author: Ulysses 22 January 2011 02:44:19AM 2 points

You are not entitled to assume a maximum disutility, even if you think you see a proof for it (see *Confidence Levels Inside and Outside an Argument*).