enoonsti comments on Cryonics Questions - Less Wrong
Some of these questions, like the one about running away from a fire, ignore the role of irrational motivation.
People, when confronted with an immediate threat to their lives, gain a strong desire to protect themselves. This has nothing to do with a rational evaluation of whether or not death is better than life. Even people who genuinely want to commit suicide have this problem, which is one reason so many of them try methods that are less effective but don't activate the self-defense system (like overdosing on pills instead of shooting themselves in the head). Perhaps even a suicidal person who'd entered the burning building because they planned to jump off the roof would still try to run out of the fire. So running away from a fire, or trying to stop a man threatening you with a sword, cannot be taken as proof of a genuine desire to live, only as proof that any desire to die one might have is not as strong as one's self-protection instincts.
It is normal for people to have different motivations in different situations. When I see and smell pizza, I get a strong desire to eat the pizza; right now, not seeing or smelling pizza, I have no particular desire to eat pizza. The argument "If your life were in immediate danger, you would want it to be preserved; therefore, right now you should seek out ways to preserve your life in the future, whether you feel like it or not" is similar to the argument "If you were in front of a sizzling piece of pizza, you would want to eat it; therefore, right now you should seek out pizza and eat it, whether you feel like it or not".
Neither argument is inevitably wrong. But first you would have to prove that the urge comes from a reflectively stable value - something you "want to want", and not just from an impulse that you "want" but don't "want to want".
The empirical reason I haven't signed up for cryonics yet is that the idea of avoiding death doesn't have any immediate motivational impact on me, and the negatives of cryonics - weirdness, costs in time and money, negative affect of being trapped in a dystopia - do have motivational impact on me. I admit this is weird and not what I would have predicted about my motivations if I were considering them in the third person, but empirically, that's how things are.
I can use my willpower to overcome an irrational motivation or lack of motivation. But I only feel the need to do that in two cases. One, where I want to help other people (eg giving to charity even when I don't feel motivated to do so). And two, when I predict I will regret my decision later (eg I may overcome akrasia to do a difficult task now when I would prefer to procrastinate). The first reason doesn't really apply here, but the second is often brought out to support cryonics signup.
Many people who signal acceptance of death appear to genuinely go peacefully and happily - that is, even up to the moment of dying they don't seem motivated to avoid death. If this is typical, then I can expect to go my entire life without regretting the choice not to sign up for cryonics at any moment. After I die, I will be dead, and not regretting anything. So I expect to go all of eternity without regretting a decision not to sign up for cryonics. This leaves me little reason to overcome my inherent demotivation to get it.
Some have argued that, when I am dead, it will be a pity, because I would be having so much more fun if I were still alive, so I ought to be regretful even though I'm not physically capable of feeling the actual emotion. But this sounds too much like the arguments for a moral obligation to create all potential people, which lead to the Repugnant Conclusion and which I oppose in just about all other circumstances.
Those are just the empirical reasons, as far as I can introspect, that I haven't signed up for cryonics. I'm still trying to decide if I should accept the argument. And I'm guessing that as I get older I might start feeling more motivation to cheat death, at which point I'd sign up. And there's a financial argument that if I'm going to sign up later, I might as well sign up now, though I haven't yet calculated the benefits.
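To make the shape of that financial argument concrete, here is a minimal back-of-envelope sketch. Every number in it is a made-up placeholder, not an actual insurer or Alcor/CI quote; the only point is that life insurance premiums are typically leveled at the age you sign up, so locking in earlier can cost less in total despite more years of payments (and avoids the risk of becoming uninsurable later).

```python
# Toy "sign up now vs. later" comparison. All figures are hypothetical
# placeholders; a real calculation would use actual quotes and discount
# future payments.

def total_premiums(annual_premium: float, signup_age: int, end_age: int = 75) -> float:
    """Sum of level premiums paid from signup_age to end_age, no discounting."""
    return annual_premium * (end_age - signup_age)

cost_if_now = total_premiums(annual_premium=300, signup_age=25)    # hypothetical quote at 25
cost_if_later = total_premiums(annual_premium=500, signup_age=35)  # hypothetical quote at 35

print(f"Sign up at 25: ${cost_if_now:,.0f} in total premiums")    # $15,000
print(f"Sign up at 35: ${cost_if_later:,.0f} in total premiums")  # $20,000
```

Under these assumed numbers, signing up earlier is cheaper overall even though you pay for ten more years; the comparison would of course change with real quotes, discounting, and the probability of ever signing up at all.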
But analogies to running away from a burning building shouldn't have anything to do with it.
Jack: "I've got the Super Glue for Yvain. I'm on my way back."
Chloe: "Hurry, Jack! I've just run the numbers! All of our LN2 suppliers were taken out by the dystopia!"
Freddie Prinze Jr: "Don't worry, Chloe. I made my own LN2, and we can buy some time for Yvain. But I'm afraid the others will have to thaw out and die. Also, I am sorry for starring in Scooby Doo and getting us cancelled."
- Jack blasts through wall, shoots Freddie, and glues Yvain back together -
Jack: "Welcome, Yvain. I am an unfriendly A.I. that decided it would be worth it just to revive you and go FOOM on your sorry ass."
(Jack begins pummeling Yvain)
(room suddenly fills up with paper clips)
This is one of the worst examples I've ever seen. Why would a paperclip maximizer want to revive someone so they could see the great paperclip transformation? Doing so uses energy that could be allocated to producing paperclips, and paperclip maximizers don't care about most human values; they care about paperclips.
That was a point I was trying to make ;)
I should have ended with (/sarcasm)
I think the issue is that the dystopia we're talking about here isn't necessarily paperclip maximizer land, which isn't really a dystopia in the conventional sense, as human society no longer exists in such cases. What if it's I Have No Mouth And I Must Scream instead?
Yes, the paper clip reference wasn't the only point I was trying to make; it was just a (failed) cherry on top. I mainly took issue with being revived in the common dystopian vision: constant states of warfare, violence, and so on. It simply isn't possible, given that you need to keep refilling dewars with LN2 and so much more; in other words, the chain of care would be disrupted, and you would be dead long before they found a way to resuscitate you.
And that leaves basically only a sudden "I Have No Mouth" scenario; i.e. one day it's sunny, Alcor is fondly taking care of your dewar, and then BAM! you've been resuscitated by that A.I. I guess I just find it unlikely that such an A.I. will say: "I will find Yvain, resuscitate him, and torture him." It just seems like a waste of energy.
Upvoted for making a comment that promotes paperclips.