lukeprog comments on Open Thread, March 1-15, 2013 - Less Wrong

Post author: Jayson_Virissimo 01 March 2013 12:00PM

Comment author: lukeprog 06 March 2013 07:52:53PM *  22 points

Why am I not signed up for cryonics?

Here's my model.

In most futures, everyone is simply dead.

There's a tiny sliver of futures that are better than that, and a tiny sliver of futures that are worse than that.

What are the relative sizes of those slivers, and how much more likely am I to be revived in the "better" futures than in the "worse" futures? I really can't tell.

I don't seem to be as terrified of death as many people are. A while back I read the Stoics to reduce my fear of death, and it worked. I am, however, very averse to being revived into a worse-than-death future and not being able to escape.

I bet the hassle and cost of cryonics disincentivize me, too, but when I boot up my internal simulator and simulate a world where cryonics is free and can be obtained via a 10-question Google form, I still don't sign up. I ask to be cremated instead.

Cryonics may be reasonable for someone who is more averse to death and less averse to worse-than-death outcomes than I am. Cryonics may also be reasonable for someone who has strong reasons to believe they are more likely to be revived in better-than-death futures than in worse-than-death futures. Finally, there may be a fundamental error in my model.

This does, however, put me into disagreement with both Robin Hanson ("More likely than not, most folks who die today didn't have to die!") and Eliezer Yudkowsky ("Not signing up for cryonics [says that] you've stopped believing that human life, and your own life, is something of value").

Comment author: Elithrion 07 March 2013 12:07:10AM *  7 points

So are you saying the P(worse-than-death|revived) and the P(better-than-death|revived) probabilities are of similar magnitude? I'm having trouble imagining that. In my mind, you are most likely to be revived because the reviver feels some sort of moral obligation towards you, so the future in which this happens should, on the whole, be pretty decent. If it's a future of eternal torture, it seems much less likely that something in it will care enough to revive some cryonics patients when it could, for example, design and make a person optimised for experiencing the maximal possible amount of misery. Or, to put it differently, the very fact that something wants to revive you suggests that it cares about a very narrow set of objectives, and if it cares about that set of objectives, it's likely because they were put there with the aim of achieving a "good" outcome.

(As an aside, I'm not very averse to "worse-than-death" outcomes, so my doubts definitely do arise partially from that, but at the same time I think they are reasonable in their own right.)

Comment author: lukeprog 07 March 2013 12:56:57AM 2 points

So are you saying the P(worse-than-death|revived) and the P(better-than-death|revived) probabilities are of similar magnitude?

Yes. Like, maybe the latter probability is only 10 or 100 times greater than the former probability.
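
To see why this ratio is the crux, here is a minimal expected-utility sketch of the model under discussion, written in Python. Every specific number in it (the revival probability, the utility weights) is a hypothetical assumption chosen purely for illustration; only the 10:1 and 100:1 ratios come from the comment above.

    # A toy expected-utility model of the cryonics decision, measured
    # relative to certain death (utility 0). All numbers are hypothetical.

    def ev_of_cryonics(p_revival, good_to_bad_ratio, u_good, u_bad):
        """Expected utility of signing up, with staying dead worth 0.

        good_to_bad_ratio: how much likelier a better-than-death revival
        is than a worse-than-death one, conditional on being revived.
        """
        p_good = good_to_bad_ratio / (good_to_bad_ratio + 1.0)  # P(better | revived)
        p_bad = 1.0 - p_good                                    # P(worse | revived)
        return p_revival * (p_good * u_good + p_bad * u_bad)

    # With a strong aversion to inescapable bad futures (u_bad = -20 here),
    # a 10:1 ratio leaves signing up negative in expectation...
    print(ev_of_cryonics(0.05, 10, u_good=1, u_bad=-20))   # about -0.045
    # ...while a 100:1 ratio flips it positive:
    print(ev_of_cryonics(0.05, 100, u_good=1, u_bad=-20))  # about +0.040

On these made-up weights, the decision flips somewhere between the two ratios named above, which is one way to read "I really can't tell."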

Comment author: CarlShulman 14 August 2013 01:27:45AM *  3 points

This seems like a strangely strong aversion to bad outcomes to me. Are you taking into account that the ratio between the goodness of the best possible experiences and the badness of the worst possible experiences (per second, and per year) should be much closer to 1:1 than the ratio of the most intense per-second experiences we observe today, for reasons discussed in this post?
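
If that reasoning holds, it can be plugged into the hypothetical sketch above (reusing ev_of_cryonics, with the same made-up revival probability): pulling the badness weight toward the goodness weight flips the sign even at the pessimistic 10:1 ratio.

    # If best and worst outcomes are weighted nearly symmetrically
    # (u_bad = -1.5 rather than -20), even the 10:1 ratio comes out positive:
    print(ev_of_cryonics(0.05, 10, u_good=1, u_bad=-1.5))  # about +0.039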

Comment author: Pablo_Stafforini 15 January 2015 05:50:46AM *  2 points

Why should we consider possible rather than actual experiences in this context? It seems that cryonics patients who are successfully revived will retain their original reward circuitry, so I don't see why we should expect their best possible experiences to be as good as their worst possible experiences are bad, given that this is not the case for current humans.

Comment author: CarlShulman 16 January 2015 02:28:17AM *  2 points

For some of the same reasons depressed people take drugs to elevate their mood.

Comment author: lukeprog 14 August 2013 03:44:08AM 1 point

I like that post very much. I'm trying to make such an update, but it's hard to tell how much I should adjust from my intuitive impressions.

Comment author: hairyfigment 16 March 2013 06:54:56AM *  0 points

OK, what? When you say "worse-than-death", are you including Friendship is Optimal?

What about a variant of Hanson's future where:

  • versions of you repeatedly come into existence, do unfulfilling work for a while, and cease to exist
  • no version of you contacts any of the others
  • none of these future-selves directly contribute to changing this situation, but
  • your memories do make it into a mind that can act more freely than most or all of us today, and
  • the experiences of people like your other selves influence the values of this mind, and
  • the world stops using unhappy versions of you.

(Edited for fatigue.)

Comment author: lukeprog 12 August 2013 06:59:49PM 1 point

I haven't read Friendship is Optimal, because I find it difficult to enjoy reading fiction in general.

Not sure how I feel about the described Hansonian future, actually.

Comment author: ModusPonies 06 March 2013 11:08:06PM 1 point

I'm not very averse to death.

Whoa. What? I notice that I am confused. Requesting additional information.

Most of the time, if I read something like that, I'd assume it was merely false—empty posturing from someone who didn't understand the implications of what they were writing. In this case, though... everything else I've seen you write is coherent and precise. I'm inclined to believe your words literally, in which case either A) I'm missing some sort of context or qualifiers or B) you really ought to see a therapist or something.

Do you mean you're not averse to death decades from now? Does that feel different from the possibility of getting hit by a bus next week?

(Only tangentially related, but I'm curious: what's your order of magnitude probability estimate that cryonics would actually work?)

Comment author: ArisKatsaris 06 March 2013 11:41:52PM 16 points

you really ought to see a therapist or something.

No, I'm sorry, but there are simply many atheists who really aren't that scared of non-existence. We don't seek it out; we do prefer the continuation of our lives and their many joys, but dying doesn't scare the hell out of us either.

This, in me at least, has nothing to do with depression or anything that requires therapy. I'm not suicidal in the least, even though I'd be scared of being trapped in an SF-style dystopia that didn't allow me to commit suicide.

Comment author: [deleted] 07 March 2013 03:05:25PM 2 points

What's that quote that says something to the effect of "I didn't exist for billions of years before I was born, and it didn't bother me one bit"?

Comment author: tut 07 March 2013 05:31:27PM 6 points

“I do not fear death. I had been dead for billions and billions of years before I was born, and had not suffered the slightest inconvenience from it.” ― Mark Twain

Comment author: lukeprog 07 March 2013 12:53:24AM *  2 points

Whoa. What?

Sorry, I just meant that I seem to be less averse to death than other people. I'd be very sad to die and not have the chance to achieve my goals, but I'm not as terrified of death as many people seem to be. I've clarified the original comment.

Comment author: James_Miller 24 December 2013 05:51:15PM *  0 points

In most futures, everyone is simply dead.

If there is a high probability that these bad futures arrive before you retire, this belief reduces the cost of cryonics to you: the opportunity cost of instead putting the money into retirement accounts is low, since you probably wouldn't live to spend those savings anyway.

In the really bad futures, you probably don't experience extra suffering by having signed up for cryonics, because in those futures all possible types of human minds get simulated anyway.
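
As a toy illustration of this opportunity-cost point, with entirely made-up figures:

    # Hypothetical figures: a $100k lifetime cryonics cost, and an 80% chance
    # that the bad futures arrive before retirement savings could be spent.
    cryonics_cost = 100_000
    p_savings_still_useful = 0.2  # 1 - P(bad future before retirement)
    effective_cost = cryonics_cost * p_savings_still_useful
    print(effective_cost)  # 20000.0: a fifth of the sticker price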

Comment author: MugaSofer 04 September 2013 08:17:42PM 0 points

This does, however, put me into disagreement with both Robin Hanson ("More likely than not, most folks who die today didn't have to die!") and Eliezer Yudkowsky ("Not signing up for cryonics [says that] you've stopped believing that human life, and your own life, is something of value").

I... don't think it does, actually. Well, the bit about "most possible futures are empty" does put you in conflict with Robin Hanson ("More likely than not, most folks who die today didn't have to die!"), I guess, but the actual thesis seems to fall under Eliezer Yudkowsky's "you've stopped believing that human life, and your own life, is something of value" (after a certain point in history).