Eliezer_Yudkowsky comments on Being Half-Rational About Pascal's Wager is Even Worse - Less Wrong

Post author: Eliezer_Yudkowsky 18 April 2013 05:20AM


Comment author: orthonormal 18 April 2013 02:42:20PM 8 points

I was under the impression that the standard LW argument for signing up was (tiny probability of success)*(monumental heap of utility if it works)=(a good investment). If that's not your argument, what is?

The standard LW argument is that cryonics has a non-tiny probability of success. I did my own estimate, and roughly speaking, P(success) is at least P(A)P(B|A)P(C|A,B)P(D|A,B,C), where

  • A = materialism, i.e. preserving the brain or just the information of the brain-state is enough to preserve the mind
  • B = the freezing process preserves the information of the brain-state in such a way that plausible future tech can recover it
  • C = future society develops the tech to actually recover the info from cryopreserved brains without extravagant energy costs
  • D = my cryonics provider keeps me frozen for the entire time, and someone in the future sees fit to revive me

And my honest estimates were, roughly, P(A) > .95, P(B|A) > .8, P(C|A,B) > .3, and P(D|A,B,C) > .2, giving an overall lower-bound estimate of about 5% (with a lot of meta-uncertainty, obviously). I then tried to estimate how much waking up in the future would really be worth to me, in terms of my current values, compared to the value of money now, and concluded that signing up was worth it for me (though not overwhelmingly so; had it cost 10 times the actual $20 a month for insurance and dues, I'd have waited until I was significantly richer).

And the point of the post is that 5% is not a Pascal's Wager-esque tiny probability, but something on the level of health risks that we do in fact take seriously.
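The conjunctive lower bound above can be checked in a few lines of Python (a minimal sketch; the point estimates are orthonormal's own stated lower bounds):

```python
# orthonormal's stated lower-bound estimates for each conditional step
p_a = 0.95  # A: materialism -- preserving the brain-state information preserves the mind
p_b = 0.80  # B|A: freezing preserves the brain-state information recoverably
p_c = 0.30  # C|A,B: future tech recovers the info without extravagant energy costs
p_d = 0.20  # D|A,B,C: provider keeps you frozen, and someone chooses to revive you

# P(success) >= P(A) * P(B|A) * P(C|A,B) * P(D|A,B,C)
p_success = p_a * p_b * p_c * p_d
print(f"Lower bound on P(success): {p_success:.3f}")  # prints 0.046, i.e. roughly 5%
```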

(You're welcome to take exception to my estimates, of course, but the main point is that I signed up for non-Pascalian reasons. There's actually more to consider, including the effect of MWI and anthropic reasoning on C and D, and consideration about the motivations that someone might revive cryopreserved people, but this is a good enough sketch for current purposes.)

Comment author: Eliezer_Yudkowsky 18 April 2013 05:09:18PM 5 points

Expanding conjunctive probabilities without expanding disjunctive probabilities is another classic form of one-sided rationality. If I wanted to make cryonics look more probable than this, I would individually list out many different things that could go right.

Comment author: [deleted] 20 April 2013 09:15:31AM 3 points

Can you give a few examples?

Comment author: orthonormal 18 April 2013 09:01:45PM 2 points

For the purpose of establishing that it's not a Pascalian probability, it suffices to talk about a lower bound on the main line of reasoning.

Ah, I see that I said "estimate" instead of "lower bound" in the critical place. I'll edit.

Comment author: Luke_A_Somers 18 April 2013 09:28:14PM 1 point

In this case, I'm not seeing the disjunctive possibilities that would lead one to sign up for cryo. A, B, C, and D seem to be phrased pretty broadly, and A and B in particular are already pretty high. Do you mean as an alternate to D that, say, a new cryo provider takes over the abandoned preserved heads before they thaw? Or as an alternate to C, that even though the cost is high, they go ahead and do it anyway?

Beyond that, I only see scenarios that are nice but don't point one toward cryopreservation. For instance, time-travel scans of dying people, meaning no one ever really died, would be wonderful, but under that scenario getting cryopreserved only did good in that your family wouldn't be QUITE as sad you were gone in the time before they 'died'.

Comment author: Eliezer_Yudkowsky 18 April 2013 10:24:15PM 2 points

Do you mean as an alternate to D that, say, a new cryo provider takes over the abandoned preserved heads before they thaw?

Sure. That has already happened once in history (though there was, even earlier, a loss of patients to thawing). It's why all modern cryo organizations are very strict about demanding advance payment, despite their compassionate hearts screaming at them not to let their friends die over mere money. It sucks to be them, but they have no choice.

Or as an alternate to C, that even though the cost is high, they go ahead and do it anyway?

Yep. I'd think FAI scenarios would tend to yield that.

Basically I always sigh sadly when somebody's discussing a future possibility and they throw up some random conjunction of conditional probabilities, many steps of which are actually pretty darned high when I look at them, with no corresponding disjunctions listed. This is the sort of thinking that would've led Fermi to cleverly assign a probability way lower than 10% to having an impact, by the time he was done expanding all the clever steps of the form "And then we can actually persuade the military to pay attention to us..." If you're going to be silly about driving down all impact probabilities to something small via this sort of conjunctive cleverness, you'd better also be silly and multiply the resulting small probability by a large payoff, so you won't actually ignore all possible important issues.
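To illustrate the asymmetry being described, here is a hedged sketch: if step D can succeed through more than one route (say, the original provider surviving, or a successor organization rescuing the patients, as happened historically), that step is a disjunction, and the conjunctive-only figure understates it. The rescue-route probability below is hypothetical, chosen only to show the direction of the effect, and the routes are assumed independent for simplicity:

```python
# Conjunctive-only lower bound, using orthonormal's point estimates
p_a, p_b, p_c, p_d = 0.95, 0.80, 0.30, 0.20
conjunctive_only = p_a * p_b * p_c * p_d

# Hypothetical alternate route for step D: a successor organization
# takes over the patients if the original provider fails.
p_d_rescue = 0.10  # hypothetical number, for illustration only

# Disjunction of the two routes, assuming independence:
# P(D or rescue) = 1 - P(not D) * P(not rescue)
p_d_either = 1 - (1 - p_d) * (1 - p_d_rescue)  # 0.28, up from 0.20

with_disjunction = p_a * p_b * p_c * p_d_either
print(f"{conjunctive_only:.3f} -> {with_disjunction:.3f}")  # prints 0.046 -> 0.064
```

The point is structural, not numerical: each conjunctive step pushes the estimate down, while every neglected disjunctive route would push it back up.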

Comment author: CarlShulman 18 April 2013 11:17:03PM 1 point

"And then we can actually persuade the military to pay attention to us..."

The government did sit on it for quite a while, delaying the bomb until after the defeat of Germany. Nudges from Britain were important in getting things moving.

Comment author: SilasBarta 19 April 2013 12:18:58PM 1 point

The military "paid attention to them" long before that though.