Ishaan comments on I Will Pay $500 To Anyone Who Can Convince Me To Cancel My Cryonics Subscription - Less Wrong

33 Post author: ChrisHallquist 11 January 2014 10:39AM


Comment author: Ishaan 11 January 2014 08:56:56PM -1 points

This post inspired me to do this calculation quickly. I did not know what the answer would be when I started. The result could push you in either direction, depending on your balance of self-interest versus altruism and on your probability estimate.

Cost of neuro-suspension cryonics > $20,000

Cost of saving a single life via effective altruism, with high certainty < $5,000

Let's say you value a good outcome with a mostly-immortal life at X strangers' regular-span lives.

Let "C" represent the threshold of certainty that signing up for cryonics causes that good outcome.

C*X / $20,000 > 1 / $5,000

C > 4/X

Conclusion: with estimates biased towards the cryonics side of the equation, in order to sign up, your minimum certainty that it will work as expected must be four divided by the number of strangers you would sacrifice your immortality for.

If you value immortality at the cost of 4 strangers, you should sign up for cryonics instead of E.A. only if you are 100% certain it will work.

If you value immortality at the cost of 400 strangers, you should sign up for cryonics instead of E.A. only if you are more than 1% certain it will work.

(Really, what is happening here is that at the cost of 4 strangers you are taking a gamble on a 1% chance, but it amounts to the same thing if you shut up and multiply.)
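The break-even calculation above can be sketched in a few lines. This is only an illustration using the comment's own numbers ($20,000 and $5,000 as bounds); the function name is mine.

```python
# Break-even certainty for cryonics vs. effective altruism,
# using the bounds from the comment (illustrative numbers only).

CRYONICS_COST = 20_000     # lower bound on neuro-suspension cost, $
EA_COST_PER_LIFE = 5_000   # upper bound on cost to save one life via EA, $

def certainty_threshold(x: float) -> float:
    """Minimum certainty C that cryonics works, given you value a
    mostly-immortal life at x strangers' regular-span lives.
    Derived from C*x / CRYONICS_COST > 1 / EA_COST_PER_LIFE."""
    return CRYONICS_COST / (EA_COST_PER_LIFE * x)

print(certainty_threshold(4))    # 1.0  -> need 100% certainty
print(certainty_threshold(400))  # 0.01 -> need 1% certainty
```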

The numbers for whole-body suspension will be rather different.

Comment author: solipsist 11 January 2014 09:38:58PM 4 points

This sort of utilitarian calculation should be done with something like QALYs, not lives. If the best charities extend life at $150 per QALY, and a $20,000 neuro-suspension extends life by a risk-adjusted 200 QALYs, then purchasing cryonics for yourself would be altruistically utilitarian.
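Working through the comment's illustrative figures, assuming for the sake of argument that the 200 risk-adjusted QALYs hold:

```python
# Cost-per-QALY comparison using the comment's illustrative numbers.

CHARITY_COST_PER_QALY = 150          # $ per QALY for the best charities (given)
CRYONICS_COST = 20_000               # $ for neuro-suspension (given)
CRYONICS_RISK_ADJUSTED_QALYS = 200   # assumed risk-adjusted QALYs gained

cryonics_cost_per_qaly = CRYONICS_COST / CRYONICS_RISK_ADJUSTED_QALYS
print(cryonics_cost_per_qaly)  # 100.0, cheaper per QALY than the $150 charity
```

At these numbers cryonics buys QALYs at $100 each versus $150 for the best charities, which is the comment's point.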

Comment author: jkaufman 12 January 2014 06:22:12AM 2 points

These calculations get really messy, because the future civilization reviving you as an upload is unlikely to have its population limited by the number of frozen people available to scan. Instead it will probably run as many people as it has the resources (or work) for, and if it decides to run you, that's instead of someone else. There are probably no altruistic QALYs in preserving someone for this future.

Comment author: solipsist 15 January 2014 02:42:36AM 0 points

This reply made me really think, and prompted me to ask this question.

Comment author: Ishaan 11 January 2014 09:53:39PM 1 point

True, but that's much harder to estimate (it requires real-world QALY data) and involves more uncertainty (how many QALYs to expect after revival?), and I didn't want that much work, just a quick estimate.

However, I'm guessing someone else has done this properly at some point?

Comment author: solipsist 11 January 2014 11:15:13PM 1 point

"However, I'm guessing someone else has done this properly at some point?"

Note: I have not, so do not use my 200 QALYs as an anchor.

Comment author: somervta 12 January 2014 02:22:00AM -1 points

<sarcasm>

Yes. Because instructing people to avoid anchoring effects works.

</sarcasm>