Will_Newsome comments on Abnormal Cryonics - Less Wrong

56 Post author: Will_Newsome 26 May 2010 07:43AM

Comment author: Will_Newsome 26 May 2010 12:13:32PM *  3 points [-]

Good point: mainstream cryonics would be a big step towards raising the sanity waterline, which may end up being a prerequisite to reducing various kinds of existential risk. However, I think the causal relationship goes the other way: raising the sanity waterline comes first, and cryonics second. If you can get the average person across the inferential distance to seeing cryonics as reasonable, you can most likely get them across the inferential distance to seeing existential risk as really flippin' important. (I should take the advice of my own post here and note that I am sure there are really strong arguments against the idea that working to reduce existential risk is important, or at least against having much certainty that reducing existential risk will have been the correct thing to do upon reflection, at the very least on a personal level.) Nonetheless, I agree further analysis is necessary, though difficult.

Comment deleted 26 May 2010 01:15:58PM *  [-]
Comment author: Will_Newsome 26 May 2010 01:41:34PM *  4 points [-]

Your original point was that "getting cryo to go mainstream would be a strong win as far as existential risk reduction is concerned (because then the public at large would have a reason to care about the future) and as far as rationality is concerned", in which case your above comment is interesting but tangential to what we were discussing previously. I agree that getting people to sign up for cryonics will almost assuredly get more people to sign up for cryonics (barring legal issues becoming more salient, and thus potentially more restrictive, as cryonics becomes more popular, or bad stories being publicized, whether true or false), but "because then the public at large would have a reason to care about the future" does not seem to be a strong reason to expect existential risk reduction as a result (one counterargument being the one raised by timtyler in this thread). You have to connect cryonics with existential risk reduction, and the key isn't futurism but strong epistemic rationality. Sure, you could also spark interest via memetics, but I don't think the most cost-effective way to do so would be investment in cryonics as opposed to, say, billboards proclaiming 'Existential risks are even more bad than marijuana: talk to your kids.' Again, my intuitions are totally uncertain on this point, but it seems to me that option (a), 10 million dollars -> cryonics investment -> increased awareness of futurism -> increased awareness of existential risk reduction, is most likely inferior to option (b), 10 million dollars -> any other memetic strategy -> increased awareness of existential risk reduction.

Comment deleted 26 May 2010 02:15:50PM *  [-]
Comment author: Will_Newsome 26 May 2010 02:25:17PM *  5 points [-]

And if you continue to spend more than $1 a day on food and luxuries, do you really value your life at less than one Hershey bar a day?

I think the correct question here is instead "Do you really value a very, very small chance that having signed up for cryonics leads to huge changes in your expected utility in some distant future, across unfathomable multiverses, more than an assured small amount of utility 30 minutes from now?" I do not think the answer is obvious, but I lean towards avoiding long-term commitments until I better understand the issues. Yes, a very, very, very tiny amount of me is dying every day due to freak kitchen accidents, but that much of my measure seems so negligible that I don't feel too horrible trading it off for more thinking time and half a Hershey's bar.

The reasons you gave for spending a dollar a day on cryonics seem perfectly reasonable and I have spent a considerable amount of time thinking about them. Nonetheless, I have yet to be convinced that I would want to sign up for cryonics as anything more than a credible signal of extreme rationality. From a purely intuitive standpoint this seems justified. I'm 18 years old and the singularity seems near. I have measure to burn.
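The tradeoff being debated here can be made concrete with a back-of-the-envelope expected-value comparison. A minimal sketch follows; every number in it is an illustrative assumption (the probability of revival, the dollar value placed on a life), not a figure from the thread:

```python
# Crude expected-value comparison: sign up for cryonics, or keep the ~$1/day?
# All inputs are illustrative assumptions chosen only to show the arithmetic.

cost_per_year = 365.0    # dollars; ~$1/day, the "Hershey bar a day" framing
p_revival = 1e-5         # assumed overall chance cryonics actually saves you
value_of_life = 1e7      # dollars; an assumed statistical value of a life

# Amortizing the one-time benefit over a year is itself a simplification.
expected_benefit = p_revival * value_of_life
print(f"expected benefit: ${expected_benefit:.2f}/yr vs. cost: ${cost_per_year:.2f}/yr")
```

With these particular inputs the expected benefit ($100/yr) falls short of the cost, but the conclusion flips entirely if `p_revival` is an order of magnitude higher, which is exactly why the probability estimate debated later in the thread does the real work.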

Comment deleted 26 May 2010 02:28:57PM [-]
Comment author: Will_Newsome 26 May 2010 02:40:30PM *  3 points [-]

Perhaps. I think a singularity is more likely to occur before I die (in most universes, anyway). With advancing life extension technology, good genes, and a disposition to be reasonably careful with my life, I plan on living pretty much indefinitely. I doubt cryonics has any effect at all on these universes for me personally. Beyond that, I do not have a strong sense of identity, and my preferences are not mostly about personal gain, and so universes where I do die do not seem horribly tragic, especially if I can write down a list of my values for future generations (or a future FAI) to consider and do with as they wish.

So basically... (far) less than a 1% chance of saving 'me', but even then, I don't have strong preferences for being saved. I think that the technologies are totally feasible and am less pessimistic than others that Alcor and CI will survive for the next few decades and do well. However, I think larger considerations like life extension technology, uFAI or FAI, MNT, bioweaponry, et cetera, simply render the cryopreservation / no cryopreservation question both difficult and insignificant for me personally. (Again, I'm 18, these arguments do not hold equally well for people who are older than me.)

Comment author: Airedale 26 May 2010 07:16:48PM 5 points [-]

a disposition to be reasonably careful with my life

When I read this, two images popped unbidden into my mind: 1) you wanting to walk over the not-that-stable log over the stream with the jagged rocks in it and 2) you wanting to climb out on the ledge at Benton House to get the ball. I suppose one person's "reasonably careful" is another person's "needlessly risky."

Comment author: Will_Newsome 27 May 2010 10:40:28PM 2 points [-]

This comment inspired me to draft a post about how much quantum measure is lost doing various things, so that people can more easily see whether or not a certain activity (like driving to the store for food once a week instead of having it delivered) is 'worth it'.
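The calculation such a post would need is essentially a micromort table: express each activity's fatality risk as a probability, then compare the measure lost per year. A minimal sketch, using placeholder risk figures that are assumptions for illustration rather than sourced estimates:

```python
# Sketch of a "measure lost per activity" comparison. One micromort is a
# one-in-a-million chance of death. The per-activity figures below are
# illustrative placeholders, not sourced estimates.

MICROMORT = 1e-6  # probability of death per micromort

# (activity, micromorts per occurrence, occurrences per year) -- assumed values
activities = [
    ("drive to the store for food", 0.5, 52),
    ("have the food delivered",     0.05, 52),
    ("walk across an unstable log", 1.0, 4),
]

for name, mm, times in activities:
    # Probability of dying at least once from this activity in a year.
    p_year = 1 - (1 - mm * MICROMORT) ** times
    print(f"{name}: ~{p_year:.2e} chance of death per year")
```

For risks this small the annual probability is well approximated by `micromorts × occurrences × 1e-6`, so the weekly drive in this sketch costs roughly 2.6e-5 of one's measure per year versus 2.6e-6 for delivery; whether that gap is 'worth it' then depends on the value placed on the time and money saved.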

Comment author: Will_Newsome 26 May 2010 07:47:07PM 1 point [-]

Ha, good times. :) But being careful with one's life and being careful with one's limbs are two very different things. I may be stupid, but I'm not stupid.

Comment author: Jonathan_Graehl 27 May 2010 02:59:58AM 2 points [-]

Unless you're wearing a helmet, moderate falls that 99+% of the time result in nothing worse than a few sprains or breaks may, <1% of the time, cause permanent brain damage (mostly I'm thinking of hard objects' edges striking the head). Maybe my estimate is skewed by fictional evidence.

Comment deleted 26 May 2010 02:49:36PM *  [-]
Comment author: Will_Newsome 26 May 2010 02:56:14PM 0 points [-]

Hm, thanks for making me really think about it and not letting me slide by without doing the calculation. Given my preferences, about which I am not logically omniscient, and given my structural uncertainty around these issues, of which there is much, I think my 50 percent confidence interval is between .00001% (1 in 10 million) and .01% (1 in 10,000).

Comment deleted 26 May 2010 02:58:46PM [-]
Comment author: Vladimir_Nesov 26 May 2010 05:16:47PM *  0 points [-]

It seems to me, given my preferences, about which I am not logically omniscient, [...]

I'd say your preferences can't possibly influence the probability of this event. To clear the air, can you explain how taking your preferences into account influences the estimate? Better, how does the estimate break down across the different defeaters (events making the positive outcome impossible)?

Comment author: kpreid 26 May 2010 09:13:33PM 0 points [-]

I have measure to burn.

I like this turn of phrase.