Technologos comments on That Magical Click - Less Wrong
I am puzzled by Eliezer's confidence in the rationality of signing up for cryonics, given that he thinks doing so would be characteristic of a "GODDAMNED SANE CIVILIZATION". I am even more puzzled by the commenters' overwhelming agreement with Eliezer. I am personally uncomfortable with cryonics for the following reasons, and am surprised that no one seems to bring them up. Suppose that I am on life support and face two options:
(a) Have my life support system turned off and die peacefully.
(b) Keep the life support system going, but thereby give up all autonomy over my life and body, placing it entirely in the hands of others who are likely not even my immediate kin. I could be made to endure immense suffering, whether through technical glitches (very likely, since this is such a nascent area) or through willful malevolence. In that case I would very likely choose (a).
Note that in addition to prolonged suffering during which I am effectively incapable of pulling the plug on myself, there is also the chance that future generations would regard me as an oddity. Perhaps I would be made a circus or museum exhibit for their entertainment. Our race is highly speciesist, and I would not trust future generations, with their bionic implants and so on, to even consider me a member of the same species, let alone offer me the same rights and moral consideration.
Last but not least is a point I made in a comment responding to Robin Hanson's post. Hanson expressed a preference for a world filled with more people and scarce per-capita resources over a world with fewer people and significantly better living conditions. His point was that the former gives many people who would otherwise never have existed the opportunity to "be born", and that this is somehow a good thing. I suspect that Eliezer holds a similar opinion, and this is probably another place where we differ widely.
I couldn't care less if I weren't born. As the saying goes, I have been dead/nonexistent for billions of years and haven't suffered the slightest inconvenience. I see cryonics followed by a successful recovery as no different from dying and being re-born. Thus I assign virtually zero positives to being re-born, while I assign huge negatives to the risks of suffering and loss of autonomy described above.
We are evolutionarily driven to dislike dying and to postpone it for as long as possible. However, I don't think we are particularly hardwired to prefer this weird form of cryonic rebirth over never waking up at all. Given that our general preference not to die is nothing fundamental, but rather a case of following our evolutionary leanings, what makes it so obvious that cryonic rebirth is a good thing? Some form of longevity research that extends our lifespan to, say, 200 years, without the cryonic route and all the above risks (especially for the first few generations of cryonic guinea pigs), seems much harder to argue against.
Unfortunately, all the discussion on this forum, including Eliezer's writings, seems to draw absolutely no distinction between two scenarios:
A. Signing up for cryonics now, with all the associated risks/benefits that I just discussed.
B. Some form of upfront payment, made when you are 30, for experimental longevity research. If the research succeeds and is tested safe, you can use the drugs for free and live to be 200. If not, you live out your regular lifespan and merely forfeit the money you paid to sponsor the research.
I can readily see myself choosing (B) if the rates were affordable and the probability of success seemed high enough to justify them. I find it astounding that this blog repeatedly offers shallow arguments that treat scenario (A) as though it were identical to scenario (B).
Could you supply a (rough) probability derivation for your concerns about dystopian futures?
I suspect the reason people aren't bringing those possibilities up is that, through a variety of elements (in particular the standard Less Wrong understanding of FAI derived from the Sequences), LWers assign a fairly high conditional probability Pr(Life after cryo will be fun | anybody can and bothers to nanotechnologically reconstruct my brain), along with at least a modest probability that the condition itself actually occurs.
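The decomposition being gestured at can be made concrete, using entirely made-up numbers purely for illustration (neither commenter supplies actual estimates): the probability of each outcome is the product of the probability of revival and the relevant conditional probability.

```python
# Toy illustration with hypothetical numbers (not anyone's actual estimates):
# P(outcome) = P(revived) * P(outcome quality | revived)
p_revived = 0.05            # hypothetical: someone can and bothers to reconstruct you
p_fun_given_revived = 0.9   # hypothetical: conditional on revival, life is good

p_good = p_revived * p_fun_given_revived        # revived into a good future
p_bad = p_revived * (1 - p_fun_given_revived)   # revived into a dystopian future

print(f"P(good outcome)     = {p_good:.3f}")   # 0.045
print(f"P(revived but bad)  = {p_bad:.3f}")    # 0.005
```

On these (arbitrary) numbers, a dystopian revival is possible but dominated by the good-revival branch, which is the shape of belief the reply attributes to LWers; someone who instead sets p_fun_given_revived low recovers the original commenter's position.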