Cryonics fills many with disgust, a cognitively dangerous emotion. To test whether a few of your possible objections to cryonics are based on reason or on disgust, I list six non-cryonics questions. Answering yes to any one question indicates that rationally you shouldn’t have the corresponding cryonics objections.
1. You have a disease and will soon die unless you get an operation. With the operation you have a non-trivial but far from certain chance of living a long, healthy life. By some crazy coincidence, the operation costs exactly as much as cryonics does, and the only hospitals capable of performing it are next to cryonics facilities. Do you get the operation?
Answering yes to (1) means you shouldn’t object to cryonics because of costs or logistics.
2. You have the same disease as in (1), but now the operation costs far more than you could ever afford. Fortunately, you have exactly the right qualifications NASA is looking for in a spaceship commander. NASA will pay for the operation if, in return, you captain the ship should you survive. The ship will travel close to the speed of light. The trip will subjectively take you a year, but when you return one hundred years will have passed on Earth. Do you get the operation?
Answering yes to (2) means you shouldn't object to cryonics because of the possibility of waking up in the far future.
3. Were you alive 20 years ago?
Answering yes to (3) means you have a relatively loose definition of what constitutes “you” and so you shouldn’t object to cryonics because you fear that the thing that would be revived wouldn’t be you.
4. Do you believe that there is a reasonable chance that a friendly singularity will occur this century?
Answering yes to (4) means you should think it possible that someone cryogenically preserved would be revived this century. A friendly singularity would likely produce an AI that in one second could think all the thoughts that would take a billion scientists a billion years to contemplate. Given that bacteria seem to have mastered nanotechnology, it’s hard to imagine that a billion scientists working for a billion years wouldn’t have a reasonable chance of mastering the nanotechnology needed to repair and revive a preserved brain. Also, a friendly post-singularity AI would likely have enough respect for human life to be willing to perform the revival.
5. You somehow know that a singularity-causing intelligence explosion will occur tomorrow. You also know that the building you are currently in is on fire. You pull an alarm and observe everyone else safely leaving the building. You realize that if you don’t leave you will fall unconscious, painlessly die, and have your brain incinerated. Do you leave the building?
Answering yes to (5) means you probably shouldn’t abstain from cryonics because you fear being revived and then tortured.
6. One minute from now a man pushes you to the ground, pulls out a long sword, presses the sword’s tip to your throat, and pledges to kill you. You have one small chance at survival: grab the sword’s sharp blade, thrust it away and then run. But even with your best efforts you will still probably die. Do you fight against death?
Answering yes to (6) means you can’t pretend that you don’t value your life enough to sign up for cryonics.
If you answered yes to all six questions but have not signed up for cryonics and do not intend to, please give your reasons in the comments. What other questions can you think of that provide a non-cryonics way of getting at cryonics objections?
I object to (2). I'm not at all sure that I would take that job. If I did, it would be because the NASA guys got me interested in it (the NASA job, not the bit about returning to Earth in the far future) before I had to make a final decision. If they only told me what you said (or if the job sounded really boring and useless), then I wouldn't do it. Being cryogenically frozen isn't exactly boring, but it is useless.
And in light of that, I also object to cryonics on the basis of cost. Instead of
it would be better to say
If it were free and easy (and I knew that I was useless as an organ donor, which is an opportunity cost), then I might sign up on a whim, but high cost means that I won't. But this comes into play only after I decide that I don't want cryonics, on grounds analogous to (2).
I answer yes to (1,3,6). I'm a little worried about (5); I want to ask what else I know about this imminent singularity. But if it's just what you say in the question, then … yes. I haven't become too pessimistic about the singularity yet!
As for (4), I don't want to answer; one reason that I'm reading this site is to find out! So far, however, I'm leaning towards no, but I also don't think that it matters very much; who cares how long it takes? Except that this affects (2): if I believed that a friendly singularity was likely this decade, then (2) could be rewritten to refer to a decade-long trip, and then I lean towards yes! (The point is that people I know would still be alive and remember me.)
Thanks for an interesting set of questions.