Cryonics fills many with disgust, a cognitively dangerous emotion. To test whether a few of your possible cryonics objections are reason-based or disgust-based, I list six non-cryonics questions. Answering yes to any one question indicates that, rationally, you shouldn’t have the corresponding cryonics objection.
1. You have a disease and will soon die unless you get an operation. With the operation you have a non-trivial but far from certain chance of living a long, healthy life. By some crazy coincidence, the operation costs exactly as much as cryonics does, and the only hospitals capable of performing the operation are next to cryonics facilities. Do you get the operation?
Answering yes to (1) means you shouldn’t object to cryonics because of costs or logistics.
2. You have the same disease as in (1), but now the operation costs far more than you could ever obtain. Fortunately, you have exactly the right qualifications NASA is looking for in a spaceship commander. NASA will pay for the operation if, in return, you captain the ship should you survive. The ship will travel close to the speed of light. The trip will subjectively take you a year, but when you return one hundred years will have passed on Earth. Do you get the operation?
Answering yes to (2) means you shouldn’t object to cryonics because of the possibility of waking up in the far future.
3. Were you alive 20 years ago?
Answering yes to (3) means you have a relatively loose definition of what constitutes “you” and so you shouldn’t object to cryonics because you fear that the thing that would be revived wouldn’t be you.
4. Do you believe that there is a reasonable chance that a friendly singularity will occur this century?
Answering yes to (4) means you should think it possible that someone cryogenically preserved would be revived this century. A friendly singularity would likely produce an AI that in one second could think all the thoughts that would take a billion scientists a billion years to contemplate. Given that bacteria seem to have mastered nanotechnology, it’s hard to imagine that a billion scientists working for a billion years wouldn’t have a reasonable chance of mastering it too, and molecular-scale repair is plausibly just what reviving a preserved brain would require. Also, a friendly post-singularity AI would likely have enough respect for human life to be willing to revive the preserved.
5. You somehow know that a singularity-causing intelligence explosion will occur tomorrow. You also know that the building you are currently in is on fire. You pull an alarm and observe everyone else safely leaving the building. You realize that if you don’t leave, you will fall unconscious, painlessly die, and have your brain incinerated. Do you leave the building?
Answering yes to (5) means you probably shouldn’t abstain from cryonics because you fear being revived and then tortured.
6. One minute from now, a man pushes you to the ground, pulls out a long sword, presses the sword’s tip to your throat, and pledges to kill you. You have one small chance of survival: grab the sword’s sharp blade, thrust it away, and run. But even with your best efforts you will still probably die. Do you fight against death?
Answering yes to (6) means you can’t pretend that you don’t value your life enough to sign up for cryonics.
If you answered yes to all six questions but have not signed up for cryonics and do not intend to, please give your reasons in the comments. What other questions can you think of that provide a non-cryonics way of getting at cryonics objections?
Just to keep things in context, my main point in posting was to demonstrate the unlikelihood of being awakened in a dystopia; critics seem to jump from point A to point B without a transition. While the Niven scenario you listed below seems agreeable to my position, it's actually still off: you are missing the key point about the chain of constant care, the infrastructure needed to continue cryonics care, and so on. This has nothing to do with a family reviving its ancestors: if someone - anyone - is taking the time and energy to keep refilling your dewar with LN2, then someone is there who wants to revive you. Think of coma patients; hospitals don't keep them around just to feed them and stare at their bodies.
Anyway, moving on to the "initiatives" comment. Given that LessWrong tends to overlap with SIAI supporters, perhaps I should have said "mission"? Again, I haven't looked too much into Yvain's history, but let's suppose for the moment that he's a strong supporter of that mission. Since we:
...I guess I was just wondering whether he thinks the outlook for the mission is grim. Signing up for cryonics seems to give a "glass half full" impression. Furthermore, because of #1 and #2 above, I'll eventually be arguing why mainstreaming cryonics could significantly assist in reducing existential risk, and why it may be helpful for everyone from the LessWrong community to the IEET to be a little more assertive on the issue. Of course, I'm not saying it would eliminate risk. But at the very least, mainstreaming cryonics should do more against existential risk than dealing with, say, measles ;)
To be honest, that did not clear anything up. I still don't know whether to interpret your original question as:
To be honest once again, I no longer care what you meant, because you have made it clear that you don't really care what the answer is. You have ...