Some of these questions, like the one about running away from a fire, ignore the role of irrational motivation.
People confronted with an immediate threat to their lives gain a strong desire to protect themselves. This has nothing to do with a rational evaluation of whether death is better than life. Even people who genuinely want to commit suicide have this problem, which is one reason so many of them try methods that are less effective but don't activate the self-defense system (like overdosing on pills instead of shooting themselves in the head). Perhaps even a suicidal person who'd entered the burning building because they planned to jump off the roof would still try to run out of the fire. So running away from a fire, or trying to stop a man threatening you with a sword, cannot be taken as proof of a genuine desire to live -- only that any desire to die one might have is not as strong as one's self-protection instincts.
It is normal for people to have different motivations in different situations. When I see and smell pizza, I get a strong desire to eat the pizza; right now, not seeing or smelling pizza, I have no particular desire to eat pizza. The argument "If yo...
I suppose it serves as a vote of less than infinite confidence. I don't know if it makes me any less confident than SIAI themselves. It's still worth helping SIAI in any way possible, but they've never claimed a 100% chance of victory.
But I would like my nature changed in some ways. If an AI does that for me, does that make it unFriendly?
Unpack, please?
Sure.
The body's natural pheromones, for example, are an ordinary part of everyday human interaction, but date-rape drugs are rightly considered beyond the pale.
Humans are ridiculously easy to hack. See the AI box experiment, see Cialdini's 'Influence', and see the way humans are so predictably influenced in the mating dance. We don't object to people influencing us with pheromones. We don't complain when people work out at the gym before interacting with us, something that produces rather profound changes in perception (try it!). When it comes to influence of the kind that facilitates mating, most of these things are actually encouraged. People like being seduced.
But these vulnerabilities are exquisitely calibrated to be exploitable by a certain type of person and a certain kind of hard-to-fake behaviour. Anything that changes the game to even the playing field will be perceived as a huge violation. In the case of date-rape drugs, of course, it is a huge violation. But it is clear that our objection to the influence represented by date-rape drugs is not an objection to the influence itself, but to the details of what kind of influence it is, how it is done, and by whom.
As Pavitra said, there is not a clear dividing line here.
From what I see, your questions completely ignore the crucial problem of weirdness signaling. Your question (1) should also assume that these hospitals are perceived by the general population, as well as the overwhelming majority of scientists and intellectuals, as a weird crazy cult that gives off a distinctly odd, creepy, and immoral vibe -- and by accepting the treatment, you also subscribe to a lifelong affiliation with this cult, with all its negative consequences for your relations with people. (Hopefully unnecessary disclaimer for careless readers: I am not arguing that this perception is accurate, but merely that it is an accurate description of the views presently held by people.)
As for question (3), the trouble with such arguments is that they work the other way around too. If you claim that the future "me" 20 years from now doesn't have any more special claim to my identity than whatever comes out of cryonics in more distant future, this can be used to argue that I should start identifying with the latter -- but it can also be used to argue that I should stop identifying with the former, and simply stop caring about what happens to "me" 20 years, or ...
(7) If you have a fatal disease that can only be cured by wearing a bracelet or necklace under your clothing, and anyone who receives an honest explanation of what the item is will think you're weird, do you wear the bracelet or necklace?
Answering yes to (7) means that you shouldn't refrain from cryonics for fear of being thought weird.
Heh -- that actually doubles as an explanation to people who ask:
"I'm wearing this necklace because I have a fatal disease that can only be cured by wearing it, and even then it only has a small chance of working."
--Oh no! I'm so sorry! What's the disease?
"Mortality."
I've heard stories like that, except replace 'cryonics' with 'organ donation', and 'this terrible and obscene thing' refers to destroying the sanctity of a dead body rather than preserving the entire body cryonically. In Australia at least, the family's wishes win out over those of the deceased.
I'm often presented with a "the cycling of the generations is crucial. Without it progress would slow, the environment would be over-stressed, and there would be far fewer jobs for new young people" argument. I reply with question 8.
I'm willing to answer yes to 1-6 and to Eliezer's 7, but I am not signed up and have no immediate plans to do so. I may well consider it if the relevant circumstances change, which are:
1. I live in the UK where no cryonics company yet operates. I would have to move myself and my career to the US to have any chance of a successful deanimation. The non-cryonic scenario would be:
8. You suffer from a disease that will slowly kill you in thirty years, maybe sooner. There is a treatment that has a 10% chance of greatly extending that, but you would have to spend the rest of your life within reach of one of the very few facilities where it is available. These are all in other countries, where you would have to emigrate and find new employment to support yourself for at least the rest of your expected time.
And I really would not give a whole-hearted yes to that.
2. I am too old to finance it with insurance: I would have to pay for it directly, as I do with everything else. I probably can, but this actually makes it easier to put off -- no pressure to buy now while it's cheap.
What I am moved to do about cryonics is ask where I should be looking to keep informed about the current state and availability of the art. Is there a good source of cryonics news? At this point I'm not interested in arguments about whether not dying is a good thing, fears of waking up in the far future, or philosophising about bodily resuscitation vs. scan-and-upload. Just present-day practicalities.
If you answered yes to all six questions and have not and do not intend to sign up for cryonics please give your reasons in the comments.
Yes, these answers are somewhat flip. But ...
I can easily imagine someone rational sig...
Economies of scale mean that increasing numbers of cryonics users lower costs and improve revival chances. I would class this with disease activism, e.g. patients (and families of patients) with a particular cancer collectively organizing to fund and assist research into their disease. It's not a radically impartial altruist motivation, but it is a moral response to a coordination/collective action problem.
1 should be more like: You have an illness that will kill you sometime in the next 50 years unless you have an operation right when you die, but not too late. The clinics that can perform this operation are so far away that your chances of reaching a facility in time are negligible. Do you sign up for the operation?
Edit: The correct choice, of course, is to move nearer to the clinics in about 20 to 30 years.
Edit2: Also, there is a chance that with some more research over the next couple of years, a method could be developed that might not cure you but would vastly lengthen the time until you die, with a much greater chance of success than the operation has. Do you pay for the operation or fund that research?
I haven't signed up yet because at my age (31) my annual chance of unexpected death is low in comparison to my level of uncertainty about the different options, especially with whole brain plasticization possibly becoming viable in the near future (which would be much cheaper and probably have a higher future success rate).
A little nit-picky, but:
A friendly singularity would likely produce an AI that in one second could think all the thoughts that would take a billion scientists a billion years to contemplate.
Without a source these figures seem to imply a precision that you don't back up. Are you really so confident that an AI of this level of intelligence will exist? I feel your point would be stronger by removing the implied precision. Perhaps:
A friendly singularity would likely produce a superintelligence capable of mastering nanotechnology.
(Responding to old post)
This is ridiculous. Each objection makes the deal less good; several objections combined together may make it bad enough that you should turn down the deal. Just because each objection by itself isn't enough to break the deal doesn't mean that they can't be bad enough cumulatively.
I might read a 40 chapter book with a boring first chapter. Or with a boring second chapter. Or with a boring third chapter, etc. But I would not want to read a book which contains 40 boring chapters.
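The arithmetic behind this is worth making explicit: even when each objection alone leaves a deal worth taking, their combined weight can flip the sign. A minimal sketch, with invented numbers purely for illustration:

```python
# Toy model: a deal's net value is its base benefit minus the summed
# costs of each objection. Numbers are invented for illustration.
base_benefit = 10.0
objection_costs = [3.0, 3.0, 3.0, 3.0]

# Each objection considered alone: the deal stays positive.
for cost in objection_costs:
    assert base_benefit - cost > 0

# All objections considered together: the deal turns negative.
net = base_benefit - sum(objection_costs)
print(net)  # -2.0
```

The same structure applies whether the costs are money, risk, or disutility: "no single objection is decisive" is consistent with "the objections are jointly decisive".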
This is especially so in the case of objections 1 a...
I'd dispute the claimed equivalence between several of these questions and cryonics (particularly the first), and I'd also take issue with some of the premises, but I'd answer yes to all of them, with caveats. Still, I'm not signed up for cryonics, nor do I intend to sign up in the near future.
The reason I have no immediate plans to sign up is that I think there are relatively few scenarios where signing up now is a better choice than deferring a decision until later. I am currently healthy but if diagnosed with a terminal illness I could sign up then if it seemed like th...
I am currently healthy but if diagnosed with a terminal illness I could sign up then if it seemed like the best use of resources at the time.
Life insurance is a lot easier to get when you are healthy and not diagnosed with a terminal illness.
Quick Note: I found it mildly distracting that the explanations (which all started with 'Answering' as the first word) were right under each question. I kept finding myself tempted to read the 'answers' first. I'd personally prefer all the explanations at the end.
Answering yes to [“Were you alive 20 years ago?”] means you have a relatively loose definition of what constitutes “you” and so you shouldn’t object to cryonics because you fear that the thing that would be revived wouldn’t be you.
Not necessarily. My definition of “me” may depend on the context. If someone asks me that question, I assume that by “you” they mean ‘a human with your DNA who has since grown into present-you’, regardless of how much or how little I identify with him.
Twenty years ago, I was eight years old. I think that I can honestly say that if you somehow replaced me with my eight-year-old self, it would be the same as killing me. (To a great extent, I'm still mostly the same person I was at fourteen. I'm not at all the person I was at eight.)
Even if someone answers yes to all six questions, they could still rationally not sign up for cryonics. Aside from issues like weirdness signaling, they might see no specific one of the six issues raised as a sufficient objection on its own, yet consider all of them together to be enough. Thus, for example, one might combine 1 and 2, where the payoffs of both issues together (being sent into a possibly unpleasant future and having to pay a lot for an operation) add up to enough of a concern even if neither does by itself. It seems unlikely ...
I object to (2). I'm not at all sure that I would take that job. If I did, it would be because the NASA guys got me interested in it (the NASA job, not the bit about returning to Earth in the far future) before I had to make a final decision. If they only tell me what you said (or if the job sounds really boring and useless), then I wouldn't do it. Being cryogenically frozen isn't exactly boring, but it is useless.
And in light of that, I also object to cryonics on the basis of cost. Instead of
...Answering yes to (1) means you shouldn’t object to cryoni
My answer to (3) is "no" for rather trivial reasons, as my state 20 years ago is most comparable to someone who died and was not a cryonics patient: the thing that existed and was most similar to "me" was the DNA of people who are related to me. I don't count that as "alive", and I doubt that most people would.
Ask me (3) in the future, and I will probably have a different answer. (Wait until I'm 24, though, because I don't really identify so well with infants.)
What is "non-trivial but far from certain"? If the operation's chances were as low as my estimate of cryonics' chances, I wouldn't bother, so "no". With a high enough chance, "yes".
Maybe. I don't really trust my ability to place myself in such hypothetical scenarios and I expect my answer to result more from framing effects than anything else.
Sort of.
Definitely not.
Framing effects etc. I don't think I can reason about this clearly enough.
Definitely yes.
So there's one yes. It shouldn't surprise you that I consider cryonics waste ...
The article assumes that people make such decisions rationally, which is just not the case. If you ask someone 'which argument or fact could possibly convince you to sign up, or, let's say, at least treat the cryo option favorably?', you do not get a well-reasoned argument about the chances of it working or about personal preferences, but rather more counterarguments. Throwing more logic at the problem does not help! If you find a magic argument that suddenly convinces someone who is not convinced yet -- or makes the signing process more immediate than planned -- then you have probably learned something useful about human nature that can be applied in other areas as well.
Shouldn't you be asking things like, So you're pro-cryonics. Why would you change your mind?
Answering yes to (2) means you shouldn't object to cryonics because of the possibility of waking up in the far future.
An astronaut after coming back to Earth would likely have much higher social status than a cryonic patient after being revived.
Answering yes to (1) means you shouldn’t object to cryonics because of costs or logistics.
The fact that I'm willing to spend $X in order to die at 75 rather than at 25 doesn't necessarily imply that I must be willing to spend $X to die at [large number] rather than at 75.
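One way to make this concrete is with a bounded (diminishing) utility of lifespan. The functional form and numbers below are invented solely to illustrate the shape of the argument, not to model anyone's actual values:

```python
import math

# Toy model: utility of living to a given age saturates,
# u(age) = 1 - exp(-age / 50), so each extra year is worth less
# than the last. Under such a curve, a price worth paying to reach
# 75 instead of 25 need not be worth paying to reach 1000 instead of 75.
def u(age):
    return 1 - math.exp(-age / 50)

gain_young = u(75) - u(25)      # value of dying at 75 rather than 25
gain_old = u(1000) - u(75)      # value of dying at 1000 rather than 75

print(round(gain_young, 3))  # 0.383
print(round(gain_old, 3))    # 0.223
```

With a saturating curve the first gain exceeds the second, so willingness to pay $X for the first does not imply willingness to pay $X for the second; with a non-saturating curve the implication could go through, which is exactly what the comment is disputing.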
I say yes to 2, 5, and 6. I'd personally prefer not to be tortured or wake up in a future where humans may have been wiped out by another sentient race (I doubt it).
But what if I don't sign up for cryonics because I simply don't want to live in another time, without my friends, my family, the people I owe duties to...? What if I simply think it a dishonest way out? (I mean, I'm okay with cryopreserving other people, especially the terminally ill. I don't mind the weirdness either. But for myself, no: I have a life; why would I decide to give it up?)
Some of your analogies strike me as quite strained:
(1) I wouldn't call the probability of being revived post near-future cryogenic freezing "non-trivial but far from certain", I would call it "vanishingly small, if not zero". If sick and dying and offered a surgery as likely to work as I think cryonics is, I might well reject it in favor of more conventional death-related activities.
(3) My past self has the same relation to me as a far-future simulation of my mind reconstructed from scans of my brain-sicle? Could be, but that's far fr...
Cryonics fills many with disgust, a cognitively dangerous emotion. To test whether a few of your possible cryonics objections are reason or disgust based, I list six non-cryonics questions. Answering yes to any one question indicates that rationally you shouldn’t have the corresponding cryonics objections.
1. You have a disease and will soon die unless you get an operation. With the operation you have a non-trivial but far from certain chance of living a long, healthy life. By some crazy coincidence the operation costs exactly as much as cryonics does and the only hospitals capable of performing the operation are next to cryonics facilities. Do you get the operation?
Answering yes to (1) means you shouldn’t object to cryonics because of costs or logistics.
2. You have the same disease as in (1), but now the operation costs far more than you could ever obtain. Fortunately, you have exactly the right qualifications NASA is looking for in a space ship commander. NASA will pay for the operation if in return you captain the ship should you survive the operation. The ship will travel close to the speed of light. The trip will subjectively take you a year, but when you return one hundred years will have passed on Earth. Do you get the operation?
Answering yes to (2) means you shouldn't object to cryonics because of the possibility of waking up in the far future.
3. Were you alive 20 years ago?
Answering yes to (3) means you have a relatively loose definition of what constitutes “you” and so you shouldn’t object to cryonics because you fear that the thing that would be revived wouldn’t be you.
4. Do you believe that there is a reasonable chance that a friendly singularity will occur this century?
Answering yes to (4) means you should think it possible that someone cryogenically preserved would be revived this century. A friendly singularity would likely produce an AI that in one second could think all the thoughts that would take a billion scientists a billion years to contemplate. Given that bacteria seem to have mastered nanotechnology, it’s hard to imagine that a billion scientists working for a billion years wouldn’t have a reasonable chance of mastering it. Also, a friendly post-singularity AI would likely have enough respect for human life that it would be willing to revive cryonics patients.
5. You somehow know that a singularity-causing intelligence explosion will occur tomorrow. You also know that the building you are currently in is on fire. You pull an alarm and observe everyone else safely leaving the building. You realize that if you don’t leave you will fall unconscious, painlessly die, and have your brain incinerated. Do you leave the building?
Answering yes to (5) means you probably shouldn’t abstain from cryonics because you fear being revived and then tortured.
6. One minute from now a man pushes you to the ground, pulls out a long sword, presses the sword’s tip to your throat, and pledges to kill you. You have one small chance at survival: grab the sword’s sharp blade, thrust it away and then run. But even with your best efforts you will still probably die. Do you fight against death?
Answering yes to (6) means you can’t pretend that you don’t value your life enough to sign up for cryonics.
If you answered yes to all six questions and have not and do not intend to sign up for cryonics please give your reasons in the comments. What other questions can you think of that provide a non-cryonics way of getting at cryonics objections?