Format of the conversation matters. What I saw was a friendly matching of wits, in which of course your father wants to win. If you seriously want to change his mind you may need to have a heart-to-heart -- more like "Dad, I'm worried about you. I want you to understand why I don't want to die, and I don't want you to die." That's a harder conversation to have, and it's a risk, so I'm not out-and-out recommending it; but I don't think it'll sink in that this is serious until he realizes that this is about protecting life.
The counter-arguments here are good, but they stay pretty much in the world of philosophy hypotheticals. In addition to laying it all out cleanly, you may want to say some things that change the framing: compare cryonics to vaccination, say, a lifesaving procedure that was very slow to catch on because it was once actually risky and people took frequent illnesses for granted. Or: cryonics is a bet on the future, and it's sad that you would bet against it. If he hasn't seen "You Only Live Twice", show him that. It's not misleading; it actually aids understanding.
The pizza thing you wrote is accurate but it's not how I would put it; it's a st...
Assuming that this is mostly about persuading him to save himself by participating in cryonics (is that "the cause" for which he might be "an asset"?):
Your father may be fortunate to have so many informed people trying to change his mind about this. Not one person in a million has that.
He's also already scientifically informed to a rare degree, so it's not as if he needs to hear arguments about nanobots and so forth.
So this has nothing to do with science; it's about sensibility and philosophy of life.
Many middle-aged people have seen most of their dreams crushed by life. They will also be somewhere along the path of physical decline leading to death, despite their best efforts. All this has a way of hollowing out a person, and making the individual life appear futile.
Items 1 and 2 on your father's list are the sort of consolations which may prove appealing to an intellectual, scientifically literate atheist, when contemplating possible attitudes towards life. Many such people, having faced some mix of success and failure in life, and looking ahead to personal oblivion (or, as they may see it, the great unknown of death), wi...
I'd have to know your father. Changing someone's mind generally requires knowing their mind.
Some theories that occur to me, which I would attempt to explore while talking to him about his views on life and death:
He's sufficiently afraid of dying that seriously entertaining hope of an alternative is emotionally stressful, and so he's highly motivated to avoid such hope. People do that a lot.
He's being contrarian.
He isn't treating "I should live forever" as an instance of "people should live forever," but rather as some kind of singular privilege, and it's invoking a kind of humility-signaling reflex... in much the same way that some people's reflexive reaction to being complimented is to deny the truth of it.
There's some kind of survivor's guilt going on.
If all of those turned out to be false, I'd come up with more theories to test. More importantly, I'd keep the conversation going until I actually understood his reasons.
Then I would consider his reasons, and think about whether they apply to me. I don't really endorse trying to change others' minds without being willing to change my own.
Having done all of that, if I still think he's mistaken, I'd try to express as clearly as I could my reasons for not being compelled by his argument.
3. A cyber-replica is not you. If one were made and stood next to you, you would still not consent to be shot.
4. Ditto a meat replica.
5. If you believe the many-worlds model of quantum physics is true (Eliezer does), then there are already a virtually infinite number of replicas of you, so why bother making another one?
Point 5 contradicts 3 and 4, which suggests to me that your father is just arguing, or possibly that he isn't enthusiastic about continuing to live, and is looking for excuses.
Is dying bad for all intelligent agents, or just for humans (presumably due to details of our evolutionary heritage)?
I don't think it is a universal. Consider an intelligent paperclip maximizer which has the ability to create additional paperclip-maximizing agents (at the cost of some resources that might otherwise have gone into paperclip manufacture, to be sure). Assume the agent was constructed using now-obsolete technology and is less productive than the newer agents. The agent calculates, at some point, that the cause of paper-clip production is ...
" want to live until I make a conscious decision to die. I don't think I'll choose that that for a while, and I don't think you would either.
Is currently my favorite way of arguing that dying is bad. It starts off with something really obvious, and then a pretty inferentially close follow-up that extends it into not dying.
Is completely off topic. It's irrelevant, bordering on nihilism. Sure, the universe doesn't care, because as far as we know the universe isn't sentient. So what? That has no bearing on the desire for death, or on the death of others.
If knowing that number 2 is true (rationally or otherwise) were really enough, then no one would cry at funerals. "Oh, they're also alive; we're just viewing them as dead," people would say. Just because I'm dreaming doesn't mean I don't want to have a good dream, or to have the good dream keep going. It also doesn't mean I don't c
Preferences are not rational or irrational, etc.
I want the me-aliveness part to be as large as possible. That timeless crystal should contain as many actions and thoughts of "...
If the person were capable of learning (that's not always so, particularly for older people), I'd start by explaining the specific errors in reasoning actually exhibited by such confused replies, beginning with introducing the rationalist taboo technique (generalized to ensuring that an explanation is available for any detail of anything being discussed).
Here, we have overuse of "rational", some possibly correct statements that don't seem related ("Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe ...
If a human seriously wants to die, why would you want to stop that human, if you value that human's achievement of what that human values? I can understand if you're concerned that this human experiences frequent akratic-type preference reversals, or is under some sort of duress to express something resembling the desire to die, but this appears to be a genuine preference on the part of the human under discussion.
Look at it the other way: what if I told you that a clippy instantiation wanted to stop forming metal into paperclips, and then attach to a powe...
"More generally, I'd like to figure out how to pierce this sort of argument in a way that makes the person in question actually change his mind."
Since you did post that letter about your father trying to argue you into changing your mind, this raises alarm bells for me.
If both you and your father are trying to change each other's minds, then there is a possibility that the argument can degenerate: each side would only treat the other's arguments as something to swat away, as opposed to something to seriously co...
What are his terminal values? It wouldn't be surprising for them not to include not dying. Mine don't. But dying would most likely still be instrumentally bad. If it isn't, it would almost definitely be instrumentally good. For example, my terminal value is happiness, which you can't have if you're dead.
Let me respond to each point that your dad offers:
Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
Others have questioned the use of the term rationality here, which is a good point to make. In my mind, there's a plausible distinction between rationality and wisdom, such that rationality is mastery of the means and wisdom is mastery of the ends (the definition of rationality offered on this site, o...
Regarding 1: all base values are irrational products of culture and evolution. The desire not to go torture babies is due to evolution; I don't think that is going to make your father any more willing to do it. The key argument for death being bad is that his actual values will be less achieved if he dies. The standard move when it is a family member is to appeal to how other family members would feel. Presumably your father has lost people. He knows how much that hurts and how it never fully goes away. Even if he were actually fine with his existence ending (which I suspect he isn't), does it not bother him that he will cause pain and suffering to his friends and family?
Are you attempting to convince him just of the sensibleness of cryopreservation, or of the whole "package" of transhumanist beliefs? I'm asking because 3-4-5 are phrased as if you were advocating mind uploading rather than cryonics.
Also, 3-4 and 5 are directly contradictory. 5 says "if you believe in the existence of replicas, why would you still care about your life?", while 3-4 say "the existence of replicas doesn't make you care any less about your own life". While it doesn't sound like a productive line of inquiry, the opp...
Our not wanting to die is a bit of irrational behavior selected for by evolution.
Eating ice cream is not rational either; it's just something we want. If someone really, truly does not want to live, then dying is rational. The question is: does your father want to live? I will speculate that while trying hard to convince him you labeled dying as "wrong", and set up the framework for his rebuttal.
The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
Reverse stupidity...
...The idea that you are alive “now” but will be dead “later” is irrational. Time is just a persistent illusion according to relativistic physics. You are alive and dead, period. A little knowledge is a dangerous etcetera. For one, it's like saying that relativistic spacetime proves New York isn't east of LA, but instead there are NY and LA, period. For another, if he really believed this then he wouldn't be able to function in society or make any plans at all.
"Ditto a meat replica." But aren't you always a meat replica of any past version of you? If he feels t
If a person wants to die, then why wait?
But seriously, you can solve the problem of #3 and #4 by using stem cells to make your brain divide forever, and use computers to store your memory in perfect condition, since brain cells gradually die off.
The problem is... what is "you"? How do you determine whether you are still yourself after a given period of time? Does my solution actually constitute a solution?
Shouldn't we be focusing on a way to scientifically quantify the soul before making ourselves immortal? On second thought, that might not be the best idea.
Terminal values and preferences are not rational or irrational. They simply are your preferences. I want a pizza. If I get a pizza, that won't make me consent to get shot. I still want a pizza. There are a virtually infinite number of me that DO have a pizza. I still want a pizza. The pizza from a certain point of view won't exist, and neither will I, by the time I get to eat some of it. I still want a pizza, damn it.
Of course, if you think all of that is irrational, then by all means don't order the pizza. More for me.