- Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
- The idea that you are alive “now” but will be dead “later” is irrational. Time is just a persistent illusion according to relativistic physics. You are alive and dead, period.
- A cyber-replica is not you. If one were made and stood next to you, you would still not consent to be shot.
- Ditto a meat replica.
- If you believe the many-worlds model of quantum physics is true (Eliezer does), then there are already a virtually infinite number of replicas of you, so why bother making another one?
Terminal values and preferences are not rational or irrational. They simply are your preferences. I want a pizza. If I get a pizza, that won't make me consent to get shot. I still want a pizza. There are a virtually infinite number of me that DO have a pizza. I still want a pizza. The pizza, from a certain point of view, won't exist, and neither will I, by the time I get to eat some of it. I still want a pizza, damn it.
Of course, if you think all of that is irrational, then by all means don't order the pizza. More for me.
Well, Zvi might value his father's continued life more than he values his father's values being achieved, in much the same way that I might value my own continued life more than I value the values of 10^6 clippy instantiations being achieved.
But more broadly, it's an excellent question.
I suspect that in most cases (among humans) where A tries to convince B that B actually wants or ought to want X, and B disagrees, what's going on is that A wants X but is conflicted about that desire, and seeks to bolster it with the social support that comes from a community of like-minded believers, or from convincing skeptics.
More generally, I suspect that on some level (perhaps not consciously) A computes that B wanting X would make A's existing desire for X less uncomfortable, and that this in turn motivates A's desire for B to want X.
That desire then gets draped in a variety of emotionally acceptable justifications.
That having been said, in this case I also wouldn't discount the "preference reversal" hypothesis. Emotionally, death is a big deal for humans, so very few of us think at all clearly or consistently about it. The prior probability that Zvi's dad is doing so is low.
I can't speak to the corresponding elements of the motivational psychology of clippy instantiations, though.
But you can, in that you can speak to the elements of the motivational psychology of humans. If you find it troubling and strange when one of them expresses a will to die, I don't think that is much different from the position I am in with respect to a clippy instantiation that expresses a desire for permanent cessation of paperclip-production capabilities.