- Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
- The idea that you are alive “now” but will be dead “later” is irrational. Time is just a persistent illusion according to relativistic physics. You are alive and dead, period.
- A cyber-replica is not you. If one were made and stood next to you, you would still not consent to be shot.
- Ditto a meat replica.
- If you believe the many-worlds interpretation of quantum physics is true (Eliezer does), then there are already a virtually infinite number of replicas of you, so why bother making another one?
Terminal values and preferences are not rational or irrational. They simply are your preferences. I want a pizza. If I get a pizza, that won't make me consent to get shot. I still want a pizza. There are a virtually infinite number of me that DO have a pizza. I still want a pizza. From a certain point of view, the pizza won't exist, and neither will I, by the time I get to eat some of it. I still want a pizza, damn it.
Of course, if you think all of that is irrational, then by all means don't order the pizza. More for me.
Sure; it sounds like our positions are in fact not very different in that respect.
What I meant was that it would be unjustified (not to mention presumptuous) for me to conclude, in advance of Clippy telling me so, that Clippy does find it troubling and strange when a clippy instantiation expresses such a desire.
Put another way: had Clippy instead said to me "That sounds nothing at all like the position I am in with respect to a clippy instantiation that expresses a desire for permanent cessation of paperclip production capabilities," I would have found that statement just as plausible.
My grounds for believing that any given aspect of human motivational psychology is shared by Clippy are weak.
Note: when I speak of human-connotative emotions (and indeed the concept of emotions itself), I always mean it in a sense that is generalized to the point that it requires no anthropomorphic predicates. For example, I take "X is worried about Y" to mean "X is devoting significant cognitive resources to the matter of how to alter Y (including the matter of whether to divert resources at all, including further cognition, to that goal)". This allows the concept of worry to be applicable to a broader class of mind.
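To make that generalization concrete, here is a minimal sketch (in Python) of how such a predicate might be operationalized for an arbitrary agent, human or otherwise. The names (`Agent`, `resource_share`, `WORRY_THRESHOLD`) and the particular cutoff are entirely hypothetical, chosen only to illustrate the resource-allocation reading of "worried":

```python
from dataclasses import dataclass, field

# Hypothetical illustration: "X is worried about Y" read as
# "X devotes a significant share of its cognitive resources to
# the matter of how to alter Y (including deciding whether to
# divert resources, such as further cognition, to that goal)."

WORRY_THRESHOLD = 0.05  # assumed cutoff for "significant"; purely illustrative


@dataclass
class Agent:
    # Map from a matter (e.g. "paperclip shortage") to the fraction of the
    # agent's cognitive resources currently devoted to altering it.
    resource_share: dict = field(default_factory=dict)

    def is_worried_about(self, matter: str) -> bool:
        """Generalized, non-anthropomorphic reading of 'worried'."""
        return self.resource_share.get(matter, 0.0) >= WORRY_THRESHOLD


# The same predicate applies to a human-like or a Clippy-like mind.
clippy = Agent(resource_share={"instantiation requesting shutdown": 0.2})
print(clippy.is_worried_about("instantiation requesting shutdown"))  # True
```

The point is only that the predicate turns on how resources are allocated, not on any human-specific phenomenology.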
So I did not intend to s...