- Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
- The idea that you are alive “now” but will be dead “later” is irrational. Time is just a persistent illusion according to relativistic physics. You are alive and dead, period.
- A cyber-replica is not you. If one were made and stood next to you, you would still not consent to be shot.
- Ditto a meat replica.
- If you believe the many-worlds interpretation of quantum physics is true (Eliezer does), then there is already a virtually infinite number of replicas of you, so why bother making another one?
Terminal values and preferences are not rational or irrational. They simply are your preferences. I want a pizza. If I get a pizza, that won't make me consent to get shot. I still want a pizza. There is a virtually infinite number of versions of me that DO have a pizza. I still want a pizza. From a certain point of view, the pizza won't exist, and neither will I, by the time I get to eat some of it. I still want a pizza, damn it.
Of course, if you think all of that is irrational, then by all means don't order the pizza. More for me.
I doubt there's an easy way to explain that once and for all... if you use common words in a common way, then people will likely understand you to mean what is commonly meant by them.
Communication between different kinds of minds is tricky, even given a shared language.
Your task is made more difficult by the nature of the medium... given the absence of clear signals to the contrary, most of us will likely continue to think of you as a human pretending to be a paperclip-maximizing AI, and that will influence how we interpret your language even if we don't intend it to.
That said, in some cases you might do better to describe yourself as "preoccupied by Y" than "worried about Y." There are fewer anthropomorphic connotations to that.
EDIT: Oh, and I should add: I don't think my parent comment depends on anthropomorphic understandings of your psychology... I just meant to say that it was equally plausible, absent data, that you might be indifferent to the expressed preferences of other clippys.
That heuristic does not apply here, as human common usage is ambiguous with respect to whether these terms require human-specific traits to be applicable, and I was using what I deemed a natural generalization on the assumption that there is no such requirement.
Human usage of emotional terms does not reference non-human optimization processes enough to classify it one way or the other...