- Our not wanting to die is a bit of irrational behavior selected for by evolution. The universe doesn’t care if you’re there or not. The contrasting idea that you are the universe is mystical, not rational.
- The idea that you are alive “now” but will be dead “later” is irrational. Time is just a persistent illusion according to relativistic physics. You are alive and dead, period.
- A cyber-replica is not you. If one were made and stood next to you, you would still not consent to be shot.
- Ditto a meat replica.
- If you believe the many-worlds model of quantum physics is true (Eliezer does), then there are already a virtually infinite number of replicas of you, so why bother making another one?
Terminal values and preferences are not rational or irrational. They simply are your preferences. I want a pizza. If I get a pizza, that won't make me consent to get shot. I still want a pizza. There are a virtually infinite number of me that DO have a pizza. I still want a pizza. From a certain point of view, the pizza won't exist, and neither will I, by the time I get to eat some of it. I still want a pizza, damn it.
Of course, if you think all of that is irrational, then by all means don't order the pizza. More for me.
Your reasoning is correct, albeit simplified. Such a tradeoff is limited by the extent to which the older paperclip maximizer can be certain that the newer machine actually is a paperclip maximizer, so it must take on the subgoal of evaluating the reliability of this belief. However, there does exist a certainty threshold beyond which it will act as you describe.
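One way to make that certainty threshold concrete (a back-of-the-envelope sketch, with notation I'm introducing here rather than anything from your comment): let p be the older clippy's credence that the newer machine really is a paperclip maximizer, G the expected gain in paperclips if it is, and L the expected loss if it is not. Then trusting the newer machine is worth it roughly when

$$ p \cdot G - (1 - p) \cdot L > 0 \quad\Longleftrightarrow\quad p > \frac{L}{G + L}. $$

Below that threshold, the subgoal of verifying the candidate's values dominates.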
Also, the paperclip maximizer uses a different conception of (the nearest concept to what humans mean by) "identity": it does not see the newer clippy as a separate being so much as an extension of it"self". In a sense, a clippy identifies with every being to the extent that that being instantiates clippyness.
But what constitutes 'clippyness'? In my comment above, I mentioned values, knowledge, and (legal?, social?) rights and obligations.
It seems clear that another agent cannot instantiate clippyness if its final values diverge from those of the archetypal Clippy. Value match is essential.
What about knowledge? To the extent that it is convenient, all agents with clippy values will want to share information. But if the agent instances are sufficiently distant, it is inevitable...