Disclaimer: the identity theory that I actually alieve is the most common intuitionist one, and it's philosophically inconsistent: I regard teleportation, but not sleeping, as death. This comment, however, is written from a System 2 perspective, which can operate even with concepts that I don't alieve.
The basic idea behind timeless identity is that "I" can only be meaningfully defined inductively, as "an entity that has experience continuity with my current self". Thus, we can safely replace "I value my life" with "I value the existence of an entity that feels and behaves exactly like me". That lets us be OK with quite useful (although hypothetical) things like teleportation, mind uploading, mind backups, etc. It also seems to provide an insight into why it's OK to make a copy of me on Mars and immediately destroy Earth!me, but not OK to destroy Earth!me hours later: by then the experiences of Earth!me and Mars!me would have diverged, and each of them would value their own life.
However, here is the thing: in that case we merely replace the requirement "to have an entity with experience continuity with me" with "to have an entity with experience continuity with me, except for this one hour", and the two are actually pretty interchangeable. For example, I forget most of my dreams, which means I'm nearly guaranteed to forget several hours of experience every day, and I'm OK with that. One might say that the value of genuine experiences exceeds that of hallucinations, but I would still be pretty OK with taking a suppressor of RNA synthesis that would temporarily give me anterograde amnesia, and doing something that I don't really care about remembering - cleaning the house or something. Heck, even retroactively erasing my most cherished memories, although extremely frustrating, would still be nowhere near as bad as death.
That implies that if there are multiple copies of me, the badness of killing any one of them is no more than the increase in the likelihood of all of them being destroyed (which is not a lot, unless there's an Armageddon happening around) plus the value of the memories formed since the last replication. It also means that every individual copy should alieve that being killed is no worse than forgetting everything that happened since the last replication, which sounds nowhere near as horrible as death. Finally, it implies that simulating time travel by discarding time branches is a pretty OK thing to do, unless the universes diverge strongly enough to create uniquely valuable memories.
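To make that bound explicit (my own back-of-the-envelope notation for the claim above, not anything standard: $B$ for badness, $\Delta P$ for the increase in probability, $V$ for value):

$$B(\text{kill copy } i) \;\le\; \Delta P(\text{all copies destroyed}) \cdot B(\text{death}) \;+\; V(\text{memories since last replication})$$

With frequent replication the second term stays small, which is what makes a single copy's death feel closer to a bounded memory loss than to death proper.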
Is that correct, or am I missing something?
Depends on how you feel about anthropically selfish preferences, and about altruistic preferences that try to satisfy other people's selfish preferences. I, for instance, do not think it's okay to kill a copy of me even if I know I will live on.
In the Earth-Mars teleporter thought experiment, the missing piece is the idea that people care selfishly about their causal descendants (though this phrase obscures a lot of unsolved questions about what kind of causation counts). If the teleporter annihilates a person as it scans them, the person who gets annihilated ...
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Previous Open Thread
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.