Depends on how you feel about anthropically selfish preferences, and about altruistic preferences that try to satisfy other people's selfish preferences. I, for instance, do not think it's okay to kill a copy of me even if I know I will live on.
In the Earth-Mars teleporter thought experiment, the missing piece is the idea that people care selfishly about their causal descendants (though this phrase obscures a lot of unsolved questions about what kinds of causation count). If the teleporter annihilates a person as it scans them, the person who gets annihilated has a direct causal descendant on the other side. If it instead waits ten minutes, gives the original some tea and cake, and then annihilates them, the person who gets annihilated has no direct causal descendant - they really are getting killed off in a way that matters more to them than before.
I, for instance, do not think it's okay to kill a copy of me even if I know I will live on
Not OK in what sense - morally wrong, as killing any sapient being is, or as terrifying as getting killed yourself? I tend to care more about people who are closer to me, so by induction I will probably care about my copy more than about any other human, but I still alieve that the experience of getting killed is fundamentally different from, and fundamentally more terrifying than, the experience of my copy getting killed.
From the linked post:
...The counterargument is also simple, though: Makin
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Previous Open Thread
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check whether there is an active Open Thread before posting a new one. (Immediately before: refresh the list-of-threads page just before posting.)
3. Open Threads should be posted in Discussion, not in Main.
4. Open Threads should start on Monday, and end on Sunday.