shminux comments on Timeless Identity - Less Wrong
Forget the phrase "personal identity". If I am a powerful AI from the future and I come back to tell you that I will run a simulation of you so we can go bowling together, do you or do you not expect to experience bowling with me in the future, and why?
Presumably you create a sim-me that includes the memory of having had this conversation with you (the AI).
Let me interpret the term "expect" concretely as "I'd better go practice bowling now, so that sim-me can do well against you later" (assuming I hate losing). If I don't particularly enjoy bowling and would rather do something else, how much effort is warranted versus doing something I like?
The answer is ambiguous: it depends on how much I (meat-me) care about future sim-me having fun and not embarrassing sim-self. If sim-me continues on after meat-me passes away, I care very much about sim-me's well-being. On the other hand, if the sim-me program is halted after the bowling game, then I (meat-me) don't care much about that sim-loser; after all, meat-me (who will not go bowling) will continue to exist, at least for a while. You might feel differently about sim-you, of course. There is a whole range of possible scenarios here; feel free to specify one in more detail.
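One way to make that tradeoff concrete is a toy expected-utility comparison: weight sim-me's benefit by how much meat-me actually cares about sim-me's outcomes, then subtract meat-me's cost of practicing. This is just a minimal sketch; the function name, the weight, and all the numbers are illustrative assumptions of mine, not anything specified in the thread.

```python
# Toy expected-utility model of the "should meat-me practice bowling?" question.
# All parameters are illustrative assumptions, not from the original comment.

def utility_of_practicing(weight_on_sim_me: float,
                          sim_me_gain: float = 1.0,
                          meat_me_cost: float = 0.3) -> float:
    """Net utility to meat-me of practicing bowling now.

    weight_on_sim_me: how much meat-me values sim-me's outcomes
        (~1.0 if sim-me is the sole continuation after meat-me dies,
         ~0.0 if the sim is halted right after the game).
    sim_me_gain: sim-me's benefit from doing well / not embarrassing sim-self.
    meat_me_cost: meat-me's cost of practicing instead of doing something fun.
    """
    return weight_on_sim_me * sim_me_gain - meat_me_cost

# Sim-me is the only future copy of "me": practicing pays off.
print(utility_of_practicing(weight_on_sim_me=1.0))   # 0.7  -> go practice
# Sim-me is halted after one game: the effort isn't worth it.
print(utility_of_practicing(weight_on_sim_me=0.1))   # ~-0.2 -> do something I like
```

The two calls correspond to the two scenarios above: the sign of the result flips with the weight, which is the whole point of the "it depends on how much I care about sim-me" answer.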
TL;DR: If the simulation will be the only copy of "me" in existence, I act as if I expect to experience bowling.