shminux comments on Timeless Identity - Less Wrong

23 points · Post author: Eliezer_Yudkowsky · 03 June 2008 08:16AM


Comment author: lavalamp · 01 October 2013 08:59:11PM · 0 points

Would you step through the transporter? If you answered no, would it be moral to force you through the transporter? What if I didn't know your wishes, but had to extrapolate? Under what conditions would it be okay?

I don't see how any of that depends on the question of which computations (copies of me) get labeled with "personal identity" and which don't.

Also, take the more vile forms of Pascal's mugging and acausal trades. If something threatens to torture a simulation of you, should you be concerned about actually experiencing that torture, thereby subverting your rationalist impulse to shut up and multiply utility?

Depending on specifics, yes. But I don't see how this depends on the labeling question. This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Comment author: [deleted] · 01 October 2013 09:17:51PM · 0 points

This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Forget the phrase "personal identity". If I am a powerful AI from the future and I come back to tell you that I will run a simulation of you so we can go bowling together, do you or do you not expect to experience bowling with me in the future, and why?

Comment author: shminux · 01 October 2013 11:53:55PM · -1 points

I come back to tell you that I will run a simulation of you so we can go bowling together

Presumably you create a sim-me that includes the experience of having this conversation with you (the AI).

do you or do you not expect to experience bowling with me in the future, and why?

Let me interpret the term "expect" concretely as "I'd better go practice bowling now, so that sim-me can do well against you later" (assuming I hate losing). If I don't particularly enjoy bowling and would rather do something else, how much effort is warranted versus doing something I like?

The answer is not clear-cut: it depends on how much I (meat-me) care about future sim-me having fun and not embarrassing sim-self. If sim-me continues on after meat-me passes away, I care very much about sim-me's well-being. On the other hand, if the sim-me program is halted after the bowling game, then I (meat-me) don't care much about that sim-loser. After all, meat-me (who will not go bowling) will continue to exist, at least for a while. You might feel differently about sim-you, of course. There is a whole range of possible scenarios here. Feel free to specify one in more detail.
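To make the tradeoff concrete, here is a minimal sketch of the weighing I have in mind (the function name, the caring weight, and all numbers are purely illustrative assumptions, not anything specified in this thread):

```python
# Hypothetical sketch: meat-me decides whether to practice by weighing the
# cost of practicing against a care-weighted benefit to sim-me.
# All names and numbers are illustrative assumptions.

def practice_worth_it(care_for_sim: float,   # 0.0 = don't care about sim-me, 1.0 = care as much as about meat-me
                      sim_benefit: float,    # how much sim-me's bowling outcome improves if meat-me practices
                      practice_cost: float) -> bool:
    """Practice iff the care-weighted benefit to sim-me exceeds meat-me's cost."""
    return care_for_sim * sim_benefit > practice_cost

# Scenario A: sim-me is halted right after the game, so the caring weight is low.
print(practice_worth_it(care_for_sim=0.1, sim_benefit=5.0, practice_cost=2.0))  # False

# Scenario B: sim-me will be the only continuing copy of "me", so the caring weight is high.
print(practice_worth_it(care_for_sim=1.0, sim_benefit=5.0, practice_cost=2.0))  # True
```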

TL;DR: If the simulation will be the only copy of "me" in existence, I act as if I expect to experience bowling.