linkhyrule5 comments on Timeless Identity - Less Wrong

Post author: Eliezer_Yudkowsky 03 June 2008 08:16AM



Comment author: lavalamp 01 October 2013 08:59:11PM 0 points

> Would you step through the transporter? If you answered no, would it be moral to force you through the transporter? What if I didn't know your wishes, but had to extrapolate? Under what conditions would it be okay?

I don't see how any of that depends on the question of which computations (copies of me) get labeled with "personal identity" and which don't.

> Also, take the more vile forms of Pascal's mugging and acausal trades. If something threatens torture to a simulation of you, should you be concerned about actually experiencing the torture, thereby subverting your rationalist impulse to shut up and multiply utility?

Depending on specifics, yes. But I don't see how this depends on the labeling question either. It just boils down to "what do I expect to experience in the future?", which I don't see as being related to "personal identity".

Comment author: [deleted] 01 October 2013 09:17:51PM 0 points

> This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Forget the phrase "personal identity". If I am a powerful AI from the future and I come back to tell you that I will run a simulation of you so we can go bowling together, do you or do you not expect to experience bowling with me in the future, and why?

Comment author: linkhyrule5 01 October 2013 10:56:03PM 1 point

Yes, with probability P(I am the simulation), or no, with probability P(I am not the simulation), depending.