Pentashagon comments on Expected utility and utility after time - Less Wrong

Post author: Metus 29 August 2012 01:10PM




Comment author: Pentashagon 30 August 2012 10:19:29PM 0 points

Does that apply only to copies of you or to people in general? Would you choose to torture all of humanity for a finite time, make them forget it, and then you receive 1 utilon?

Comment author: shminux 30 August 2012 10:40:20PM -1 points

Does that apply only to copies of you or to people in general?

As I explained, I do not presume to make decisions for others.

Would you choose to torture all of humanity for a finite time, make them forget it, and then you receive 1 utilon?

I would not; see above. A better question would have been "Would you choose to slightly inconvenience a person you dislike for a short time, make them forget it, and then you receive 3^^^3 utilons?" If I answered "yes" (and I probably would), then you could probe further to see where exactly my self-professed non-interference breaks down. This is the standard way of probing the dust-specks-vs-torture boundary and exposing the resulting inconsistency.

Similar strategies apply to clarifying other seemingly absolute positions, including yours ("I don't consider my similarity to a person as a reason to treat them as a redundant copy."). Presumably at some point the answers become "I don't know" rather than Yes/No.

Comment author: Pentashagon 31 August 2012 12:04:07AM 0 points

I am fairly certain that the only way I would treat a clone of myself differently from another independent person is if we continued to share internal mental experiences. Then again, I would probably stop thinking of myself and a random person off the street as different people if I started sharing mental experiences with them, too.

In other words, while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell. That brings up a rather interesting question: if two people share mental experiences, do they achieve double the utility of each person individually, or merely the set union of their individual utilities? Or something else?

Comment author: shminux 31 August 2012 12:13:26AM 0 points

while I would reject sending my fully independent clone to hell in order to gain utility, I might agree to fully share the mental experience with the clone in hell so long as the clone also got to experience the extra utility Omega paid me to balance out hell.

This seems to contradict your earlier assertion that

the second option the same as Omega offering to clone you, put the clone in hell for a finite amount of time and then destroy it, and give you the money immediately

because if you and the clone are one and the same (no cloning happened, you were tortured and then memory-wiped), "both" of you reap the rewards.

Comment author: Pentashagon 31 August 2012 05:02:07PM 0 points

because if you and the clone are one and the same (no cloning happened, you were tortured and then memory-wiped), "both" of you reap the rewards.

We are not the same person after the point of the decision; there is no continuity of experience between us. The tortured me experiences none of the utility, and the enriched me experiences none of the torture. That is why I thought of the cloning interpretation to begin with.