
Mark_Friedenbach comments on Your transhuman copy is of questionable value to your meat self. - Less Wrong Discussion

Post author: Usul, 06 January 2016 09:03AM (12 points)


Comment author: [deleted] 06 January 2016 08:55:16PM 0 points

OK, imagine that somewhere far away in the universe--or maybe one room over, it doesn't matter--there is an exact physical replica of you that, through some feat of genius engineering, is also being provided the exact same percepts (sight, hearing, touch, etc.) that you are. Its mental states remain exactly identical to yours.

Should you still care? To me it'd still be someone different.

Comment author: ike 07 January 2016 05:06:34AM 0 points

Care in terms of what? You have no way of knowing which one you are, so if you're offered the option to help the one in the left room, you should take it, because there's a 50% chance that's you. Actually, I would say it's not well defined whether you're one or the other; you're both, until some observation creates a divergence. But what specific decision hinges on the question?
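ike's 50% argument is just a self-locating expected-value calculation. A minimal sketch (my illustration, with arbitrary hypothetical numbers, not anything from the thread):

```python
# Toy model of ike's point: with two physically identical instances and no
# way to tell which one you are, you assign equal credence to being either.
p_left = 0.5        # credence that "you" are the instance in the left room
benefit = 100       # arbitrary units of benefit to whoever gets helped
cost = 10           # arbitrary cost of helping, paid by the decider

# Expected net benefit to yourself from helping the one in the left room:
expected_net = p_left * benefit - cost
print(expected_net)  # 40.0 -- positive, so under this model you should help
```

The point is only that the decision goes through without ever settling which instance is "really" you; the 50% credence does all the work.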

Comment author: dxu 07 January 2016 05:16:08AM *  -2 points

Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won't make any copies until after you agree. Since the copies haven't been made yet, this ensures that you must be the original, and since you don't care about any identical copies of you since they're technically different people from you, you happily agree. I nod, pull out a gun, and shoot you.

(In the real universe--or at least the universe one level up on the simulation hierarchy--a Mark Friedenbach receives a dollar. This isn't of much comfort to you, of course, seeing as you're dead.)
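The arithmetic behind dxu's trap can be made explicit. If every instance with your exact mental state is equally likely to be "you", then conditional on the deal going through, almost all instances facing this decision are copies about to be shot (a sketch of the self-locating reasoning as I read it, not anything stated in the thread):

```python
# Self-location sketch: once a trillion identical copies exist, an agent who
# cannot tell which instance it is should assign equal credence to each.
n_copies = 10**12

# Credence that you are the one original who collects the dollar:
p_original = 1 / (n_copies + 1)
# Credence that you are a copy about to be shot:
p_shot = n_copies / (n_copies + 1)

print(p_original)  # roughly 1e-12: agreeing is a near-certain death sentence
```

So the "I must be the original" step only works if you refuse to apply uniform credence over identical instances, which is precisely the view the thought experiment is probing.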

Comment author: [deleted] 07 January 2016 04:56:55PM 1 point

No, I don't want you to murder a trillion people, even if those people are not me.

Comment author: Usul 07 January 2016 05:57:40AM 1 point

You shouldn't murder sentient beings or cause them to be murdered by the trillion. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or to the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether I even care if I'm a copy or not depends on various aspects of my personality.

Comment author: dxu 07 January 2016 06:02:58AM -1 points

If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with "yes" or "no". You're free to expand on your answer, but first please make sure you give an answer.)

Comment author: Usul 07 January 2016 06:31:06AM 3 points

No. It's a dick move. Same question and they're not copies of me? Same answer.

Comment author: dxu 07 January 2016 06:43:15AM -2 points

"Same question and they're not copies of me? Same answer."

As I'm sure you're aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren't copies of you shows that your reason for saying "no" has nothing to do with the purpose of the question. In particular, telling me that "it's a dick move" does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:

Would someone who shares your views on consciousness but doesn't give a crap about other people say "yes" or "no" to my deal?

Comment author: Usul 07 January 2016 07:50:08AM *  3 points

Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.

Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum to provide you with further information, perhaps presuming a future question: neither would I murder trillions of sentient not-me copies.

New question: Yes, an amoral dick who shares my views on consciousness would say yes.