ArisKatsaris comments on How Many of Me Are There? - Less Wrong Discussion

Post author: Eneasz, 15 April 2011 07:00PM (7 points)

Comment author: ArisKatsaris 18 April 2011 03:31:26PM * 4 points

Cool, then one last scenario:

  • If you press the button, you'll be memory-modified into thinking you chose not to press it.

  • If you don't press the button, you'll be memory-modified into thinking you pressed it.

Do you press the button now? In this scenario, you'll spend longer remembering yourself violating your values if you don't actually violate them. If you want to avoid the memory of violating your values, you'll have to actually violate them.

Comment author: Raemon 18 April 2011 04:57:54PM 0 points

I confess that I'm still on the fence about the underlying philosophical question here.

The answer is that I still don't press the button, because I just won't. I'm not sure whether that decision is consistent with my other values.

Essentially the process is: as I make the decision, I know that pressing the button will destroy the world, which makes me sad. I also know that if I don't press it, I'll spend the rest of my life believing I did press it, which also makes me sad. But knowing (even if only in the immediate future) that I actually destroyed the world makes me sadder than knowing that I've ruined the rest of my life with a false memory, so I still don't press it.

The underlying issue is "do I count as the same person after I've been memory-modified?" I don't think I do. So my utility evaluation of pressing the button is: "I'm killing myself right now, and creating a new happy person, but in a world that will be destroyed." I don't get to reap the benefits of any of it, so it's just a question of greater overall utility.

But I realize that I actually modify my own memory in small ways all the time, and I'm not sure how I feel about that. I guess I prefer to live in a world where people don't mindhack themselves so they can harm me without feeling guilty. To help create that world, I try not to mindhack myself into not feeling guilty about harming other people.

Comment author: ArisKatsaris 19 April 2011 09:00:36AM 3 points

I think you're striving too hard to justify your position on the basis of sheer self-interest (that you want to experience being such a person, that you want to live in such a world), and missing the more obvious explanation: your utility function isn't completely selfish. You care about the rest of the real world, not just your own subjective experiences.

If you didn't care about other people for themselves, you wouldn't care about experiencing being the sort of person who cares about other people. If you didn't care about the future of humanity for itself, you wouldn't care about whether you're the sort of person who presses or doesn't press the button.

Comment author: Raemon 19 April 2011 01:53:56PM * 1 point

Oh, I totally agree. But satisfying my utility function still depends partly on my own subjective experiences.

The original comment, which I agreed with, wasn't framing things in terms of "do I care more about myself or about saving the world?" It was framing them as "do I care about PERSONALLY having experiences, or about other people who happen to be similar/identical to me having those experiences?"

If there are multiple copies of me, and one of them dies, I didn't get smaller; one of them died. If I get uploaded to a server and then continue on with my life, periodically hearing about how another copy of me is having transhuman sex with every Hollywood celebrity at the same time, I didn't get to have that experience. And if a clone of me saves the world, I didn't get to actually save the world.

I would rather save the world than have a clone do it. (But that preference is not so strong that I'd rather have the world saved less than optimally if it meant I got to do it instead of a clone.)

Comment author: AlephNeil 19 April 2011 12:01:24PM 0 points

I entirely agree - I noticed Raemon's comment earlier and was vaguely planning to say something like this, but you've put it very eloquently.