lavalamp comments on Timeless Identity - Less Wrong

Post author: Eliezer_Yudkowsky, 03 June 2008 08:16AM

Comment author: lavalamp 01 October 2013 07:13:06PM 2 points [-]

Can you taboo "personal identity"? I don't understand what important thing you could lose by going under general anesthesia.

Comment author: [deleted] 01 October 2013 07:40:55PM *  0 points [-]

It's easier to explain in the case of multiple copies of yourself. Imagine the transporter were turned into a replicator - it gets stuck in a loop reconstructing the last thing that went through it, namely you. You step off and turn around to find another version of you just coming out. And then another, and another, etc. Each one of you shares the same memories, but from that moment on you have diverged. Each clone continues life with their own subjective experience until that experience is terminated by that clone's death.

That sense of subjective experience separate from memories or shared history is what I have been calling “personal identity.” It is what gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and what separates me from my clones. You are welcome to suggest a better term.

The replicator / clone thought experiment shows that “subjective experience of identity” is something different from the information pattern that represents your mind. There is something, although at this moment that something is not well defined, which makes you the same “you” that will exist five minutes in the future, but which separates you from the “you”s that walked out of the replicator, or exist in simulation, for example.

The first step is recognizing this distinction. Then turn around and apply it to less fantastical situations. If the clone is "you" but not you (meaning no shared identity, and my apologies for the weak terminology), then who's to say that a future simulation of "you" would also be you? What about cryonics: will your unfrozen brain still be you? That might depend on what they do to repair damage from vitrification. What about general anesthesia? Again, I need to learn more about how general anesthesia works, but if they shut down your processing centers and then restart you later, how is that different from the teleportation or simulation scenario? After all, we've already established that whatever provides personal identity, it's not physical continuity.

Comment author: TheOtherDave 01 October 2013 07:48:45PM 1 point [-]

That sense of subjective experience separate from memories or shared history is what I have been calling “personal identity.” It is what gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and what separates me from my clones.

Well, OK. So suppose that, after I go through that transporter/replicator, you ask the entity that comes out whether it has the belief, real or illusory, that it is the same person in this moment that it was at the moment it walked into the machine, and it says "yes".

If personal identity is what creates that belief, and that entity has that belief, it follows that that entity shares my personal identity... doesn't it?

Comment author: [deleted] 01 October 2013 08:17:54PM *  0 points [-]

Well, OK. So suppose that, after I go through that transporter/replicator, you ask the entity that comes out whether it has the belief, real or illusory, that it is the same person in this moment that it was at the moment it walked into the machine, and it says "yes".

If personal identity is what creates that belief, and that entity has that belief, it follows that that entity shares my personal identity... doesn't it?

Not quite. If You!Mars gave it thought before answering, his thinking probably went like this: “I have memories of going into the transporter, just a moment ago. I have a continuous sequence of memories, from then until now. Nowhere in those memories does my sense of self change. Right now I am experiencing the same sense of self I always remember experiencing, and laying down new memories. Ergo, by backwards induction, I am the same person that walked into the teleporter.” However, for that - or any - line of meta-reasoning to hold, (1) your memories need to accurately correspond to the true and full history of reality, and (2) you need to trust that what occurs in the present also occurred in the past. In other words, it's kinda like saying “my memory wasn't altered because I would have remembered that.” It's not a circular argument per se, but it is a meta loop.

The map is not the territory. What happened to You!Earth's subjective experience is an objective, if perhaps not empirically observable, fact. You!Mars' belief about what happened may or may not correspond with reality.

Comment author: TheOtherDave 01 October 2013 09:04:40PM 0 points [-]

What if me!Mars, after giving it thought, shakes his head and says "no, that's not right. I say I'm the same person because I still have a sense of subjective experience, which is separate from memories or shared history, which gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and which separates me from my clones"?

Do you take his word for it?
Do you assume he's mistaken?
Do you assume he's lying?

Comment author: [deleted] 01 October 2013 09:26:03PM *  0 points [-]

Assuming that he acknowledges that clones have a separate identity, or in other words he admits that there can be instances of himself that are not him, then by asserting the same identity as the person that walked into the teleporter, he is making an extrapolation into the past. He is expressing a belief that, by whatever definition he is using, the person walking into the teleporter meets a standard of me-ness that the clones do not. Unless the definition under consideration explicitly references You!Mars' mental state (e.g. "by definition" he has shared identity with people he remembers having shared identity with), the validity of that belief is external: it is either true or false. The map is not the territory.

Under an assumption of pattern or causal continuity, for example, it would be explicitly true. For computational continuity it would be false.

Comment author: TheOtherDave 01 October 2013 10:43:15PM 0 points [-]

If I understood you correctly, then on your account, his claim is simply false, but he isn't necessarily lying.

Yes?

It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.

Yes?

Comment author: [deleted] 01 October 2013 10:57:29PM *  0 points [-]

If I understood you correctly, then on your account, his claim is simply false, but he isn't necessarily lying.

Yes, in the sense that it is a belief about his own history which is either true or false, like any historical fact. Whether it is actually false depends on the nature of “personal identity”. If I understand the original post correctly, I think Eliezer would argue that his claim is true. I think Eliezer's argument lacks sufficient justification, and there's a good chance his claim is false.

It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.

Yes. My question is: is that belief justified?

If your memory were altered to make you think you had won the lottery, that wouldn't make you any richer. Likewise, You!Mars' memory was constructed by the transporter machine, following the transmitted design, in such a way as to make him remember stepping into the transporter on Earth as you did, and walking out of it on Mars in seamless continuity. But just because he doesn't remember the deconstruction, information transmission, and reconstruction steps doesn't mean they didn't happen. Once he learns what actually happened during his transport, his decision about whether he remains the same person that entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.

Comment author: TheOtherDave 01 October 2013 11:34:31PM 0 points [-]

It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.
Yes. My question is: is that belief justified?

OK, understood.

Here's my confusion: a while back, you said:

That sense of subjective experience separate from memories or shared history is what I have been calling “personal identity.” It is what gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and what separates me from my clones.

And yet, here's Dave!Mars, who has a sense of subjective experience separate from memories or shared history which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.

But on your account, he might not have Dave's personal identity.

So, where is this sense of subjective experience coming from, on your account? Is it causally connected to personal identity, or not?

Once he learns what actually happened during his transport, his decision about whether he remains the same person that entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.

Yes, that's certainly true. By the same token, if I convince you that I placed you in stasis last night for... um... long enough to disrupt your personal identity (a minute? an hour? a millisecond? a nanosecond? how long a period of "computational discontinuity" does it take for personal identity to evaporate on your account, anyway?), you would presumably conclude that you aren't the same person who went to bed last night. OTOH, if I placed you in stasis last night and didn't tell you, you'd conclude that you're the same person, and live out the rest of your life none the wiser.

Comment author: lavalamp 01 October 2013 08:21:59PM 0 points [-]

That experiment shows that "personal identity", whatever that means, follows a time-tree, not a time-line. That conclusion also must hold if MWI is true.

So I get that there's a tricky (?) labeling problem here, where it's somewhat controversial which copy of you should be labeled as having your "personal identity". The thing that isn't clear to me is why the labeling problem is important. What observable feature of reality depends on the outcome of this labeling problem? We all agree on how those copies of you will act and what beliefs they'll have. What else is there to know here?

Comment author: [deleted] 01 October 2013 08:44:31PM 0 points [-]

Would you step through the transporter? If you answered no, would it be moral to force you through the transporter? What if I didn't know your wishes, but had to extrapolate? Under what conditions would it be okay?

Also, take the more vile forms of Pascal's mugging and acausal trades. If something threatens torture to a simulation of you, should you be concerned about actually experiencing the torture, thereby subverting your rationalist impulse to shut up and multiply utility?

Comment author: lavalamp 01 October 2013 08:59:11PM 0 points [-]

Would you step through the transporter? If you answered no, would it be moral to force you through the transporter? What if I didn't know your wishes, but had to extrapolate? Under what conditions would it be okay?

I don't see how any of that depends on the question of which computations (copies of me) get labeled with "personal identity" and which don't.

Also, take the more vile forms of Pascal's mugging and acausal trades. If something threatens torture to a simulation of you, should you be concerned about actually experiencing the torture, thereby subverting your rationalist impulse to shut up and multiply utility?

Depending on specifics, yes. But I don't see how this depends on the labeling question. This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Comment author: [deleted] 01 October 2013 09:17:51PM 0 points [-]

This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Forget the phrase "personal identity". If I am a powerful AI from the future and I come back to tell you that I will run a simulation of you so we can go bowling together, do you or do you not expect to experience bowling with me in the future, and why?

Comment author: lavalamp 01 October 2013 09:29:25PM 1 point [-]

I'll give a 50% chance that I'll experience that. (One copy of me continues in the "real" world, another copy of me appears in a simulation and goes bowling.)

(If you ask this question as "the AI is going to run N copies of the bowling simulation", then I'm not sure how to answer; I'm not sure how to weight N copies of the exact same experience. My intuition is that I should still give a 50% chance, unless the simulations are going to differ in some respect, in which case I'd give an N/(N+1) chance.)
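[A rough sketch, in Python, of the two weighting rules described in the parenthetical above, for one un-simulated copy plus N simulated copies going bowling; the rule names and the example values of N are illustrative assumptions, not anything stated in the thread.]

```python
# Sketch of the two anticipation-weighting rules mentioned above.
# Everything here is illustrative; the thread does not settle on a rule.

def p_bowling_if_identical_copies_collapse(n_copies: int) -> float:
    """Treat N exactly identical simulated experiences as one outcome:
    one 'stays outside' branch vs. one 'goes bowling' branch -> 1/2."""
    return 0.5

def p_bowling_if_each_copy_counts(n_copies: int) -> float:
    """Count every copy separately (e.g. if the simulations differ):
    N bowling copies out of N + 1 total copies -> N / (N + 1)."""
    return n_copies / (n_copies + 1)

for n in (1, 2, 10):
    print(n,
          p_bowling_if_identical_copies_collapse(n),
          p_bowling_if_each_copy_counts(n))
```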

Comment author: [deleted] 01 October 2013 11:19:45PM 0 points [-]

I need to think about your answer, as right now it doesn't make any sense to me. I suspect that whatever intuition underlies it is the source of our disagreement/confusion.

@linkhyrule5 had an answer better than the one I had in mind. The probability of us going bowling together is approximately equal to the probability that you are already in said simulation, if computational continuity is what matters.

If there were a 6th Day-like service I could sign up for, where if anything were to happen to me a clone/simulation of me with my memories would be created, I'd sign up for it in a heartbeat, because if something were to happen to me I wouldn't want to deprive my wife of her husband, or my daughters of their father. But that is purely altruistic: I would have P(~0) expectation that I would actually experience that resurrection. Rather, some doppelganger twin that in every outward way behaves like me will take up my life where I left off. And that's fine, but let's be clear about the difference.

If you are not the simulation the AI was referring to, then you and it will not go bowling together, period. Because when said bowling occurs, you'll be dead. Or maybe you'll be alive and well and off doing other things while the simulation is going on. But under no circumstances should you expect to wake up as the simulation, as we are assuming them to be causally separate.

At least from my way of thinking. I'm not sure I understand yet where you are coming from well enough to predict what you'd expect to experience.

Comment author: lavalamp 01 October 2013 11:37:51PM 0 points [-]

@linkhyrule5 had an answer better than the one I had in mind. The probability of us going bowling together is approximately equal to the probability that you are already in said simulation, if computational continuity is what matters.

You could understand my 50% answer to be expressing my uncertainty as to whether I'm in the simulation or not. It's the same thing.

I don't understand what "computational continuity" means. Can you explain it using a program that computes the digits of pi as an example?

Rather, some doppelganger twin that in every outward way behaves like me will take up my life where I left off. And that's fine, but let's be clear about the difference.

I think you're making a distinction that exists only in the map, not in the territory. Can you point to something in the territory that this matters for?

Comment author: TheOtherDave 01 October 2013 10:51:12PM *  1 point [-]

Suppose that my husband and I believe that while we're sleeping, someone will paint a blue dot on either my forehead, or my husband's, determined randomly. We expect to see a blue dot when we wake up... and we also expect not to see a blue dot when we wake up. This is a perfectly reasonable state for two people to be in, and not at all problematic.

Suppose I believe that while I'm sleeping, a powerful AI will duplicate me (if you like, in such a way that both duplicates experience computational continuity with the original) and paint a blue dot on one duplicate's forehead. I expect to see a blue dot when I wake up... and I also expect not to see a blue dot when I wake up. This is a perfectly reasonable state for a duplicated person to be in, and not at all problematic.

Similarly, I both expect to experience bowling with you, and expect to not experience bowling with you (supposing that the original continues to operate while the simulation goes bowling).

Comment author: [deleted] 01 October 2013 11:29:35PM *  0 points [-]

The situation isn't analogous, however. Let's posit that you're still alive when the simulation is run. In fact, aside from technology there's no reason to put it in the future or involve an AI. I'm a brain-scanning researcher who shows up at your house tomorrow, with all the equipment to do a non-destructive mind upload and whole-brain simulation. I tell you that I am going to scan your brain, start the simulation, then don VR goggles and go virtual-bowling with “you”. Once the scanning is done you and your husband are free to go to the beach or whatever, while I go bowling with TheVirtualDave.

What probability would you put on you ending up bowling instead of at the beach?

Comment author: lavalamp 01 October 2013 11:42:28PM 0 points [-]

Prediction: TheOtherDave will say 50%, and that Beach!Dave and Bowling!Dave would each consider both to be the "original" (assuming sufficiently accurate scanning & simulating).

Comment author: TheOtherDave 01 October 2013 11:48:31PM 0 points [-]

Here's what TheOtherDave actually said.

Comment author: TheOtherDave 01 October 2013 11:46:48PM 1 point [-]

Well, let's call P1 my probability of actually going to the beach, even if you never show up. That is, (1-P1) is the probability that traffic keeps me from getting there, or my car breaks down, or whatever. And let's call P2 my probability of your VR/simulation rig working. That is, (1-P2) is the probability that the scanner fails, etc. etc.

In your scenario, I put a P1 probability of ending up at the beach, and a P2 probability of ending up bowling. If both are high, then I'm confident that I will do both.

There is no "instead of". Going to the beach does not prevent me from bowling. Going bowling does not prevent me from going to the beach. Someone will go to the beach, and someone will go bowling, and both of those someones will be me.
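[One way to make the accounting above explicit, using the P1 and P2 already defined; treating the beach and bowling outcomes as independent is an added assumption for illustration.]

```latex
\[
  P(\text{beach}) = P_1, \qquad
  P(\text{bowling}) = P_2, \qquad
  P(\text{beach and bowling}) \approx P_1 \, P_2 .
\]
```

On this accounting the two outcomes are realized by different copies rather than being exclusive alternatives, so P1 and P2 are not required to sum to 1.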

Comment author: lavalamp 01 October 2013 11:54:43PM 0 points [-]

Your probabilities add up to more than 1...

Comment author: shminux 02 October 2013 12:09:21AM *  -1 points [-]

As I alluded to in another reply, assuming perfectly reliable scanning, and assuming that you hate losing at bowling to MarkAI, how do you decide whether to go practice bowling or to do something else you like more?

Comment author: linkhyrule5 01 October 2013 10:56:03PM 1 point [-]

Yes, with probability P(simulation), or no, with probability P(not simulation), depending.

Comment author: shminux 01 October 2013 11:53:55PM -1 points [-]

I come back to tell you that I will run a simulation of you so we can go bowling together

Presumably you create a sim-me which includes the experience of having this conversation with you (the AI).

do you or do you not expect to experience bowling with me in the future, and why?

Let me interpret the term "expect" concretely as "I better go practice bowling now, so that sim-me can do well against you later" (assuming I hate losing). If I don't particularly enjoy bowling and would rather do something else, how much effort is warranted vs. doing something I like?

The answer is not unambiguous and depends on how much I (meat-me) care about future sim-me having fun and not embarrassing sim-self. If sim-me continues on after meat-me passes away, I care very much about sim-me's well-being. On the other hand, if the sim-me program is halted after the bowling game, then I (meat-me) don't care much about that sim-loser. After all, meat-me (who will not go bowling) will continue to exist, at least for a while. You might feel differently about sim-you, of course. There is a whole range of possible scenarios here. Feel free to specify one in more detail.

TL;DR: If the simulation will be the only copy of "me" in existence, I act as if I expect to experience bowling.
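[A toy sketch, in Python, of the trade-off described in the comment above: whether meat-me bothers to practice depends on how much weight meat-me puts on sim-me's experience. All of the numbers and the linear weighting are made-up illustrations, not anything shminux specifies.]

```python
# Toy expected-utility sketch of the "how much do I care about sim-me" question.
# The utilities and the caring weight below are arbitrary illustrative values.

def practice_is_worth_it(care_about_sim: float,
                         sim_benefit: float = 10.0,   # how much sim-me gains from doing well
                         practice_cost: float = 3.0   # how much meat-me dislikes practicing
                         ) -> bool:
    """care_about_sim: 0.0 means meat-me puts no weight on sim-me's experience
    (e.g. the sim is halted right after the game); 1.0 means full weight
    (e.g. sim-me is the only copy that continues after meat-me dies)."""
    return care_about_sim * sim_benefit > practice_cost

print(practice_is_worth_it(0.1))   # sim halted after the game -> False
print(practice_is_worth_it(0.9))   # sim-me outlives meat-me -> True
```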