TheOtherDave comments on Timeless Identity - Less Wrong

Post author: Eliezer_Yudkowsky 03 June 2008 08:16AM

Comment author: [deleted] 01 October 2013 09:17:51PM 0 points [-]

This just boils down to "what do I expect to experience in the future?" which I don't see as being related to "personal identity".

Forget the phrase "personal identity". If I am a powerful AI from the future and I come back to tell you that I will run a simulation of you so we can go bowling together, do you or do you not expect to experience bowling with me in the future, and why?

Comment author: TheOtherDave 01 October 2013 10:51:12PM *  1 point [-]

Suppose that my husband and I believe that while we're sleeping, someone will paint a blue dot on either my forehead, or my husband's, determined randomly. We expect to see a blue dot when we wake up... and we also expect not to see a blue dot when we wake up. This is a perfectly reasonable state for two people to be in, and not at all problematic.

Suppose I believe that while I'm sleeping, a powerful AI will duplicate me (if you like, in such a way that both duplicates experience computational continuity with the original) and paint a blue dot on one duplicate's forehead. When I wake up, I expect to see a blue dot when I wake up... and I also expect not to see a blue dot when I wake up. This is a perfectly reasonable state for a duplicated person to be in, and not at all problematic.

Similarly, I both expect to experience bowling with you, and expect to not experience bowling with you (supposing that the original continues to operate while the simulation goes bowling).

Comment author: [deleted] 01 October 2013 11:29:35PM *  0 points [-]

The situation isn't analogous, however. Let's posit that you're still alive when the simulation is run. In fact, aside from technology there's no reason to put it in the future or involve an AI. I'm a brain-scanning researcher who shows up at your house tomorrow, with all the equipment to do a non-destructive mind upload and whole-brain simulation. I tell you that I am going to scan your brain, start the simulation, then don VR goggles and go virtual-bowling with “you”. Once the scanning is done you and your husband are free to go to the beach or whatever, while I go bowling with TheVirtualDave.

What probability would you put on you ending up bowling instead of at the beach?

Comment author: lavalamp 01 October 2013 11:42:28PM 0 points [-]

Prediction: TheOtherDave will say 50%, and that Beach!Dave and Bowling!Dave would each consider both to be the "original". Assuming sufficiently accurate scanning & simulating.

Comment author: TheOtherDave 01 October 2013 11:48:31PM 0 points [-]

Here's what TheOtherDave actually said.

Comment author: lavalamp 01 October 2013 11:55:21PM 0 points [-]

Yes, looks like that prediction is falsified. At least the first sentence. :)

Comment author: TheOtherDave 01 October 2013 11:46:48PM 1 point [-]

Well, let's call P1 my probability of actually going to the beach, even if you never show up. That is, (1-P1) is the probability that traffic keeps me from getting there, or my car breaks down, or whatever. And let's call P2 my probability of your VR/simulation rig working. That is, (1-P2) is the probability that the scanner fails, etc. etc.

In your scenario, I put a P1 probability of ending up at the beach, and a P2 probability of ending up bowling. If both are high, then I'm confident that I will do both.

There is no "instead of". Going to the beach does not prevent me from bowling. Going bowling does not prevent me from going to the beach. Someone will go to the beach, and someone will go bowling, and both of those someones will be me.
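Put numerically (with made-up illustrative values for P1 and P2, since the comment doesn't give any):

```python
# Made-up illustrative values: both events are likely, and they are
# not mutually exclusive -- they happen to two different successors.
p1_beach = 0.95    # P1: original-me actually gets to the beach
p2_bowling = 0.90  # P2: the scan/simulation works, so sim-me bowls

# The two probabilities need not sum to 1, since neither outcome
# excludes the other:
print(p1_beach + p2_bowling)  # ~1.85

# Chance that at least one of me has the expected day out
# (treating the two failure modes as independent):
print(1 - (1 - p1_beach) * (1 - p2_bowling))  # ~0.995
```

The sum exceeding 1 is just what happens for non-exclusive events; nothing about it requires one outcome to crowd out the other.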

Comment author: lavalamp 01 October 2013 11:54:43PM 0 points [-]

Your probabilities add up to more than 1...

Comment author: TheOtherDave 01 October 2013 11:59:46PM 1 point [-]

Of course they do. Why shouldn't they?

What is your probability that you will wake up tomorrow morning?
What is your probability that you will wake up Friday morning?
I expect to do both, so my probabilities of those two things add up to ~2.

In Mark's scenario, I expect to go bowling and I expect to go to the beach.
My probabilities of those two things similarly add up to ~2.

Comment author: lavalamp 02 October 2013 12:13:58AM 1 point [-]

I think we have the same model of the situation, but I feel compelled to normalize my probability. A guess as to why:

I can rephrase Mark's question as, "In 10 hours, will you remember having gone to the beach or having bowled?" (Assume the simulation will continue running!) There'll be a you that went bowling and a you that went to the beach, but no single you that did both of those things. Your successive wakings example doesn't have this property.

I suppose I answer 50% to indicate my uncertainty about which future self we're talking about, since there are two possible referents. Maybe this is unhelpful.

Comment author: TheOtherDave 02 October 2013 12:44:38AM *  1 point [-]

Yes, that seems to be what's going on.

That said, normalizing my probability as though there were only going to be one of me at the end of the process doesn't seem at all compelling to me. I don't have any uncertainty about which future self we're talking about -- we're talking about both of them.

Suppose that you and your husband are planning to take the day off tomorrow, and he is planning to go bowling, and you are planning to go to the beach, and I ask the two of you "what's y'all's probability that one of y'all will go bowling, and what's y'all's probability that one of y'all will go to the beach?" It seems the correct answers to those questions will add up to more than 1, even though no one person will experience bowling AND going to the beach. In 10 hours, one of you will remember having gone to the beach, and one will remember having bowled.

This is utterly unproblematic when we're talking about two people.

In the duplication case, we're still talking about two people, it's just that right now they are both me, so I get to answer for both of them. So, in 10 hours, I (aka "one of me") will remember having gone to the beach. I will also remember having bowled. I will not remember having gone to the beach and having bowled. And my probabilities add up to more than 1.

I recognize that it doesn't seem that way to you, but it really does seem like the obvious way to think about it to me.

Comment author: lavalamp 02 October 2013 12:59:53AM 0 points [-]

I recognize that it doesn't seem that way to you, but it really does seem like the obvious way to think about it to me.

I think your description is coherent and describes the same model of reality I have. :)

Comment author: [deleted] 02 October 2013 12:52:47AM *  0 points [-]

I can rephrase Mark's question as, "In 10 hours, will you remember having gone to the beach or having bowled?"

Yes. Probabilities aside, this is what I was asking.

I suppose I answer 50% to indicate my uncertainty about which future self we're talking about, since there are two possible referents.

I was asking a disguised question. I really wanted to know: "which of the two future selves do you identify with, and why?"

Comment author: lavalamp 02 October 2013 12:55:33AM *  1 point [-]

I was asking a disguised question. I really wanted to know: "which of the two future selves do you identify with, and why?"

Oh, that's easy. Both of them, equally. Assuming accurate enough simulations etc., of course.

ETA: Why? Well, they'll both think that they're me, and I can't think of a way to disprove the claim of one without also disproving the claim of the other.

Comment author: [deleted] 02 October 2013 08:00:20PM -1 points [-]

ETA: Why? Well, they'll both think that they're me, and I can't think of a way to disprove the claim of one without also disproving the claim of the other.

Any of the models of consciousness-as-continuity would offer a definitive prediction.

Comment author: shminux 02 October 2013 12:09:21AM *  -1 points [-]

As I alluded to in another reply, assuming perfectly reliable scanning, and assuming that you hate losing in bowling to MarkAI, how do you decide whether to go practice bowling or to do something else you like more?

Comment author: TheOtherDave 02 October 2013 12:31:06AM 0 points [-]

If it's important to me not to lose in bowling, I practice bowling, since I expect to go bowling. (Assuming uninteresting scanning tech.)
If it's also important to me to show off my rocking abs at the beach, I do sit-ups, since I expect to go to the beach.
If I don't have the time to do both, I make a tradeoff, and I'm not sure exactly how I make that tradeoff, but it doesn't involve assuming that going to the beach somehow happens more or less than going bowling.

Admittedly, this presumes that the bowling-me will go on to live a normal lifetime. If I know the simulation will be turned off right after the bowling match, I might not care so much about winning the bowling match. (Then again, I might care a lot more.) By the same token, if I know the original will be shot tomorrow morning I might not care so much about my abs. (Then again, I might care more. I'm really not confident about how the prospect of upcoming death affects my choices; still less how it does so when I expect to keep surviving as well.)