Mark_Friedenbach comments on Continuity in Uploading - Less Wrong Discussion
Ok, what if, like in Eternal Sunshine of the Spotless Mind, I slowly eliminate your memories over a period of time? Then maybe, like in Dark City, I go in and insert new memories, maybe generic, maybe taken from someone else. This can be done either quickly or slowly, if that matters.
This future continuation of your current self will have nothing other than a causal & computational connection to your current identity. No common memories whatsoever. Would you expect to experience what this future person experiences?
Based on your other comments, I infer that you consider this question entirely different from the question "Are you willing to consider this future person you?" Confirm?
Correct.
Cool, thanks. Given that, and answering for my own part: I'm not sure what any person at any time would possibly ever observe differentially in one case or the other, so I honestly have no idea what I'd be expecting or not expecting in this case. That is, I don't know what the question means, and I'm not sure it means anything at all.
That's fair enough. You got the point with your first comment, which was to point out that issues of memory-identity and continuous-experience-identity are, or at least could be, separate.
Perhaps I understand more than I think I do, then.
It seems to me that what I'm saying here is precisely that those issues can't be separated, because they predict the same sets of observations. The world in which identity is a function of memory is in all observable ways indistinguishable from the world in which identity is a function of continuous experience. Or, for that matter, of cell lineages or geographical location or numerological equivalence.
And I'm saying that external observations are not all that matters. Indeed it feels odd to me to hold that view when the phenomenon under consideration is subjective experience itself.
I didn't say "external observations."
I said "observations."
If you engage with what I actually said, does it feel any less odd?
You said "predict the same set of observations" which I implicitly took to mean "tell me something I can witness to update my beliefs about which theory is correct," to which the answer is: there is nothing you - necessarily external - can witness to know whether my upload is death-and-creation or continuation. I alone am privy to that experience (continuation or oblivion), although the recorded memory is the same in either case so there's no way clone could tell you after.
You could use a model of consciousness and a record of events to infer which outcome occurred. And that's the root issue here: we have different models of consciousness and therefore make different inferences.
You keep insisting on inserting that "external" into my comment, just as if I had said it, when I didn't. So let me back up a little and try to be clearer.
Suppose the future continuation of my current self that you describe (let's label him "D" for convenience) comes to exist in the year 2034.
Suppose D reads this exchange in the LessWrong archives and happens to idly wonder whether he is, in fact, the same person who participated in LessWrong under the username TheOtherDave back in January 2014 and subsequently went through the process you describe.
"Am I the same person as TheOtherDave?" D asks. "Is TheOtherDave having my experiences?"
What ought D to expect to observe differentially if the answer were "yes" vs. "no"? This is not a question about external observations, as there's no external observer to make any such observations. It's simply a question about observations.
And as I said initially, it seems clear to me that no such differentially expected observations exist... not just no external observations, but no observations, period. As you say, it's just a question about models: specifically, what model of identity D uses.
Similarly, whether I expect to be the same person experiencing what D experiences is a question about what model of identity I use.
And if D and I disagree on the matter, neither of us is wrong, because it's not the sort of question we can be wrong about. We're "not even wrong," as the saying goes. We simply have different models of identity, and there's no actual territory for those models to refer to. There's no fact of the matter.
Similarly, if I decide that you and I are really the same person, even though I know we don't share any memories, physical cells, or anything like that, because I have a model of identity that doesn't depend on any of that stuff... well, I'm not even wrong about that.