This is written right after reading Rob Bensinger's relevant post and andeslodes' comments. That discussion touched on a topic I have long held a strong belief about. I proposed that the first-person perspective is not physically reducible and ought to be regarded as primitive. Following that, questions such as "which future mind is me?" or "which mind ought to be regarded as myself in the future?" do not have an unequivocal logical answer.

To demonstrate this position, imagine this: You are Elon Musk instead of whoever you actually are. This is not to suggest that your physical body and Elon's switch places. The world is still objectively the same. But instead of experiencing it from the perspective of your current body (e.g. in my case that of Dadadarren's), you now do so from that of Elon's. The subjective experience felt and the consciousness accessible are now from the billionaire's physical point of view instead of your current one, viz. you are Elon.

Everyone but Elon himself would say the above is a different scenario from reality. Each of us knows which body our first-person perspective resides in. And that is clearly not the physical human being referred to as Elon Musk. But the actual and imaginary scenarios are not differentiated by any physical difference in the world, as the universe is objectively identical.

So, to quote Arnold Zuboff (not verbatim), it is a question of "Why is it that you are you and I am me?" (hopefully with the above context this doesn't sound like a tautological or question-begging query). It is something without a physical explanation. I have long held that "which person is me?" is primitively known. (The more appropriately worded question would be "which thing is me?", as self-identification happens prior to even the conception of personhood.) It is a fiat fact so fundamentally clear to each of us that it doesn't have or need any explanation: the only accessible experience comes from this body, and that's me. Nothing more to it.

In problems involving brain-copying machines, "which brain is me?" ought to be answered the same way: once the copying process is over and I find myself waking up as one of the brains, the answer would be apparent. But short of that, without subjectively experiencing from the perspective of one of the brains, there is no way to analyze which of the two I would wake up to be (other than outright stipulation).

This experience-based primitivity also means inter-temporal self-identification only goes one way. Since there is no access to subjective experience from the future, I cannot directly identify who would be my future self. I can only say which person is me in the past, as I have the memory of experiencing from its perspective.

Treating the future human being who will recognize the current me as his past self, the one whose body continuously evolved from mine, as myself in the future is something everyone seems to practice. It provides better chances of survival for said physical body, which explains why it is such a common intuition. I purposely refer to it as an intuition, as it is neither a rigorously deduced logical conclusion nor a primitive identification. It is a practice so common we rarely have to justify it; a consensus.

Devices such as mind uploading and teletransportation go beyond the traditional intuition. Our intuition was formed in an idiosyncratic circumstance, and it proved useful in that situation. Answers to questions of future self involving those devices cannot be purely derived from our old consensus. They would invariably involve reinterpreting and/or expanding it. And that is not a strictly logical exercise but a heavily value-laden one.

The consensus is no more. One might say the traditional intuition still holds water without the same, continuously evolved physical body, so that I shall regard a teletransported copy of me as my future self without a problem. Others might hold that the traditional intuition doesn't depend on the physical substrate, so I must regard any future mind that considers my current self as its past as a future me; that is, I shall regard the uploaded mind as a future self no less than my old-fashioned carbon body. Still others might say the better survival chance for the physical body is what drives the original intuition, and it makes no sense to disregard that: so neither an uploaded mind nor a teletransported copy qualifies as myself in the future... None of that would be, logically speaking, wrong. They just diverge at the axiom level; their difference stems not from distinctions in their respective optimization logic, but from which objective was set to be optimized at the very beginning.

So we should be skeptical of claims to solve such questions by superior logic. "Which future minds ought to be regarded as myself in the future?" is more a discussion of whose starting point is better than of whose reasoning is. Proponents of a particular camp are, at the end of the day, promoting an implied set of objectives that ought to be pursued.


But instead of experiencing it from the perspective of your current body (e.g. in my case that of Dadadarren's), you now do so from that of Elon's. The subjective experience felt and the consciousness accessible are now from the billionaire's physical point of view instead of your current one, viz. you are Elon.

I don't think this dualism is justified.  There IS NO distinct "me", separate from the embodiment in the brain that is physically separated from Elon's embodied experiences.  My subjective experience is inseparable from the perspective of my current body.  Elon's experience (presumably) is also simply part of his physical state.   In other words, I am Elon, when Elon's body is the subject of the sentence.

I fully agree that intuitions about identity and continuity are not trained on upload and copy situations.  Those things are just out-of-domain for our beliefs about how it works.

If one regards physics as a detached description of the world, a non-interacting yet apt depiction of objective reality (assuming that exists and is attainable), then yes, there is no distinct "me". Any account of subjective experience ought to be explained by physical processes, such that everyone's "MEness" must ultimately be reduced to the physical body.

However, my entire position stems from a different logical starting point. It starts with "me". It is an undeniable and fundamental fact that I am this particular thing, which I later refer to as a human being called Dadadarren. (Again, I assume the same goes for everyone.) Everything I know about the world is through that thing's interaction with its environment, which leads to accessible subjective experience. Even physics is learned in such a way, as is the conception that other things could have perspectives different from my own. That I am not interacting with the world as someone or something else, from those things' perspectives, is just a simple realization after that.

This way physics would not be taken as the detached fundamental description of objective reality. The description has to originate from a given thing's perspective, working based on its interaction with the environment. That given perspective could be mine, could be Elon's, could be a thermometer's or an electron's. We strive for concepts and formulas that work from a wide range of perspectives. That's what physical objectivity should mean.

So it follows that physics cannot explain why I am Dadadarren and not Elon: because perspective is prior. This makes way more sense to me personally: the physical knowledge about the two human beings doesn't even touch on why I am Dadadarren and not Elon (and that was the purpose of the thought experiment). At least it is better than the alternatives: that there is no ME, or that I am Elon just as I am me, in some convoluted sense such as open individualism.

So from where I stand, it is physicalism that requires justification. 

Oh, I wonder if our crux is 

an undeniable and fundamental fact that I am this particular thing, which I later refer to as a human being called Dadadarren

I don't dispute that you have an undeniable and fundamental fact that you experience things which you summarize as Dadadarren.  I do question what you mean by "this particular thing".  I don't know that our model of physics is true or complete (in fact, I suspect it's not), but I don't see any reason to believe that conscious experiences (me-ness) are somehow separate from underlying physical processes.

Have you seen Relativity Theory for What the Future 'You' Is and Isn't? It's making pretty much exactly the same point, in response to the same post. I didn't get around to commenting there, so here we are.

You are technically correct, but I don't think that's the best kind of correct.

You could decide that future you isn't you if it's wearing a blue hat, but that would make no sense at all. So I don't think logic is out of play here. Technically those could be your values, but it would be really remarkable to find someone that genuinely held those values.

Threaten someone today, then tomorrow tell them you're a different person, because that's what your values say. They'll look at you funny and go on distrusting and disliking you.

If you threaten someone, then go through a perfect duplication process, they'll dislike and distrust both of you. I doubt you'll find this relevant, but just to note that both perfect clones are you from an outside perspective. Almost no one would disagree, trusting and liking the clone that happened to be the duplicate while still distrusting the original.

"But which am I, really?" you ask again. I submit that the you of today is logically both, or neither, depending on your definition. Your conscious experience of right now does not "have" the conscious experience of you tomorrow. Each moment of consciousness is linked only by memories, beliefs, and having very similar patterns. No two moments of consciousness are the same; when we say they're the same individual, that's all we mean.

This sounds absurd. Of course the you of tomorrow is you, and a clone is not you.

But people can and do treat the them of tomorrow as not-them. The most dramatic is extreme short-term decision-making, as in the meme where someone leaves a note for drunk them to drink water and eat food, and in the morning finds a note back from drunk-them saying "fuck you, morning self!". Other I'll-pay-for-this-later decisions are partly in the same boat.

So, which perfect clone is you tomorrow? Both if you take a standard view of identity and why to care about future you. Neither if you don't. But choosing one as "you" and not the other is just like disowning future you if they wear a blue hat. 

I think this discussion is focusing on how others would behave toward me, and deriving what ought to be regarded as my future self from there. That is certainly a valid discussion to be had. However, my post is talking about a different (though related) topic.

For example, suppose I, for whatever crazy reason, think that the me from tomorrow, the one with (largely) the same physical body and no tricks on memory whatsoever, is not my future self. Then I would do a bunch of irresponsible things that would lead to others' dislike or hostility toward me and could eventually lead to my demise. But so what? If I regard that as a different person, then to hell with him. The current me wouldn't even care. So being detrimental to that future person would not compel the current me to regard him as my future self.

Luckily we do not behave that way. Everyone, the rational ones at least, considers the person with the same physical body and memories of the current self as themselves in the future. That is the survival instinct; that is the consensus.

But that consensus is about an idiosyncratic situation: one where memory (experience) and physical body are bound together. Take that away, and we no longer have a clear, unequivocal basis to start a logical discussion. Someone whose basis is the survival of the same physical body would not step into the teletransporter, even if it is greatly convenient and would benefit the one who steps out of it. Someone else could start from a different basis. They may believe that only patterns matter, so mind uploading into a silicon machine to make easy copies, at the cost of adversely affecting the carbon body, would be welcomed. None of these positions could be rebutted by the cost/benefit analysis of some future minds, because they may or may not care about those minds, at different levels, in the first place.

Sure, logic is not entirely irrelevant. It comes into play after you pick the basis of your decision. But values, rather than logic, largely determine the answer to the question.

I agree with all of that.

It's probably relevant to think about why we tend to value our future selves in the first place. I think it's that each of us has memories (and the resulting habits) of thinking "wow, past self really screwed me over. I hate that. I think I'll not screw future self over so that doesn't happen again". We care because there's a future self that will hate us if we do, and we can imagine it very vividly. In addition, there's an unspoken cultural assumption that it's logical to care about our future selves.

I included some of how other people regard our identity, but that's not my point. My point is that, for almost any reason whatsoever you could come up with to value your physically continuous future self, you'd also value a physically discontinuous future self that maintains the same mind-pattern. That's except for deciding "I no longer care about anything that teleports", which is possible and consistent, but no more sensible than stopping caring about anything wearing blue hats.

So sure, people aren't necessarily logically wrong if they value their physically continuous future self over a perfect clone (or upload). But they probably are making a logic error, if they have even modestly consistent values.

To demonstrate this position, imagine this: You are Elon Musk instead of whoever you actually are. This is not to suggest that your physical body and Elon's switch places. The world is still objectively the same. But instead of experiencing it from the perspective of your current body (e.g. in my case that of Dadadarren's), you now do so from that of Elon's. The subjective experience felt and the consciousness accessible are now from the billionaire's physical point of view instead of your current one, viz. you are Elon.

It's not clear what you mean here. As we are talking about the question "Which future mind is me?", could you make the example in its terms? For instance, consider this:

  1. Dadadarren and Elon Musk participate in an experiment where their brains are swapped between their bodies. Both go to sleep, then the operation is performed, and then they awake. Dadadarren's first-person perspective goes to sleep in Dadadarren's body and awakens in Elon Musk's, while having memories of being Dadadarren. As a result, both perspectives inhabiting the bodies of Dadadarren and Elon Musk notice the change.
  2. Nothing has physically changed about the brains and bodies of Elon Musk or Dadadarren, even though Dadadarren's and Elon Musk's first-person perspectives were swapped. Now it's not Elon Musk's perspective that awakens in Elon Musk's body, but Dadadarren's. However, no one notices anything. When Elon Musk's body awakens, its perspective remembers being Elon Musk and not Dadadarren, because memories are physical and therefore were not changed.

In the second case, how do we know that anything happened at all? How do we know that the perspectives were swapped? Or, for that matter, how do we know that all perspectives are not constantly being swapped a hundred times every second? Here the "first-person perspective" seems to be not just an epiphenomenon but an epi-epiphenomenon, completely irrelevant to anything, even to the conscious experience of the person in question. Which is probably not what you meant, so I'm a bit at a loss.

Suffice to say, thinking about such a scenario doesn't make me see the question "Why is it that you are you and I am me?" as less of a tautology.

Everyone but Elon himself would say the above is a different scenario from reality. Each of us knows which body our first-person perspective resides in. And that is clearly not the physical human being referred to as Elon Musk. But the actual and imaginary scenarios are not differentiated by any physical difference in the world, as the universe is objectively identical.

They are either differentiated by a physically different location of some part of your experience - like your memory being connected to Elon's sensations, or your thoughts being executed in another location - or it would be wrong to say that this scenario is different from reality: what you imagine would just correctly correspond to Elon's experiences also being real.

This experience-based primitivity also means inter-temporal self-identification only goes one way. Since there is no access to subjective experience from the future, I cannot directly identify who would be my future self. I can only say which person is me in the past, as I have the memory of experiencing from its perspective.

While there is a large difference in practice between recalling a past event and anticipating a future one, on a conceptual level there is no meaningful difference. You don't have direct access to past events; memory is just an especially simple and reliable case of inference.

Would you say that the continuity of your consciousness (as long as you're instantiated by only one body) only exists by consensus?

What if the consensus changed? Would you cease to have the continuity of consciousness?

If the continuity of your consciousness currently doesn't depend on consensus, why think that your next conscious experience is undefined in case of a duplication? (Rather than, let's say, assigning even odds to finding yourself to be either copy?)

Also, I see no reason for thinking the idea of your next subjective experience being undefined (there being no fact of the matter as to which conscious experience, if any, you'll have) is even a coherent possibility. It's clear what it would mean for your next conscious experience to be something specific (like feeling pain while seeing blue). It's also clear what it would mean for it to be NULL (like after a car accident). But it being undefined doesn't sound like a coherent belief.


You are Elon Musk instead of whoever you actually are.

This is a combination of descriptions only locally accurate in two different worlds and not coherent as a thought experiment asking about the one world fitting those descriptions.

Agreed. The premise that physical reality is identical is false, unless you're a hardcore non-physicalist of some sort. Something physical needs to be piping the sensory experience from Musk's body to your brain, because there's little question that your brain is what makes (or whose unfolding pattern, is) your conscious experience.