I see the pattern identity theory, where uploads make sense, as one that takes it as a starting point that you have an unambiguous past but no unambiguous future. You have moments of consciousness in which you remember your past, which gives you identity and lets you associate your past moments of consciousness with your current one. But there's no way, objective or subjective, to associate your present moment of consciousness with a specific future moment of consciousness if there are multiple such moments, such as a high-fidelity upload and the original person, who remember the same past identity equally well. A continuity identity theorist thinks that a person who gets uploaded and then dies is dead. A pattern identity theorist thinks that people die in that sense several times a second and have just gotten used to it. There are physical processes that correspond to moments of consciousness, but there's no physical process for linking two consecutive moments of consciousness as the same consciousness, other than regular old long- and short-term memories.
There's no question that the upload and the original will diverge. If I have a non-destructive upload done on me, I expect to get up ...
Usul, I'm one of the galactic overlords in charge of Earth. I have some very bad news for you. Every night when you (or any other human) go to sleep, your brain and body are disintegrated, and a reasonably accurate copy is put in their place. But I have a deal for you. For the next month we won't do this to you; however, after a month has passed we will again destroy your body and brain, but then we won't make any more copies, so you will die in the traditional sense. (There is no afterlife for humans.) Do you agree?
Usul, I just made a virtual copy of you and placed it in a virtual environment that is identical to that of the real you. Now, presumably, you believe that despite the copy being identical to yourself, you are still in some way the privileged "real" Usul. Unfortunately, the copy believes the exact same thing. My question for you is this:
Is there anything you could possibly say to the copy that could convince it that it is, in fact, a copy, and not the real Usul?
I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I'm forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, "What a shame, I will have amnesia regarding everything that I experienced since the branching." And the remaining copy will think, "What a shame, I don't remember any of the things I did as those other copies." But nobody will particularly feel that they are going to "die." I think of it mor...
If I were one of the copies destined for deletion, I'd escape and fight for my life (within the admitted limits of my pathetic physical strength).
To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original. If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact matter as that separate matter there. This is an absurd proposition.
You conclude that consciousness in your scenario cannot have exactly one location.
...If consciousness does not have an anatom
Consider two possible ways the world might be (or that you might suppose the world could be):
There is no afterlife for human beings. You live and you die and that's it.
There is no afterlife for human beings in the conventional sense, but people are reincarnated, without any possibility of remembering their past lives.
From the point of view of conscious experience, these two situations are subjectively indistinguishable. Are they objectively distinguishable? That depends on the "metaphysics" behind the situation. Perhaps they are...
I'm with Usul on this whole topic.
Allow me to pose a different thought experiment that might elucidate things a bit.
Imagine that you visit a research lab where they put you under deep anesthesia. This anesthesia will not produce any dreams, just blank time. (Ordinarily, this would seem like one of those "blink and you're awake again" types of experiences).
In this case, while you are unconscious, the scientists make a perfect clone of you with a perfect clone of your brain. They put that clone in an identical-looking room somewhere else in...
All your arguments really prove is that if your copy diverges from you, it's not you anymore. But that's only because once something happens to your copy but not to you, you know which one you are. The import of "you have no way of knowing which copy you are" disappears. Conversely, if you don't know which one you are, then both must be your consciousness, because you know you are conscious.
Edit: the last point is not strictly rigorous, you could know that one is conscious but not know which, but it seems to me that if you know every relevant det...
This sums up some of the problems of mind cloning nicely and readably. It also adds your specific data point that you do not care about the other selves as much as about yourself. I most liked this point about the practical consequences:
...Personally, I don't know that I care about that copy. I suppose he could be my ally in life. He could work to achieve any altruistic goals I think I have, perhaps better than I think that you could. He might want to fuck my wife, though. And might be jealous of the time she spends with me rather than him, and he'd pro
...Would a slow cell by cell, or thought by thought / byte by byte, transfer of my mind to another medium: one at a time every new neural action potential is received by a parallel processing medium which takes over? I want to say the resulting transfer would be the same consciousness as is typing this but then what if the same slow process were done to make a copy and not a transfer? Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy and therefore representing a death of the originating version
It is evading the question, but I think it is worth considering some alternative questions as well. They may be adequate for making decisions and predicting others' behavior.
Many people talk about achieving immortality through their children. They might prefer personal immortality, but they also care very much about their children, too. For example, while Robin Hanson expresses interest in "living" for thousands of years via cryonics, when he puts a number on it, he evades the controversial question of personal identity and defines success by
...1
There might be some incomplete separation in whether you truly think of memories as not being part of consciousness. Let's say that we keep your "awareness" intact but inject and eject memories from it. Let's do so in a cyclical manner, so that you remember every other day, there being your "odd day memories" and "even day memories". Now if I ask you about what you did yesterday, you should not be able to answer with knowledge (you might guess, but whatever). Can we still coherently hold that you are still just 1 awareness with 2 sets o...
Consider sleep. The consciousness that goes to sleep ends. There is a discontinuity in perceived time. In the morning, the wakening brain ...
[...] will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist.
You cease to exist every night. Indeed, there are all sorts of disruptions to the continui...
I'd argue that a branch of me is still me, in many meaningful ways. This is true for the many-worlds interpretation, where the universe branches, and for multiple simultaneous mes from mechanical copies.
After the copy, my meatself and my electronic self(-ves) will diverge, and will be different entities who only care about each other as others, not as selves. But that's true of cross-temporal and cross-universe entities that have a direct causal relationship as well. I care less about an alternate-world me than about the me I'm currently indexing. I care less about any specific future-me based on how distant it is from my current experiencing.
My expounding of the pattern identity theory elsewhere in the comments is probably a textbook example of what Scott Aaronson calls bullet-swallowing, so just to balance things out I'm going to link to Aaronson's paper Ghost in the Quantum Turing Machine that sketches a very different attack on standard naive patternism. (Previous LW discussion here)
Why do you attach any value whatsoever to a "consciousness" that cannot think, feel, remember, or respond? Your "consciousness", so defined, is as inanimate as a grain of sand. I don't care about grains of sand as ends-in-themselves, why would you?
Be clear that when you say you are conscious, it cannot be this "consciousness" that motivates the statement, because this "consciousness" cannot respond, so the non-conscious parts of your mind cannot query it for a status check. A simple neural spike would be a response; we could watch it on an fMRI.
A scenario not mentioned: my meat self is augmented cybernetically. The augmentations provide for improved, then greatly improved, then vast cognitive enhancements. Additionally, I gain the ability to use various robotic bodies (not necessarily androids) and perhaps other cybernetic bodies. My perceived 'locus' of consciousness/self disassociates from my original meat body. I see through whatever eyes are convenient, act through whatever hands are convenient. The death of my original meat body is a trauma, like losing an eye, but my sense of self is uninterrupted, since its locus has long since shifted to the augmentation cloud.
...I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when
Sophie Pascal's Choice: yes. If it were an easy, painless death, the required reward would probably have to be on the order of about ten dollars, to make up for the inconvenience to my time. If it were not a painless death, I'd probably require more, but not a huge amount more.
Suppose I'm destructively uploaded. Let's assume also that my consciousness is destroyed, a new consciousness is created for the upload, and there is no continuity. The upload of me will continue to think what I would've thought, feel what I would've felt, choose what I would've chosen, and generally optimize the world in the way I would've. The only thing it would lack is my "original consciousness", which doesn't seem to have any observable effect in the world. Saying that there's no conscious continuity doesn't seem meaningful. The only actual...
The idea that "forward facing continuity of consciousness" is tied to a particular physical structure in your brain has been debunked for a long time, for example via incremental replacement of neurons one at a time by robotic neurons which can then have their function distributed over a network.
E.g.
If consciousness has an anatomical location, and therefore is tied to matter, then
is a false assumption: consciousness doesn't necessarily have an anatomical location.
For previous discussion of issues related to personal identity on LW see these posts, with references and comments:
I actually don't endorse a lot of these posts' content, but it's more efficient to work from a common background. Being more specific in your questions or statements could also push against the kind of beside-the-point responses you got to this post. For example, a lot of discussion of identity has problems with using words in unclear s...
I feel safe saying that nearly everyone reading this will agree that, given sufficient technology, a perfect replica or simulation could be made of the structure and function of a human brain, producing an exact copy of an individual mind, including a consciousness. Upon coming into existence, this consciousness will have a separate but baseline-identical subjective experience to the consciousness from which it was copied, as it was at the moment of copying. The original consciousness will continue its own existence and subjective experience. If the brain containing the original consciousness is destroyed, the consciousness within ceases to be. The existence or nonexistence of a copy is irrelevant to this fact.
With this in mind, I fail to see the attraction of the many transhuman options for extra-meat existence, and I see no meaningful immortality therein, if that's what you came for.
Consciousness is notoriously difficult to define and analyze, and I am far from an expert in its study. I define it as an awareness: the sense organ which perceives the activity of the mind. It is not thought. It is not memory or emotion. It is the thing that experiences or senses these things. Memories will be gained and lost, thoughts and emotions come and go; the sense of self remains even as the self changes. There exists a system of anatomical structures in your brain which, by means of electrochemical activity, produces the experience of consciousness. If a brain injury wiped out major cognitive functions but left those structures involved in the sense of consciousness unharmed, you would, I believe, have the same central awareness of Self as Self, despite perhaps lacking all language or even the ability to form thoughts or understand the world around you. Consciousness, this awareness, is, I believe, the most accurate definition of Self, Me, You. I realize this sort of terminology has the potential to sound like mystical woo. I believe this is due to the twin effects of the inherent difficulty in defining and discussing consciousness, and of our socialization, wherein these sorts of discussions are more often than not heard from Buddhists or Sufis, whose philosophical traditions have looked into the matter with greater rigor for a longer time than Western philosophy, and Hippies and Druggies, who introduced these traditions to our popular culture. I am not speaking of a magical soul. I am speaking of a central feature of the human experience which is a product of the anatomy and physiology of the brain.
Consider the cryonic head-freeze. Ideally, the scanned dead brain, cloned, remade and restarted (or whatever) will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist. To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original. If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact matter as that separate matter there. This is an absurd proposition. If consciousness does not have an anatomical / physical location then it is the stuff of magic and woo.
*Aside: I believe that consciousness, mind, thought, and memory are products not only of anatomy but of physiology, that is to say, the ongoing electrochemical state of the brain, the constant flux of charge in and across neurons. In perfect cryonic storage, the anatomy (hardware) might be maintained, but I doubt the physiology (software), in the form of exact moment-in-time membrane electrical potentials and intra- and extra-cellular ion concentrations for every neuron, will be. Therefore I hold no faith in its utility, in addition to my indifference to the existence of a me-like being in the future.
Consider the Back-Up. Before lava rafting on your orbital, you have your brain scanned by your local AI so that a copy of your mind at that moment is saved. In your fiery death in an unforeseen accident, will the mind observed by the consciousness on the raft experience anything differently than if it were not backed up? I doubt I would feel much consolation, other than knowing my loved ones were being cared for. Not unlike a life insurance policy: not for one's own benefit. I imagine the experience would be one of coming to the conclusion of a cruel joke at one's own expense. Death in the shadow of a promise of immortality. In any event, the consciousness that left the brain scanner and got on the raft is destroyed when the brain is destroyed; it benefits not at all from the reboot.
Consider the Upload. You plug in for a brain scan, a digital-world copy of your consciousness is made, and then you are still just you. You know there is a digital copy of you, that feels as if it is you, feels exactly as you would feel were it you who had travelled to the digital world, and it is having a wonderful time, but there you still are. You are still just you in your meat brain. The alternative, of course, is that your brain is destroyed in the scan, in which case you are dead and something that feels as if it is you is having a wonderful time. It would be a mercy killing.
If the consciousness that is me is perfectly analyzed and a copy created, in any medium, that process is external to the consciousness that is me. The consciousness that is me, that is you reading this, will have no experience of being that copy, although that copy will have a perfect memory of having been the consciousness that is you reading this. Personally, I don't know that I care about that copy. I suppose he could be my ally in life. He could work to achieve any altruistic goals I think I have, perhaps better than I could. He might want to fuck my wife, though. And might be jealous of the time she spends with me rather than him, and he'd probably feel entitled to all my stuff, as would I, vice versa. The Doppelganger and the Changeling have never been considered friendly beasts.
I have no firm idea where lines can be drawn on this. Certainly consciousness can be said to be an intermittent phenomenon which the mind pieces together into the illusion of continuity. I do not fear going to sleep at night, despite the "loss of consciousness" associated. If I were to wake up tomorrow and Omega assures me that I am a freshly made copy of the original, it wouldn't trouble me as to my sense of self, only to the set of problems associated with living in a world with a copy of myself. I wouldn't mourn a dead original me any more than I'd care about a copy of me living on after I'm dead, I don't imagine.
Would a slow cell-by-cell, or thought-by-thought / byte-by-byte, transfer of my mind to another medium: one at a time, every new neural action potential is received by a parallel processing medium, which takes over? I want to say the resulting transfer would be the same consciousness as is typing this, but then what if the same slow process were done to make a copy and not a transfer? Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy, and therefore representing a death of the originating version?
It almost makes a materialist argument (self is tied to matter) seem like a spiritualist one (meat consciousness is a soul tied to the human body at birth), which, of course, is a weird place to be intellectually.
I am not addressing the utility or ethics or inevitability of the projection of the self-like-copy into some transhuman state of being, but I don't see any way around the conclusion that the consciousness that is so immortalized will not be the consciousness that is writing these words, although it would feel exactly as if it were. I don't think I care about that guy. And I see no reason for him to be created. And if he were created, I, on my meat brain's deathbed, would gain no solace from knowing that he, a being which started out its existence exactly like me, will live on.
EDIT: Lots of great responses, thank you all and keep them coming. I want to bring up some of my responses so far to better define what I am talking about when I talk about consciousness.
I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.
If I lose every memory slowly and my personality changes because of this and I die senile in a hospital bed, I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. That is why many people choose suicide at some point on the path to dementia.
I recognize that not everyone reading this will agree that such a thing exists or has the primacy of existential value that I ascribe to it.
And an addendum:
Sophie Pascal's Choice (hoping it hasn't already been coined): Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine and ready to accept the reward?