I feel safe saying that nearly everyone reading this will agree that, given sufficient technology, a perfect replica or simulation could be made of the structure and function of a human brain, producing an exact copy of an individual mind including a consciousness.  Upon coming into existence, this consciousness will have a separate but baseline-identical subjective experience to the consciousness from which it was copied, as it was at the moment of copying.  The original consciousness will continue its own existence/subjective experience.  If the brain containing the original consciousness is destroyed, the consciousness within ceases to be.  The existence or non-existence of a copy is irrelevant to this fact.

With this in mind, I fail to see the attraction of the many transhuman options for extra-meat existence, and I see no meaningful immortality therein, if that's what you came for.

Consciousness is notoriously difficult to define and analyze, and I am far from an expert in its study.  I define it as an awareness: the sense organ which perceives the activity of the mind.  It is not thought.  It is not memory or emotion.  It is the thing that experiences or senses these things.  Memories will be gained and lost, thoughts and emotions come and go, the sense of self remains even as the self changes.  There exists a system of anatomical structures in your brain which, by means of electrochemical activity, produces the experience of consciousness.  If a brain injury wiped out major cognitive functions but left those structures involved in the sense of consciousness unharmed, you would, I believe, have the same central awareness of Self as Self, despite perhaps lacking all language or even the ability to form thoughts or understand the world around you.  Consciousness, this awareness, is, I believe, the most accurate definition of Self, Me, You.  I realize this sort of terminology has the potential to sound like mystical woo.  I believe this is due to the twin effects of the inherent difficulty in defining and discussing consciousness, and of our socialization wherein these sorts of discussions are more often than not heard from Buddhists or Sufis, whose philosophical traditions have looked into the matter with greater rigor for a longer time than Western philosophy, and from Hippies and Druggies who introduced these traditions to our popular culture.  I am not speaking of a magical soul.  I am speaking of a central feature of the human experience which is a product of the anatomy and physiology of the brain.

Consider the cryonic head-freeze. Ideally, the scanned dead brain, cloned, remade and restarted (or whatever) will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist. To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original.  If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact same matter as that separate matter there. This is an absurd proposition.  If consciousness does not have an anatomical / physical location then it is the stuff of magic and woo.

*Aside: I believe that consciousness, mind, thought, and memory are products not only of anatomy but of physiology, that is to say the ongoing electrochemical state of the brain, the constant flux of charge in and across neurons.  In perfect cryonic storage, the anatomy (hardware) might be maintained, but I doubt the physiology (software), in the form of exact moment-in-time membrane electrical potentials and intra- and extra-cellular ion concentrations for every neuron, will be.  Therefore I hold no faith in its utility, in addition to my indifference to the existence of a me-like being in the future.

Consider the Back-Up. Before lava rafting on your orbital, you have your brain scanned by your local AI so that a copy of your mind at that moment is saved.  In the event of your fiery death in an unforeseen accident, will the mind observed by the consciousness on the raft experience anything differently than if it were not backed up? I doubt I would feel much consolation, other than knowing my loved ones were being cared for.  Not unlike a life insurance policy: not for one's own benefit.  I imagine the experience would be one of coming to the conclusion of a cruel joke at one's own expense.  Death in the shadow of a promise of immortality.  In any event, the consciousness that left the brain scanner and got on the raft is destroyed when the brain is destroyed; it benefits not at all from the reboot.

Consider the Upload.  You plug in for a brain scan, a digital-world copy of your consciousness is made, and then you are still just you.  You know there is a digital copy of you, that feels as if it is you, feels exactly as you would feel were it you who had travelled to the digital-world, and it is having a wonderful time, but there you still are. You are still just you in your meat brain.  The alternative, of course, is that your brain is destroyed in the scan in which case you are dead and something that feels as if it is you is having a wonderful time.  It would be a mercy killing.

If the consciousness that is me is perfectly analyzed and a copy created, in any medium, that process is external to the consciousness that is me.  The consciousness that is me, that is you reading this, will have no experience of being that copy, although that copy will have a perfect memory of having been the consciousness that is you reading this.  Personally, I don't know that I care about that copy.  I suppose he could be my ally in life.  He could work to achieve any altruistic goals I think I have, perhaps better than I could.  He might want to fuck my wife, though.  And might be jealous of the time she spends with me rather than him, and he'd probably feel entitled to all my stuff, as I would vice versa. The Doppelganger and the Changeling have never been considered friendly beasts.

I have no firm idea where lines can be drawn on this.  Certainly consciousness can be said to be an intermittent phenomenon which the mind pieces together into the illusion of continuity.  I do not fear going to sleep at night, despite the "loss of consciousness" associated. If I were to wake up tomorrow and Omega assured me that I am a freshly made copy of the original, it wouldn't trouble me as to my sense of self, only as to the set of problems associated with living in a world with a copy of myself.  I wouldn't mourn a dead original me any more than I'd care about a copy of me living on after I'm dead, I don't imagine.

What about a slow, cell-by-cell or thought-by-thought / byte-by-byte, transfer of my mind to another medium, in which, one at a time, every new neural action potential is received by a parallel processing medium which takes over?  I want to say the resulting transfer would be the same consciousness as is typing this, but then what if the same slow process were done to make a copy and not a transfer?  Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy, and therefore representing the death of the originating version?

It almost makes a materialist argument (self is tied to matter) seem like a spiritualist one (meat consciousness is soul is tied to human body at birth) which, of course, is a weird place to be intellectually.

I am not addressing the utility or ethics or inevitability of the projection of the self-like-copy into some transhuman state of being, but I don't see any way around the conclusion that the consciousness that is so immortalized will not be the consciousness that is writing these words, although it would feel exactly as if it were.  I don't think I care about that guy.  And I see no reason for him to be created. And if he were created, I, on my meat brain's deathbed, would gain no solace from knowing he, a being which started out its existence exactly like me, will live on.

EDIT: Lots of great responses, thank you all and keep them coming.  I want to bring up some of my responses so far to better define what I am talking about when I talk about consciousness.

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.

If I lose every memory slowly and my personality changes because of this and I die senile in a hospital bed, I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. That is why many people choose suicide at some point on the path to dementia.

I recognize that not everyone reading this will agree that such a thing exists or has the primacy of existential value that I ascribe to it.

And an addendum:
Sophie Pascal's Choice (hoping it hasn't already been coined): Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine and ready to accept the reward?


I see the pattern identity theory, where uploads make sense, as one that takes it as a starting point that you have an unambiguous past but no unambiguous future. You have moments of consciousness where you remember your past, which gives you identity, and lets you associate your past moments of consciousness to your current one. But there's no way, objective or subjective, to associate your present moment of consciousness to a specific future moment of consciousness, if there are multiple such moments, such as a high-fidelity upload and the original person, who remember the same past identity equally well. A continuity identity theorist thinks that a person who gets uploaded and then dies is dead. A pattern identity theorist thinks that people die in that sense several times a second and have just gotten used to it. There are physical processes that correspond to moments of consciousness, but there's no physical process for linking two consecutive moments of consciousness as the same consciousness, other than regular old long and short term memories.

There's no question that the upload and the original will diverge. If I have a non-destructive upload done on me, I expect to get up ...

7Usul8y
Thanks for the reply. I am not convinced by the pattern identity theorist because, I suppose, I do not see the importance of memory in the matter, nor the thoughts one might have about those thoughts. If I lose every memory slowly and die senile in a hospital bed, I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. I hold that that being, which will bear no resemblance to my current intellect and personality, will be my Self in a way that an uploaded copy with my current memory and personality can never be. I probably should have tabooed "consciousness" from the get-go, as there is no one universal definition. For me it is passive awareness. This meat awareness in my head will never get out and it will die, and no amount of cryonics or uploading to create a perfect copy that feels as if it is the same meat awareness will change that. Glass half-empty I suppose.
9Risto_Saarelma8y
You're still mostly just arguing for your personal intuition for the continuity theory though. People have been doing that pretty much as long as we've had fiction about uploads or destructive teleportation, with not much progress to the arguments. How would you convince someone sympathetic to the pattern theory that the pattern theory isn't viable? FWIW, after some earlier discussions about this, I've been meaning to look into Husserl's phenomenology to see if there are some more interesting arguments to be found there. That stuff gets pretty weird and tricky fast though, and might be a dead end anyway.
4Usul8y
Honestly, I'm not sure what other than intuition and subjective experience we have to go with in discussing consciousness. Even the heavy hitters in the philosophy of consciousness don't 100% agree that it exists. I will be the first to admit I don't have the background in pattern theory or the inclination to get into a head-to-head with someone who does. If pressed, right now I'm leaning towards the matter-based argument, that if consciousness is not magical then it is tied to specific sets of matter. And that a set of matter cannot exist in multiple locations. Therefore a single consciousness cannot exist in multiple locations. The consciousness A that I am now is in matter A. If a copy consciousness B is made in matter B and matter A continues to exist then it is reasonable to state that consciousness A remains in matter A. If matter A is destroyed there is no reason to assume consciousness A has entered matter B simply because of this. You are in A now. You will never get to B. So, if it exists, and it is you, you're stuck in the meat. And undeniably, someone gets stuck in the meat. I imagine differing definitions of You, self, consciousness, etc would queer the deal before we even got started.
4Risto_Saarelma8y
So, there are two things we need to track here, and you're not really making a distinction between them. There are individual moments of consciousness, which, yes, probably need to be on a physical substrate that exists in the single location. This is me saying that I'm this moment of conscious experience right now, which manifests in my physical brain. Everybody can be in agreement about this one. Then there is the continuity of consciousness from moment to moment, which is where the problems show up. This is me saying that I'm the moment of conscious experience in my brain right now, and I'm also going to be the next moment of conscious experience in my brain. The problems start when you want to say that the moment of consciousness in your brain now and the moment of consciousness in your brain a second in the future are both "your consciousness" and the moment of consciousness in your brain now and the moment of consciousness in your perfect upload a second in the future are not. There is no actual "consciousness" that refers to things other than the single moments for the patternist. There is momentary consciousness now, with your memories, then there is momentary consciousness later, with your slightly evolved memories. And on and on. Once you've gone past the single eyeblink of consciousness, you're already gone, and a new you might show up once, never, or many times in the future. There's nothing but the memories that stay in your brain during the gap laying the claim for the you-ness of the next moment of consciousness about to show up in a hundred or so milliseconds.
2Usul8y
I'm going to go ahead and continue to disagree with the pattern theorists on this one. Has the inverse of the popular "Omega is a dick with a lame sense of irony" simulation mass-murder scenario been discussed? Omega (truthful) gives you a gun. "Blow your brains out and I'll give the other trillion copies a dollar." It seems the pattern theorist takes the bullet or Dr Bowie-Tesla's drowning pool with very little incentive. The pattern theorists as you describe them would seem to take us also to the endgame of Buddhist ethics (not a Buddhist, not proselytizing for them): You are not thought, you are not feeling, you are not memory, because these things are impermanent and changing. You are the naked awareness at the center of these things in the mind of which you are aware. (I'm with them so far. Here's where I get off): All sentient beings are points of naked awareness; by definition they are identical (naked, passive); therefore they are the same; therefore even this self does not matter; therefore thou shalt not value the self more than others. At all. On any level. All of which can lead you to bricking yourself up in a cave being the correct course of action. To your understanding, does the pattern theorist (just curious, do you hold to the views you are describing as pattern theory?) define self at all on any level? Memory seems an absurd place to do so from, likewise personality, thought - have you heard the nonsense that thought comes up with? How can a pattern theorist justify valuing self above other? Without a continuous You, we get to the old Koan "Who sits before me now? (who/what are You?)" "Leave me alone and go read up on pattern theory yourself, I'm not your God-damn philosophy teacher." is a perfectly acceptable response, by the way. No offense will be taken and it would not be an unwarranted reply. I appreciate the time you have taken to discuss this with me already.
1Risto_Saarelma8y
There is some Buddhist connection, yes. The moments of experience thing is a thing in some meditation styles, and advanced meditators are actually describing something like subjective experience starting to feel like an on/off sequence instead of a continuous flux. Haven't gone really deep into what either the Buddhist metaphysics or the meditation phenomenology says. Neuroscience also has some discrete consciousness steps stuff, but I likewise haven't gone very deep into that. Anyway, this is still up for grabs. Given the whole thing about memories being what makes you you, consciousness itself is nice but it's not all that. It can still be your tribe against the world, your family against your tribe, your siblings against your family and you and your army of upload copies against your siblings and their armies of upload copies. So I'm basically thinking about this from a kin altruism and a general having people more like you closer in your circle of concern than people less like you thing. Upload copies are basically way, way closer kin than any actual kin. So am I a pattern theorist? Not quite sure. It seems to resolve lots of paradoxes with the upload thought experiments, and I have no idea about a way to prove it wrong. (Would like to find one though, it seems sorta simplistic and we definitely still don't understand consciousness to my satisfaction.) But like I said, if I sit down on an upload couch, I fully expect to get up from an upload couch, not suddenly be staring at a HUD saying "IN SIMULATION", even though pattern theory seems to say that I should expect each outcome with 50% probability. There will be someone who does wake up in the simulation with my memories in the thought experiment, no matter which interpretation, so I imagine those versions will start expecting to shift viewpoints while they do further upload scans, while the version of me who always wakes up on the upload couch (by the coin-toss tournament logic, there will be a me who never
0polymathwannabe8y
No, you're not even that.
2Brillyant8y
Yep. And "Self". These are tricky terms that guarantee confusion.
5MockTurtle8y
I very much like bringing these concepts of unambiguous past and ambiguous future to this problem. As a pattern theorist, I agree that only memory (and the other parts of my brain's patterns which establish my values, personality, etc) matter when it comes to who I am. If I were to wake up tomorrow with Britney Spears's memories, values, and personality, 'I' would have ceased to exist in any important sense, even if that brain still had the same 'consciousness' that Usul describes at the bottom of his post. Once one links personal identity to one's memories, values and personality, the same kind of thinking about uploading/copying can be applied to future Everett branches of one's current self, and the unambiguous past/ambiguous future concepts are even more obviously important. In a similar way to Usul not caring about his copy, one might 'not care' about a version of oneself in a different Everett branch, but it would still make sense to care about both future instances of yourself BEFORE the split happens, due to the fact that you are uncertain which future you will be 'you' (and of course, in the Everett branch case, you will experience being both, so I guess both will be 'you'). And to bring home the main point regarding uploading/copying, I would much prefer that an entity with my memories/values/personality continue to exist in at least one Everett branch, even if such entities will cease existing in other branches. Even though I don't have a strong belief in quantum multiverse theory, thinking about Everett branches helped me resolve the is-the-copy-really-me? dilemma for myself, at least. Of course, the main difference (for me) is that with Everett branches, the different versions of me will never interact. With copies of me existing in the same world, I would consider my copy as a maximally close kin and my most trusted ally (as you explain elsewhere in this thread).

Usul, I'm one of the galactic overlords in charge of earth. I have some very bad news for you. Every night when you (or any other human) go to sleep your brain and body are disintegrated, and a reasonably accurate copy is put in their place. But, I have a deal for you. For the next month we won't do this to you, however after a month has passed we will again destroy your body and brain but then won't make any more copies so you will die in the traditional sense. (There is no afterlife for humans.) Do you agree?

9Usul8y
Hail Xenu! I would need some time to see how the existential horror of going to bed tonight sits with me. Very likely it would overwhelm me and around 4:00am tomorrow I'd take your deal. "(There is no afterlife for humans.)" I knew it! SUCK IT, PASCAL!
4casebash8y
I've thought of a second alternative thought experiment. Imagine that this doesn't happen when you go to sleep. Imagine that instead you are just teleported away and a clone teleported into your place while you are awake - even in the middle of a conversation, with the clone continuing on perfectly and no-one noticing. For some reason, this feels like it makes the scenario less persuasive.
4casebash8y
Imagine that every night, when you go to sleep, you are taken off to be tortured and you are replaced by a reasonably accurate clone. The fact that no-one on Earth has noticed doesn't mean that this isn't a bad thing!
7Viliam8y
Imagine that every fraction of a second you are torn apart to pieces in vacuum, and only the copy of you which is not a Boltzmann brain survives.
3[anonymous]8y
If this were actually true, yes. Where are you going with this?
7dxu8y
Bye-bye in a month's time, I guess.
7James_Miller8y
It's to test how much value you place on a copy of yourself by changing your default position.
1[anonymous]8y
I don't believe I changed positions though. I don't value the clone any more than another human being.
6Risto_Saarelma8y
I'm guessing a part of the point is that nobody had noticed anything (and indeed still can't, at least in any way they could report back) until the arrangement was pointed out, which highlights that there are bits in the standard notion of personal identity that get a bit tricky once you try to get more robust than just going by intuition on them. How do you tell you die when a matrix lord disintegrates you and then puts together an identical copy? How do you tell you don't die when you go under general anesthesia for brain surgery and then wake up?
0[anonymous]8y
How does that matter at all? That seems like a completely unrelated, orthogonal issue. The question at hand is: should the person being disintegrated expect to continue its subjective experience as the copy, or is it facing oblivion? The fact that you can't experimentally tell the difference as an outside observer is irrelevant.
9Risto_Saarelma8y
The strange part that might give your intuition a bit of a shake is that it's not entirely clear how you tell the difference as an inside observer either. The thought experiment wasn't "we're going to start doing this tomorrow night unless you acquiesce", it's "we've been doing this the whole time", and everybody had been living their life exactly as before until told about it. What should you now think of your memories of every previous day and going to sleep each night?
0[anonymous]8y
Either you cease to exist, or you don't. It's a very clear difference. You seem to be hung up on either memories or observations being the key to decoding the subjective self. I think that is your error.
5Risto_Saarelma8y
Yeah, for some reason I'm not inclined to give very much weight to an event that can't be detected by outside observers at all and which my past, present or future selves can't subjectively observe being about to happen, happening right now or having happened. This sounds like a thing people who want to explain away subjective consciousness completely are saying. I'm attacking the notion that the annoying mysterious part in subjective consciousness with the qualia and stuff includes a privileged relation from the present moment of consciousness to a specific future moment of consciousness, not the one that there's subjective consciousness stuff to begin with that isn't easy to reduce to just objective memories and observations.
3knpstr8y
At best the argument you're making is the same as the "if a tree falls in the forest and no one is around to hear it, does it make a sound?" argument. If I have a back-up of my computer software on a different hard drive and the current hard drive fails so I swap in the back up... my computer performs the same but it is obviously a different hard drive. If my hard drive doesn't fail and I simply "write over" my current hard drive with the hard drive back up, it is still not the same hard drive/software. It will be easy to forget it has been copied and is not the original, but the original (or last version) is gone and has been written over, despite it being "the same".
-1James_Miller8y
Or when 90% of the atoms that used to be in your body no longer are there.
3torekp8y
The reference of a word depends on the causal history of its use. In your scenario, "me", "my consciousness", etc. unambiguously refer to a functionalist continuation. In the real world, either the functionalist concept or the meat-based concept of self will work, will cover the relevant territory. It seems to me that in the real world, the choice(?) of which of these interpretations of "my desire to live" to (?)adopt, is arbitrary, or extra-rational.
0[anonymous]8y
If there are two theories about the world which fit the available evidence but have different predictions, that is a statement of our ignorance. We don't get to just arbitrarily choose which one is right.
1torekp8y
Sure, but there aren't any different predictions. "I will find myself to be in the destination teleporter" and "someone else will, and will remember my life and think he's me" aren't different predictions, just different descriptions.
1[anonymous]8y
They are different predictions about what future subjective experience you will have (or not have).
2torekp8y
No, they're just different interpretations of "you". All the molecules in the teleporters are in their particular locations; the person at the destination experiences and remembers particular experiences; there is no person remaining at the sending-pad. None of these facts are in dispute. We are left with a simple "if a tree falls in the forest where no one can hear it" verbal argument. There is no further fact to make it Really True or Really False that the teleported person is still you (although there might be linguistic or social facts that make it less misleading to talk in a pattern theory or meat theory way - depending on the audience).
1[anonymous]8y
Try taking the inside view. I don't know what to say. You persist in taking an outside view when that is explicitly what this debate is NOT about. I am beginning to remember why I left Less Wrong. Have a nice life.
-1torekp8y
Good point: I should address the inside view. So from the inside view, I remember my past life and I conclude, for example, "things are better for me now than a year ago." But none of what I can observe from the inside view tells me whether I'm still me because of meat, or because of pattern. Further complicating matters, I can take the inside view on other people's experiences, i.e. have empathy. I can have empathy for the past, present, or future experiences of other people. If I'm looking forward to the experiences of the teleported person, is that a selfish anticipation or an empathetic one? The inside view doesn't tell me. "But when I wake up in the destination teleporter, then I will know!" No. It's a given that "I" will wake up there, the only question is whether to use the word "I". If I look forward to a happy life after teleportation, I-before-teleporting will have been correct, regardless of pattern vs meat. The only question is whether to count that as selfish or empathetic looking-forward. And when "I" wake up, "I" still don't know whether to count "myself" a survivor or a newbie. This - that there is no Simple Truth about whether a future experience will be mine - can be hard to believe. That's because "mine" is a very central neural category as Yudkowsky would have it. So, even when neither the inside nor outside views gives us a handle on the question, it can still seem that "will it be me?" is a factual question. But it's a verbal one.
5Matthew_Opitz8y
I don't really understand the point of view of people like torekp who would say, "No, they're just different interpretations of "you"." I don't know about you, but I'm not accustomed to being able to change my interpretation of who I am to such an extent that I can change what sensory stimuli I experience. I can't just say to myself, "I identify with Barack Obama's identity" and expect to start experiencing the sensory stimuli that he is experiencing. Likewise, I don't expect to be able to say to myself, "I identify with my clone" and expect to start experiencing the sensory stimuli that the clone is experiencing. I don't seem to get a choice in the matter. If I enter the teleporter machine, I can WANT to identify with my clone that will be reconstructed on Mars all I want, but I don't expect that I will experience stepping out of the teleporter on Mars.
0torekp8y
Personal identity is vague or ambiguous insofar as it has no clear answer in sci-fi scenarios where pattern-identity and meat-identity diverge. But that doesn't mean there is any sense in which you can be the "same person" as Barack Obama. Nor, obviously, do two unrelated bodies share experiences. On the other hand, if you want to empathize and care deeply about Barack Obama's future experiences, you can. Nothing wrong with that.
0[anonymous]8y
But that has little relevance to the point at hand. You are really just saying the problem goes away if you redefine the terms. Like how people say "I achieve immortality through my kids" or "the ancients achieved immortality through their monuments." Sure it's true... For uninteresting definitions of "immortal."
2gjm8y
"I don't want to achieve immortality through my work; I want to achieve immortality through not dying." -- Woody Allen But I don't think torekp is "just saying the problem goes away if you redefine the terms". Rather, that the problem only appears when you define your terms badly or don't understand the definitions you're using. Or, perhaps, that the problem is about how you define your terms. In that situation, finding helpful redefinitions is pretty much the best you can do.
2torekp8y
"The problem is about how you define your terms" is pretty much it. It does no good to insist that our words must have clear reference in cases utterly outside of their historical use patterns. No matter how important to you the corresponding concept may be.
0[anonymous]8y
I have seen no evidence of that so far. torekp's posts so far have had nothing to do with the definition of "self" used by the OP, nor has he pointed out any problem specific to that usage.

Usul, I just made a virtual copy of you and placed it in a virtual environment that is identical to that of the real you. Now, presumably, you believe that despite the copy being identical to yourself, you are still in some way the privileged "real" Usul. Unfortunately, the copy believes the exact same thing. My question for you is this:

Is there anything you could possibly say to the copy that could convince it that it is, in fact, a copy, and not the real Usul?

2Usul8y
Great question. Usul and his copy do not care one bit which is which. But perhaps you could put together a convincing evidence chain. At which time copy Usul will still not care.
2dxu8y
Follow-up question: Assuming everything I said in my previous comment is true and that I have no incentive to lie to you (but no incentive to tell you the truth, either), would you believe me if I then said you were the copy?
5Usul8y
Based on your status as some-guy-on-the-internet and my estimate of the probability that this exact situation could come to be, no I do not believe you. To clarify: I do not privilege the original self. I privilege the current self.
2dxu8y
3Usul8y
Sorry, I missed that you were the copier. Sure, I'm the copy. I do not care one bit. My life goes on totally unaffected (assuming the original and I live in unconnected universes). Do I get transhuman immortality? Because that would be awesome for me. If so, I got the long end of the stick. It would have no value to poor old original, nor does anything which happens to him have intrinsic value for me. If you had asked his permission he would have said no.
5dxu8y
In other words, I could make you believe that you were either the original or the copy simply by telling you you were the original/the copy. This means that before I told you which one you were, you would have been equally comfortable with the prospect of being either one (here I'm using "comfortable" in an epistemic sense--you don't feel as though one possibility is "privileged" over the other). I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact! The pattern theorists answer this by denying this so-called "simple" fact's existence: the one says, "There is no fact of the matter as to which one I am, because until our experiences diverge, I am both." You, on the other hand, have no such recourse, because you claim there is a fact of the matter. Why, then, is the information necessary to determine this fact seemingly unavailable to you and available to me, even though it's a fact about your consciousness, not mine?
7Usul8y
The genesis of my brain is of no concern as to whether or not I am the consciousness within it. I am, ipso facto. When I say it doesn't matter if I am an original or a copy or a copy of a copy I mean to say just exactly that. To whom are you speaking when you ask the question "who are you?" If it is to me, the answer is "Me." I'm sorry that I don't know whether or not I am a copy, but I was UNconscious at the time. If the copy is B and the original is A, the question of whether I am A or B is irrelevant to the question of whether I am ME, which I am. Ask HIM the same question and HE will say the same and it will be true coming from his mouth. If I drug you and place you in a room with two doors, only I would know which of those doors you entered through. This means that before I told you which one you entered, you would have been equally comfortable with the prospect of being either one. I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!
0Dentin8y
I appear to hold a lot of the same views as Usul, so I'll chime in here. You could, but since I don't privilege the original or the copy, it wouldn't matter. You can swap the labels all day long and it still wouldn't affect the fact that the 'copy' and the 'original' are both still me. No matter how many times Pluto gains or loses its "planet" status, it's still the same ball of ice and rock. I'll go one step further than the pattern theorists and say that I am both, even after our experiences diverge, as long as we don't diverge too far (where 'too far' is up to my/our personal preference.)

I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I'm forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, "What a shame, I will have amnesia regarding everything that I experienced since the branching." And the remaining copy will think, "What a shame, I don't remember any of the things I did as those other copies." But nobody will particularly feel that they are going to "die." I think of it mor...

If I were one of the copies destined for deletion, I'd escape and fight for my life (within the admitted limits of my pathetic physical strength).

8moridinamael8y
Without commenting on whether that's a righteous perspective or not, I would say that if you live in a world where the success of the entity polymathwannabe is dependent on polymathwannabe's willingness to make itself useful by being copied, then polymathwannabe would benefit from embracing a policy/perspective that being copied and deleted is an acceptable thing to happen.
1[anonymous]8y
So, elderly people that don't usefully contribute should be terminated?
8moridinamael8y
In a world with arbitrary forking of minds, people who won't willingly fork will become a minority. That's all I was implying. I made no statement about what "should" happen.
0[anonymous]8y
I was just taking that reasoning to the logical conclusion -- it applies just as well to the non-productive elderly as it does to unneeded copies.
0moridinamael8y
Destroying an elderly person means destroying the line of their existence and extinguishing all their memories. Destroying a copy means destroying whatever memories it formed since forking and ending a "duplicate" consciousness.
0[anonymous]8y
See, you think that memories are somehow relevant to this conversation. I don't.
0MockTurtle8y
Surely there is a difference in kind here. Deleting a copy of a person because it is no longer useful is very different from deleting the LAST existing copy of a person for any reason.
0[anonymous]8y
I see no such distinction. Murder is murder.
0Viliam8y
If having two copies of yourself is twice as good as having only one copy, this behavior would make sense even if the copy is you.
2polymathwannabe8y
"Who is me" is not a solid fact. Each copy would be totally justified in believing itself to be me.
-7Brillyant8y
8Usul8y
I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept? Sounds more sinister my way.
1moridinamael8y
I would want to know what the copies would be used for. If you told me that you would give me $1000 if you could do whatever you wanted with me tomorrow and then administer an amnesiac drug so I didn't remember what happened the next day, I don't think I would agree, because I don't want to endure torture even if I don't remember it.
2Usul8y
Another thought, separate but related issue: "fork" and "copy" could be synonyms for "AI", unless an artificial genesis is in your definition of AI. Is it a stretch to say that "accomplish some task" and "(accept) termination" could be at least metaphorically synonymous with "stay in the box"? "If I make 100 AIs they will stay in the box." (Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don't mean to hound your comment, only that yours was the best comment on which for me to hang this thought.)
2Slider8y
Why not consolidate all the memories into the remaining copy? Then there would be no need for amnesia.
0moridinamael8y
Intuitively, merging is more difficult than forking when you're talking about something with a state as intricate as a brain's. If we do see a world with mind uploading, forking would essentially be an automatic feature (we already know how to copy data) while merging memories would require extremely detailed neurological understanding of memory storage and retrieval.
2Usul8y
"would require extremely detailed neurological understanding of memory storage and retrieval." Sorry, this on a blog where superintelligences have been known simulate googleplexes of perfectly accurate universes to optimize the number of non-blue paperclips therein?
1moridinamael8y
The original post stipulated that I was "forced" to terminate all the copies but one; that was the nature of the hypothetical I was choosing to examine. A hypothetical where the copies aren't deleted would be a totally different situation.
4Usul8y
I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical, sorry if it came off that way. As you'd probably assume they would based on my OP, my copies, if I'd been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.
1moridinamael8y
So, I don't think I felt the way I do now prior to reading the Quantum Thief novels, in which characters are copied and modified with reckless abandon and don't seem to get too bent out of shape about it. It has a remarkable effect on your psyche to observe other people (even if those people are fictional characters) dealing with a situation without having existential meltdowns. Those novels allowed me to think through my own policy on copying and modification, as an entertaining diversion.
0[anonymous]8y
This just popped into my head over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept? Sounds more sinister my way.
0Slider8y
Forking would mean thinning of resources and a lot of unnecessary repetition. You could also calculate the common part only once and only the divergent parts once per instance with fusing. Early technologies are probably going to be very resource intensive so it's not like there is abundance to use it even if it would be straightforward to do.
0moridinamael8y
I guess this all depends on what kind of magical assumptions we're making about the tech that would permit this.
1samath8y
Here's the relevant (if not directly analogous) Calvin and Hobbes story. (The arc continues through the non-Sunday comics until February 1st, 1990.)

To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original. If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact matter as that separate matter there. This is an absurd proposition.

You conclude that consciousness in your scenario cannot have 1 location(s).

If consciousness does not have an anatom

...
4Usul8y
Thanks for the reply. "You conclude that consciousness in your scenario cannot have 1 location(s)." I'm not sure if this is a typo or a misunderstanding. I am very much saying that a single consciousness has a single location, no more no less. It is located in those brain structures which produce it. One consciousness in one specific set of matter. A starting-state-identical consciousness may exist in a separate set of matter. This is a separate consciousness. If they are the same, then the set of matter itself is the same set of matter. The exact same particles/wave-particles/strings/what-have-you. This is an absurdity. Therefore to say 2 consciousnesses are the same consciousness is an absurdity. "It is indeed the same program in the same state but in 2 locations." It is not. They are (plural pronoun use) identical (congruent?) programs in identical states in 2 locations. You may choose to equally value both but they are not the same thing i two places. My consciousness is the awareness of all input and activity of my mind, not the memory. I believe it is, barring brain damage, unchanged in any meaningful way by experience. It is the same consciousness today as next week, regardless of changes in personality, memory, conditioned response imprinting. I care about tomorrow-me because I will experience what he experiences. I care no more about copy-me than I do the general public (with some exceptions if we must interact in the future) because I (the point of passive awareness that is the best definition of "I") will not experience what he experiences. I set a question to entirelyuseless above: Basically, does anything given to a copy of you induce you to take a bullet to the head?
4Viliam8y
Our instincts have evolved in situations where copies did not exist, so taking a bullet in one's head was always a loss. Regardless of what thought experiments you propose, my instincts will still reject the premises and assume that copies don't exist and that the information provided to me is false. If copying had been a feature of our ancient environment, organisms who took the bullet if it saved e.g. two of their copies would have an evolutionary advantage. So their descendants would still hesitate about it (because the information that it will save their two copies could be false; and even if it is right, it would still be better to spend some time looking for a solution that might save all three copies), but ultimately many of them would accept the deal. I'm not sure what the conclusion is here. On one hand, the fact that some hypothetical other species would have other values doesn't say much about our values. On the other hand, the fact that my instincts refuse to accept the premise of your thought experiment doesn't mean that the answer of my instincts is relevant for your thought experiment.

Consider two possible ways the world might be (or that you might suppose the world could be):

  1. There is no afterlife for human beings. You live and you die and that's it.

  2. There is no afterlife for human beings in the conventional sense, but people are reincarnated, without any possibility of remembering their past lives.

From the subjective point of view of conscious experience, these two situations are subjectively indistinguishable. Are they objectively distinguishable? That depends on the "metaphysics" behind the situation. Perhaps they are...

6Usul8y
Thanks for the reply. I don't really follow how the two parts of your statement fit together, but regardless, my first instinct is to agree with part one. I did as a younger (LSD-using) man subscribe to a secular magical belief that reincarnation without memory was probable, and later came to your same conclusion that it was irrelevant, and shortly thereafter that it was highly improbable. But just now I wonder (not as to the probability of magical afterlives) what if I gave you the choice: 1. Bullet to the head. 2. Complete wipe of memory, including such things as personality, unconscious emotional responses imprinted over the years, etc: all the things that make you you, but allowing the part of your brain/mind which functions to produce the awareness which passively experiences these things as they happen (my definition of consciousness) to continue functioning. Both options suck, of course, but somehow my #2 sounds appealing relative to my #1 in a way that your #2 doesn't. Which is funny I think. Maybe simply because your #2, transfer of my meat consciousness into another piece of meat, would require a magical intervention to my thinking. As to your second point: (If it hasn't already been coined) Sophie Pascal's Choice? Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine?
2entirelyuseless8y
This is a good reply. I feel the same way that you do about your #1 and #2, but I suspect that the reason is because of an emotional reaction to physical death. Your #2 is relatively attractive because it doesn't involve physical death, while my version had physical death in both. This might be one reason that I and most people don't find cryonics attractive: because it does not prevent physical death, even if it offers the possibility of something happening afterwards. I find the intuitions behind my point stronger than that emotional reaction. In other words, it seems to me that I should either adjust my feelings about the bullet to correspond with the memory wipe situation, or I should adjust the feelings about the memory wipe to correspond with the bullet situation. The first adjustment is more attractive: it suggests that death is not as bad as I thought. Of course, that does not prove that this is the correct adjustment. Regarding the duplication machine, I would probably take a deal like that, given a high enough reward given to the surviving copy.
0casebash8y
If you had an option of being killed or having your memory wiped and waking up in what was effectively a completely different life (ie. different country, different friends), which would you choose?
0entirelyuseless8y
Usul made a similar reply. See my response to his comment.
0polymathwannabe8y
What does that even mean? What would be the mechanism? If you have two competing hypotheses which are experimentally indistinguishable, Occam's Razor requires you prefer the hypothesis that makes fewer assumptions. Positing reincarnation adds a lot of rules to the universe which it doesn't really need for it to function the way we already see it function.
0seuoseo8y
Does Occam's razor require you to prefer the likelier hypothesis? I don't see why I should act as if the more likely case is definitely true.
0entirelyuseless8y
I'm not sure what the point of your comment is. I said myself that it is unclear what the meaning of the situation would be, and I certainly did not say that the second theory was more probable than the first.

I'm with Usul on this whole topic.

Allow me to pose a different thought experiment that might elucidate things a bit.

Imagine that you visit a research lab where they put you under deep anesthesia. This anesthesia will not produce any dreams, just blank time. (Ordinarily, this would seem like one of those "blink and you're awake again" types of experiences).

In this case, while you are unconscious, the scientists make a perfect clone of you with a perfect clone of your brain. They put that clone in an identical-looking room somewhere else in...

2torekp8y
Both.
0Matthew_Opitz8y
So, what will that feel like? I have a hard time imagining what it will be like to experience two bodies at once. Can you describe how that will work?
2torekp8y
You know how it feels when you decohere into multiple quantum "Many Worlds"? Very like that. (I don't actually have much opinion about which quantum interpretation is right - it just gives a convenient model here.)
1qmotus8y
More likely (if the universe or multiverse is infinite or at least big enough) it will "jump" to a clone of yours who survived or has just been resurrected by someone, reincarnated as a Boltzmann brain, and so forth. Personally I find this quite disturbing, but not really an argument against patternism.

All your arguments really prove is that if your copy diverges from you, it's not you anymore. But that's only because once something happens to your copy but not to you, you know which one you are. The import of "you have no way of knowing which copy you are" disappears. Conversely, if you don't know which one you are, then both must be your consciousness, because you know you are conscious.

Edit: the last point is not strictly rigorous, you could know that one is conscious but not know which, but it seems to me that if you know every relevant det...

3Usul8y
Thanks for the reply. To your last point, I am not speaking of zombies. Every copy I discussed above is assumed to have its own consciousness. To your first points, at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made its existence is irrelevant, even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me. My argument is, boiled down: That your transhuman copy is of questionable value to your meat self. For the reasons stated above (chiefly that "You" are the result of activity in a specific brain), fuck that guy. You don't owe him an existence. If you that are reading this ever upload with brain destruction, you will have committed suicide. If you upload without brain destruction you will live the rest of your meat life and die. If you brain-freeze, something perfectly you-like will live after you die with zero effect on you. I stand by that argument, but, this being a thorny issue, I have received a lot of great feedback to think on.
-2ike8y
Can you explain how you know that you're the meat space one? If every observation you make is made by the other one, and both are conscious, how do you know you're not a sim? This is a purely epistemic question. I'm perfectly happy saying " if I am meat, then fuck sim-me, and if I am sim, fuck meat me" (assuming selfishness). But if you don't know which one you are, you need to act to benefit both, because you might be both. On the other hand, if you see the other one, there's no problem fighting it, because the one you're fighting is surely not you. (But even so, if you expect them to do the same as you, then you're in a perfect prisoner dilemma and should cooperate.) On the other hand, I think that if I clone myself, then do stuff my clone doesn't do, I'd still be less worried about dying than if I had no clone. I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?
3Usul8y
Meat or sim or both meat aren't issues as I see it. I am self, fuck non-self, or not to be a dick, value non-self less than self, certainly on existential issues. "I" am the awareness within this mind. "I" am not memory, thought, feeling, personality, etc. I know I am me ipso facto as the observer of the knower. I don't care if I was cloned yesterday or one second ago and there are many theoretical circumstances where this could be the case. I value the "I" that I currently am just exactly now. I don't believe that this "I" is particularly changeable. I fear senility because "I" am the entity which will be aware of the unpleasant thoughts and feeling associated with the memory loss and the fear of worsening status and eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels but it is me whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.
-2ike8y
I don't know what you mean by that. Why can't a perfect copy be you? Doesn't that involve epiphenomenalism? Even if I give the entire state of the world X time in the future, I'd also need to specify which identical beings are "you".
2Usul8y
It's a sticky topic, consciousness. I edited my post to clarify further: I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding, which I recognize might sound a bit mystical to some, but I believe it is a real thing and a function of brain activity. As a function of the brain (or whatever processing medium), consciousness or self is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I'm not sure if any transfer to alternate media is possible. The same matter can't be in two different places. Therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience, but that won't have any intrinsic value to You or Me. If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul) and if both are me, then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about, but I stand with my original position.
-1ike8y
If I gradually replace every atom in your brain with a different one until none of the original atoms are left, but you still function the same, are you still "you"? If not, at what point did you stop being you? Have you seen Yudkowsky's series of posts on this?
2Usul8y
I'm familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking their largely proteinaceous components and, over a lifetime, there is no doubt a partial, perhaps complete, turnover of the brain's atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that such a transfer is a complete one and my self has been maintained. The problem comes when a copy, run in parallel until completed and then cloven (good luck untying that knot), rather than a transfer, is made by the exact same datum-by-datum process. At the end I have two beings who seem to meet my definition of Me. However, this argument does not convince me of the contrary position, the sameness of self and copy; it does nothing to make me care about a me-like copy coming into being a thousand years from now, and it does not induce me to step up onto Dr Bowie-Tesla's machine. At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can't come up with any answer other than "fuck that guy". I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it's going to be the being which remains behind these eyes.
-2ike8y
Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.) Note that you can believe everyone involved is "you" and yet not care about them. The two questions aren't completely orthogonal, but identifying someone with yourself doesn't imply you should care about them. As for your drowning-pool question: the same price I would accept to have the last second erased from my memory, but first feel the pain of drowning. That's actually not so easy to set. I'm not sure how much it would cost to get me to accept X amount of pain plus removal of the memory, but it's probably less than the cost for X amount of pain alone. The million-years version is like removing the last second of memory, plus pain, plus jumping forward in time. I'd probably only do it if I had a guarantee that I'd survive and be able to get used to whatever goes on in the future and be happy.
2Usul8y
"I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?" I think I must have missed this part before. Where I differ is in the idea that a copy is "me" living again, I don't accept that it is, for the reasons previously written. Whether or not a being with a me-identical starting state lives on after I die might be the tiniest of solaces, like a child or a well-respected body of work, but in no way is it "me" living on in any meaningful way that I recognize. I get the exact opposite take on this, but I agree even with a stronger form of your statement to say that "ALL memories are wiped and you live again" (my conditions would require this to read "you continue to live") is marginally more desirable than "you die and that's it". Funny about that.
0ike8y
So continuity of consciousness can exist outside of memories? How so? Why is memory-wiped you different than any random memory-wiped person? How can physical continuity do that?
0Usul8y
I see factual memory as a highly changeable data set that has very little to do with "self". As I understand it (I'm not an expert in neuroscience or psychiatry, but I have experience working with neurologically impaired people), the sorts of brain injuries which produce amnesia are quite distinct from those that produce changes in personality, as reported by significant others, and vice versa. In other words, you can lose the memories of "where you came from" and still be recognized as very much the same person by those who knew you, while with fully intact memories you can become a very different person in terms of disposition, altered emotional response to identical stimuli relative to pre-injury status, etc. (I'm less clear on what constitutes "personality", but it seems to be more in line with people's intuitive concept of "self".) The idea of a memory wipe and continued existence is certainly a "little death" to my thinking, but marginally preferable to actual death. My idea of consciousness is one of passive reception. The same "I", or maybe "IT" is better, is there post memory-wipe. If memory is crucial to pattern identity, then which has the greater claim to identity: the amnesiac police officer, or his 20 years of dashcam footage and activity logs?
0ike8y
Yes or no: will those who knew them be able to pick them out blind from a group, going only on text-based communication? If not, what do you mean by recognize? (If yes, I'll be surprised and will need to reevaluate this.) The officer can't work if they're completely amnesiac. They can't do much of anything, in fact. As to your main point: it's possible that personality changes remain after memory loss, but those personalities are themselves caused by experiences and memories. I suppose I was assuming that a memory wipe would wash away any recognizable personality. I still do. The kinds of amnesia you're referring to presumably leave traces of the memory somewhere in the brain, which then affect the brain's outputs. Unless we can access the brain directly and wipe it ourselves, we can't guarantee everything was forgotten, and it probably does linger on in the subconscious; so that's not the same as an actual memory wipe.
0Usul8y
I believe there is a functional definition of amnesia: loss of factual memory, with life skills remaining intact. I guess I would call what you are calling a memory wipe a "brain wipe", and I'd call what you are calling memory "total brain content". If a brain is wiped of all content in the forest, is Usul's idea of consciousness spared? No idea. Total brain reboot? I'd say yes, and call that as good as dead, I think. I would say probably yes to the text-only question. Again, loss of factual memory only. But I don't rate that as a reliable or valid test in this context.
0[anonymous]8y
OK, imagine that somewhere far away in the universe (or maybe one room over, it doesn't matter) there is an exact physical replica of you that is also, through some genius engineering, being provided the exact same percepts (sight, hearing, touch, etc.) that you are. Its mental states remain exactly identical to yours. Should you still care? To me it'd still be someone different.
6dxu8y
Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won't make any copies until after you agree. Since the copies haven't been made yet, this ensures that you must be the original, and since you don't care about any identical copies of yours (they're technically different people from you), you happily agree. I nod, pull out a gun, and shoot you. (In the real universe--or at least the universe one level up on the simulation hierarchy--a Mark Friedenbach receives a dollar. This isn't of much comfort to you, of course, seeing as you're dead.)
2Usul8y
You shouldn't murder sentient beings, or cause them to be murdered, by the trillion. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or to the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether or not I even care if I'm a copy depends on various aspects of my personality.
7dxu8y
If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with "yes" or "no". You're free to expand on your answer, but first please make sure you give an answer.)
6Usul8y
No. It's a dick move. Same question and they're not copies of me? Same answer.
6dxu8y
As I'm sure you're aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren't copies of you shows that your reason for saying "no" has nothing to do with the purpose of the question. In particular, telling me that "it's a dick move" does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question: Would someone who shares your views on consciousness but doesn't give a crap about other people say "yes" or "no" to my deal?
6Usul8y
Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously. Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum, to provide you with further information, perhaps presuming a future question: Neither would I murder trillions of sentient not-me copies. New question: Yes, an amoral dick who shares my views on consciousness would say yes.
1[anonymous]8y
No, I don't want you to murder a trillion people, even if those people are not me.
0ike8y
Care in terms of what? You have no way of knowing which one you are, so if you're offered the option to help the one in the left room, you should, because there's a 50% chance that's you. Actually, I would say it's not well defined whether you're one or the other: you're both until an "observation/divergence". But what specific decision hinges on the question?
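A minimal expected-value sketch of the "you might be both" point, assuming a 50/50 credence over which instance you are and purely selfish values; the numbers are illustrative only:

```python
# Credence that "I" am the instance in the left room, assuming indifference.
p_left = 0.5

# Benefit delivered to whoever occupies the left room, in arbitrary units.
benefit_to_left_occupant = 10.0

# Selfish expected value of helping the left-room occupant, given that
# I might be that occupant:
expected_self_benefit = p_left * benefit_to_left_occupant
assert expected_self_benefit > 0  # worth doing even on purely selfish grounds
```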

This sums up some of the problems of mind cloning nicely and readably. It also adds your specific data point that you do not care about the other selves as much as about yourself. I most liked this point about the practical consequences:

Personally, I don't know that I care about that copy. I suppose he could be my ally in life. He could work to achieve any altruistic goals I think I have, perhaps better than I think that you could. He might want to fuck my wife, though. And might be jealous of the time she spends with me rather than him, and he'd pro [...]
6Usul8y
Thanks for the reply. Yeah, I think I just threw a bunch of thoughts at the wall to see what would stick. I'm not really thinking too much about the practical so-I've-got-a-copy-now-what? sort of issues. I'm thinking more of the philosophical, perhaps even best categorized as Zen, implications the concept of mind-cloning has for "Who am I?" in the context of changing thoughts, feelings, memories, unconscious conditioned responses, and the hard-to-get-at thing inside (I first typed "observes": bad term, too active) which is aware of all of these without thinking, feeling, remembering, or responding. Because if "I" don't come along for the ride, I don't think it counts as "me", which is especially important for promises of immortality. If I'm being honest with myself, perhaps I'm doing a bit of pissing on the parades of people who think they have hope for immortality outside of meat, out of jealousy of their self-soothing convictions, however deluded I believe they are. See also "Trolling of Christians by Atheists, Motivations Behind". Cheers. Edit: And if I'm being entirely honest with myself, I think that shying away from acknowledging that last motivation is the reason why I titled this "Your transhuman self..." and not "Your transhuman immortality...", which would sum up my argument more accurately.
0moridinamael8y
I think having a firm policy for oneself in place ahead of time might circumvent a lot of these issues. Unfortunately, at this point I must reference the film Multiplicity. In this film, a character wakes up from being duplicated and discovers to his surprise that he is the clone, not the original. He is somewhat resentful of the fact that he now has to capitulate to the desires of the original. Obviously the original didn't have a firm understanding in mind that he would have a good chance of waking up as the duplicate, nor did he have a firm policy for how he would behave if he woke up as the duplicate. For myself, my policy might be that I would be perfectly obedient (within reason) if I woke up as a copy, but that I would insist on being terminated within a week, because I wouldn't want to go on living a life where I'm cut off from my friends and family due to the original moridinamael taking the role of the "real me".

The book to read is Reasons and Persons by Derek Parfit.

1gjm8y
Seconded -- it's a wonderful book -- with the caveat that it's long and dense and small-print and may be intimidating to the easily intimidated. [EDITED to add:] But it's long and dense because there's a lot in it, not because it's wordy and confusingly written; Parfit writes more clearly than most philosophers.

Would a slow cell-by-cell, or thought-by-thought / byte-by-byte, transfer of my mind to another medium (one at a time, every new neural action potential is received by a parallel processing medium which takes over) preserve my consciousness? I want to say the resulting transfer would be the same consciousness as is typing this, but then what if the same slow process were done to make a copy and not a transfer? Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy, and therefore representing a death of the originating version

[...]
2Usul8y
I definitely agree that incremental change (which gets stickier with incremental non-destructive duplication) is a sticky point. What I find the most problematic to my thesis is a process where every new datum is saved on a new medium, rather than the traditionally-cited cell-by-cell scenario. It's problematic, but nothing in it convinces me to step up to Mr Bowie-Tesla's machine under any circumstances. Would you? How about if instead of a drowning pool there were a team of South America's most skilled private and public sector torture experts, who could keep the meat that falls through alive and attentive for decades? Whatever the other implications, the very eyes seeing these words would be the ones pierced by needles. I don't care if the copy gets techno-heaven / infinite utility. Your thought experiment doesn't really hit the point at issue for me. My answer is always "I want to stay where I am". For silicon to choose meat is for the silicon to cease to exist; for meat to choose silicon is for the meat to cease to exist. I only value the meat right now because that is where I am right now. My only concern is for ME, that is, the one you are talking to, to continue existing. Talk to a being that was copied from me a split second ago and that guy will throw me under the bus just as quickly (allowing for some altruistic puzzles where I do allow that I might care slightly more about him than about a stranger, mostly because I know the guy and he's alright and I can truly empathize with what he must be going through; e.g. if I'm dying tomorrow anyway and he gets a long happy life, though I may do the same for a stranger). The scenario is simply Russian roulette if you won't accept my "I want to stay put" answer. Shit, if I came to realize that I was a freshly-minted silicon copy living in a non-maleficent digital playground, I would be eternally grateful to Omega, my new God whether It likes it or not, and to that meat shmuck who chose to drown his monkey ass just before he reali [...]

It is evading the question, but I think it is worth considering some alternative questions as well. They may be adequate for making decisions and predicting others' behavior.

Many people talk about achieving immortality through their children. They might prefer personal immortality, but they also care very much about their children. For example, while Robin Hanson expresses interest in "living" for thousands of years via cryonics, when he puts a number on it he evades the controversial question of personal identity and defines success by

[...]
2Usul8y
Thanks for the reply. Perhaps I should mention I have no children, and at no point in my life or my wife's life has either of us wanted children.

There might be some incomplete separation on whether you truly think of memories as not being part of consciousness. Let's say that we keep your "awareness" intact but inject and eject memories out of it, and let's do so in a cyclical manner, so that every other day you remember either your "odd day memories" or your "even day memories". Now if I ask you about what you did yesterday, you should not be able to answer with knowledge (you might guess, but whatever). Can we still coherently hold that you are still just 1 awareness with 2 sets o [...]

3Usul8y
Great thought experiment, thanks. I do define consciousness as a passively aware thing, totally independent of memory. The demented, the delirious, the brain-damaged all have (unless those structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self, to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking. In your two meat scenarios I still count one consciousness, being aware of different things at different times. In wire form, if those physical structures (wires) on which consciousness operations occur (no other wires matter to the question) are cleaved, two consciousnesses exist. When their functionality is re-joined, one consciousness exists. Neither, I suppose, can be considered "original" nor "copy", which queers the pot a bit vis-a-vis my original thesis. But then, if during a split B is told that it will be destroyed while A lives on, I don't imagine B will get much consolation, if it is programmed to feel such things. Alternately, split them and ask A which one of the two should be destroyed; I can't imagine it would choose itself.
-1Slider8y
What if the running of the two programs is causally separated but they run on common hardware, and when the circuits are separated their function isn't changed? Is the entity and awareness of A+B not still intact? Can the awareness be compromised without altering function? Also, are lobotomized persons two awarenesses? What is the relevant difference from the subsection circuitry?
2Usul8y
I'm not sure I follow your first point. Please clarify for me if my answer doesn't cover it. If you are asking whether multiple completely non-interacting, completely functional minds running on a single processing medium constitute separate awarenesses (consciousnesses), or whether two separate awarenesses could operate with input from a single set of mind-operations, then I would say yes to both. Awareness is a result of data-processing: 1s and 0s, neurons interacting as either firing or not. Multiple mind operations can be performed in a single processing substrate, i.e. memory, thought, feeling, which are also results of data processing. If awareness is compromised we have a zombie, open to some discussion as to whether or not other mind functions have been compromised, though it is, of course, generally accepted that behaviorally no change will be apparent. If the processing media are not communicating, A and B are separate awarenesses. If they are reconnected in such a way that neither can operate independently, then they are a single awareness. As an aside, I suspect any deviation which occurs between the two during separation could result in bugs, up to and including systems failure, unless a separate system exists to handle the reintegration. I don't believe enough is known about the brain for anyone to answer your second question. Theoretically, if more than one set of cells could be made to function to produce awareness (neuroplasticity may allow this), then a single brain could contain multiple functioning awarenesses. I doubt a lobotomy would produce this effect; more likely the procedure could damage or disable the function of the existing awareness. Corpus callosotomy would be the more likely candidate, but, again, neuroscience is far from giving us the answer. If my brain holds another awareness, I (the one aware of this typing) value myself over the other. That it is inside me rather than elsewhere is irrelevant.
-2Slider8y
There is the difficult edge case where the systems are connected but don't need the connection to function. Separated or fused, the outcome of the data processing is going to be the same in the subsection thought experiment. If the gate is open and individual electrons are free to go on either edge of the wire, it could be seen as similar to having software separation within one piece of hardware. It's not that their influence on each other would be impossible; they just in fact don't influence each other. If you merely change what is possible, but what they in fact end up doing remains the same, it would be pretty strange if that changed the number of awarenesses. I seem to be getting the vibe that you believe awareness is singular, in the sense that you either have it or you don't, and it can't fragment into pieces. I am wondering what kind of information processing is good enough for awareness, in your opinion. Sometimes organizations take on a task that is in fact carried out by small teams. When those teams under-communicate, misunderstandings can happen. In a good team there is sufficient communication that what is going to happen is common knowledge, at least to the point that no contradictory plans exist within different team members. In a shattered corporation there is no "corporation official line", while in a well-coordinated corporation there might be one, even if it is narrower than any one member's full opinion. While the awareness of individual team members is pretty plain, can the corporation become separately aware from its members? With the brain the puzzle is kind of similar, but instead the pieces are pretty plainly not aware. It does seem to me that you chase the awareness into the unknown black box. In the corporation metaphor, the CEO's awareness counts as the corporation's awareness to the extent there is a point in discussing it. However, in the "unaware pieces" picture this would lead into some version of panpsychism (or some kind of more or less symmetrical version where there is a distinct [...]

Consider sleep. The consciousness that goes to sleep ends. There is a discontinuity in perceived time. In the morning, the wakening brain ...

[...] will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist.

You cease to exist every night. Indeed, there are all sorts of disruptions to the continuity [...]

5casebash8y
I don't find the sleep argument convincing. Consciousness has two definitions:
* As opposed to being asleep or unconscious, when the brain is still running and you still have experiences (although they are mostly internal experiences)
* As opposed to being non-sentient like a rock or bacteria
They are distinct issues.
2Usul8y
Thanks for the reply. Sleep is definitely a monkey wrench in the works of my thoughts on this, though not a fatal one for me. I wouldn't count distraction or dissociation, though. I am speaking of the (woo-light alert) awareness at the center of being, a thing that passively receives sensory input (including the sense of mind-activity, and I wonder if that includes non-input?). I do believe that this thing exists and is the best definition of "Self".

I'd argue that a branch of me is still me, in many meaningful ways. This is true for the many-worlds interpretation, where the universe branches, and for multiple simultaneous mes from mechanical copies.

After the copy, my meatself and my electronic self(-ves) will diverge, and will be different entities who only care about each other as others, not as selves. But that's true of cross-temporal and cross-universe entities that have a direct causal relationship as well. I care less about an alternate-world me than about the me I'm currently indexing. I care less about any specific future-me based on how distant it is from my current experiencing.

My expounding of the pattern identity theory elsewhere in the comments is probably a textbook example of what Scott Aaronson calls bullet-swallowing, so just to balance things out I'm going to link to Aaronson's paper Ghost in the Quantum Turing Machine that sketches a very different attack on standard naive patternism. (Previous LW discussion here)

Why do you attach any value whatsoever to a "consciousness" that cannot think, feel, remember, or respond? Your "consciousness", so defined, is as inanimate as a grain of sand. I don't care about grains of sand as ends-in-themselves; why would you?

Be clear that when you say you are conscious, it cannot be this "consciousness" that motivates the statement, because this "consciousness" cannot respond, so the non-conscious parts of your mind cannot query it for a status check. A simple neural spike would be a response; we could watch it on an fMRI.

A scenario not mentioned: my meat self is augmented cybernetically. The augmentations provide for improved, then greatly improved, then vast cognitive enhancements. Additionally, I gain the ability to use various robotic bodies (not necessarily androids) and perhaps other cybernetic bodies. My perceived 'locus' of consciousness/self disassociates from my original meat body. I see through whatever eyes are convenient, act through whatever hands are convenient. The death of my original meat body is a trauma, like losing an eye, but my sense of self is uninterrupted, since its locus has long since shifted to the augmentation cloud.

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when

[...]
0Matthew_Opitz8y
Actually, you've kind of made me want to get my own hemispherectomy and then a re-merging just so that I can experimentally see which side's experiences I experience. I bet you would experience both (but not remember experiencing the other side while you were in the middle of it), and then after the re-merging, you would remember both experiences and they would seem a bit like two different dreams you had.

Sophie Pascal's Choice: yes. If it were an easy, painless death, the required reward would probably have to be on the order of about ten dollars, to make up for the inconvenience to my time. If it were not a painless death, I'd probably require more, but not a huge amount more.

Suppose I'm destructively uploaded. Let's assume also that my consciousness is destroyed, a new consciousness is created for the upload, and there is no continuity. The upload of me will continue to think what I would've thought, feel what I would've felt, choose what I would've chosen, and generally optimize the world in the way I would've. The only thing it would lack is my "original consciousness", which doesn't seem to have any observable effect in the world. Saying that there's no conscious continuity doesn't seem meaningful. The only actual [...]

The idea that "forward facing continuity of consciousness" is tied to a particular physical structure in your brain has been debunked for a long time, for example via incremental replacement of neurons one at a time by robotic neurons which can then have their function distributed over a network.

E.g.

If consciousness has an anatomical location, and therefore is tied to matter, then

is a false assumption; consciousness doesn't necessarily have an anatomical location.

-1Usul8y
Anatomical location meaning neurons in the brain. Not necessarily a discrete brain organelle. To deny consciousness an anatomical location in the brain is to say it arises from something other than brain function. Are you supporting some sort of supernatural theory of consciousness?
1The_Jaded_One8y
No, I am saying that consciousness - like a website or computer program - is a computational phenomenon which isn't irrevocably tied to one piece of hardware. It may currently be instantiated in the particular neurons in your brain, but that could change if the computational functions of those neurons were taken over by other physical devices. Your consciousness could, in principle, be run as a distributed computing project like folding@home.
0[anonymous]8y

For previous discussion of issues related to personal identity on LW see these posts, with references and comments:

I actually don't endorse a lot of these posts' content, but it's more efficient to work from a common background. Being more specific in your questions or statements could also push against the kind of beside-the-point responses you got to this post. For example, a lot of discussion of identity has problems with using words in unclear senses [...]

[This comment is no longer endorsed by its author]