Comment author: Brillyant 07 January 2016 05:00:31PM 0 points [-]

Neither sleep nor the destructive teleporter (in your example) does away with a brain's memories. In my view, this is a key component of what makes up "identity" for people.

Comment author: Usul 08 January 2016 06:04:24AM *  2 points [-]

If retention of memory is a key component of identity, then what are the implications for identity:

When decades of new memories have been made (if loss of memory = loss of identity, does gain of memory also = change of identity)? When old memories have changed beyond all recognition (unbeknownst to the current rememberer, he doesn't recall Suzy Smith from 1995 in 2015 the same way he recalled her in 2000)? When senile dementia causes gradual loss of memory? When a mild brain injury causes sudden loss of large areas of memory while personality remains unchanged post-injury? When said memory returns?

Tricky stuff, identity. Without a clear continuity to hang it on, why should I care about what happens to me in five minutes, much less five years? Why do I work to benefit me tomorrow more than I do to benefit you next week? That's why I like hanging it on passive conscious awareness (assuming that thing exists), but damned if I know.

Comment author: ike 07 January 2016 05:03:01AM -1 points [-]

To your first points: at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim; it makes no difference). When a copy is made, its existence is irrelevant; even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.

Can you explain how you know that you're the meat space one? If every observation you make is made by the other one, and both are conscious, how do you know you're not a sim? This is a purely epistemic question.

I'm perfectly happy saying "if I am meat, then fuck sim-me, and if I am sim, then fuck meat-me" (assuming selfishness). But if you don't know which one you are, you need to act to benefit both, because you might be both.

On the other hand, if you see the other one, there's no problem fighting it, because the one you're fighting is surely not you. (But even so, if you expect them to do the same as you, then you're in a perfect prisoner's dilemma and should cooperate.)

On the other hand, I think that if I clone myself, then do stuff my clone doesn't do, I'd still be less worried about dying than if I had no clone. I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?

Comment author: Usul 08 January 2016 05:18:13AM 1 point [-]

"I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?"

I think I must have missed this part before. Where I differ is in the idea that a copy is "me" living again; I don't accept that it is, for the reasons previously written. Whether or not a being with a me-identical starting state lives on after I die might be the tiniest of solaces, like a child or a well-respected body of work, but in no way is it "me" living on in any meaningful way that I recognize. I take the exact opposite view on this, though I would agree with an even stronger form of your statement: that "ALL memories are wiped and you live again" (my conditions would require this to read "you continue to live") is marginally more desirable than "you die and that's it". Funny about that.

Comment author: ike 07 January 2016 01:21:55PM 0 points [-]

If I gradually replace every atom in your brain with a different one until none of the original atoms are left, but you still function the same, are you still "you"? If not, at what point did you stop being you?

Have you seen Yudkowsky's series of posts on this?

Comment author: Usul 08 January 2016 04:42:02AM 1 point [-]

I'm familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking the largely proteinaceous components of themselves, and, over a lifetime, there is a no doubt partial, perhaps complete, turnover of the brain's atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that the transfer is a complete one and that my self has been maintained. The problem comes when, by the exact same datum-by-datum process, a copy is made rather than a transfer: run in parallel until completed and then cloven (good luck untying that knot). At the end I have two beings who seem to meet my definition of Me.

However, this argument does not convince me of the contrary position of the sameness of self and copy, and it does nothing to make me care about a me-like copy coming into being a thousand years from now, and does not induce me to step up onto Dr Bowie-Tesla's machine.

At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can't come up with any answer other than "fuck that guy". I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it's going to be that being which remains behind these eyes.

Comment author: moridinamael 06 January 2016 02:53:48PM *  6 points [-]

I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I'm forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, "What a shame, I will have amnesia regarding everything that I experienced since the branching." And the remaining copy will think, "What a shame, I don't remember any of the things I did as those other copies." But nobody will particularly feel that they are going to "die." I think of it more as how memories propagate forward.

If I forked and then the forks persisted for several weeks and accumulated lots of experiences and varying shifts in perspective, I'd be more prone to calling the forks different "people."

Comment author: Usul 08 January 2016 04:02:25AM 1 point [-]

Another thought, on a separate but related issue: "fork" and "copy" could be synonyms for "AI", unless an artificial genesis is part of your definition of AI. Is it a stretch to say that "accomplish some task" and "(accept) termination" could be at least metaphorically synonymous with "stay in the box"?

"If I make 100 AIs they will stay in the box."

(Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don't mean to hound your comment, only that yours was the best comment on which for me to hang this thought.)

Comment author: Risto_Saarelma 07 January 2016 06:26:58PM *  2 points [-]

If pressed, right now I'm leaning towards the matter-based argument, that if consciousness is not magical then it is tied to specific sets of matter. And that a set of matter can not exist in multiple locations. Therefore a single consciousness can not exist in multiple locations. The consciousness A that I am now is in matter A.

So, there are two things we need to track here, and you're not really making a distinction between them. There are individual moments of consciousness, which, yes, probably need to be on a physical substrate that exists in the single location. This is me saying that I'm this moment of conscious experience right now, which manifests in my physical brain. Everybody can be in agreement about this one.

Then there is the continuity of consciousness from moment to moment, which is where the problems show up. This is me saying that I'm the moment of conscious experience in my brain right now, and I'm also going to be the next moment of conscious experience in my brain.

The problems start when you want to say that the moment of consciousness in your brain now and the moment of consciousness in your brain a second in the future are both "your consciousness", while the moment of consciousness in your brain now and the moment of consciousness in your perfect upload a second in the future are not. For the patternist, there is no actual "consciousness" that refers to anything other than the single moments. There is momentary consciousness now, with your memories; then there is momentary consciousness later, with your slightly evolved memories. And on and on. Once you've gone past the single eyeblink of consciousness, you're already gone, and a new you might show up once, never, or many times in the future. There's nothing but the memories that stay in your brain during the gap laying claim to the you-ness of the next moment of consciousness about to show up in a hundred or so milliseconds.

Comment author: Usul 08 January 2016 03:35:07AM 1 point [-]

I'm going to go ahead and continue to disagree with the pattern theorists on this one. Has the inverse of the popular "Omega is a dick with a lame sense of irony" simulation mass-murder scenario been discussed? Omega (truthful) gives you a gun: "Blow your brains out and I'll give the other trillion copies a dollar." It seems the pattern theorist takes the bullet, or Dr Bowie-Tesla's drowning pool, for very little incentive.

The pattern theorists as you describe them would seem to take us also to the endgame of Buddhist ethics (not a Buddhist, not proselytizing for them): You are not thought, you are not feeling, you are not memory, because these things are impermanent and changing. You are the naked awareness at the center of these things in the mind, of which you are aware. (I'm with them so far; here's where I get off.) All sentient beings are points of naked awareness; by definition they are identical (naked, passive); therefore they are the same; therefore even this self does not matter; therefore thou shalt not value the self more than others. At all. On any level. All of which can lead to bricking yourself up in a cave being the correct course of action.

To your understanding, does the pattern theorist (just curious: do you hold to the views you are describing as pattern theory?) define self at all, on any level? Memory seems an absurd place to do so from; likewise personality, or thought (have you heard the nonsense that thought comes up with?). How can a pattern theorist justify valuing self above other? Without a continuous You, we get to the old koan "Who sits before me now? (who/what are You?)"

"Leave me alone and go read up on pattern theory yourself, I'm not your God-damn philosophy teacher." Is a perfectly acceptable response, by the way. No offense will be taken and it would not be an unwarranted reply. I appreciate the time you have taken to discuss this with me already.

Comment author: Slider 07 January 2016 10:38:06AM 0 points [-]

What if the running of two programs is causally separated but they run on common hardware? And when the circuits are separated, their function isn't changed. Is not the entity and awareness of A+B still intact? Can the awareness be compromised without altering function?

Also are lobotomized persons two awarenesses? What is the relevant difference to the subsection circuitry?

Comment author: Usul 08 January 2016 03:03:00AM 1 point [-]

I'm not sure I follow your first point; please clarify for me if my answer doesn't cover it. If you are asking whether multiple completely non-interacting, completely functional minds run on a single processing medium constitute separate awarenesses (consciousnesses), or whether two separate awarenesses could operate with input from a single set of mind-operations, then I would say yes to both. Awareness is a result of data processing: 1s and 0s, neurons interacting as either firing or not. Multiple mind operations can be performed in a single processing substrate, i.e. memory, thought, feeling, which are also results of data processing. If awareness is compromised we have a zombie, open to some discussion as to whether or not other mind functions have been compromised, though it is, of course, generally accepted that behaviorally no change will be apparent.

If the processing media are not communicating, A and B are separate awarenesses. If they are reconnected in such a way that neither can operate independently, then they are a single awareness. As an aside, I suspect any deviation which occurs between the two during separation could result in bugs, up to and including systems failure, unless a separate system exists to handle the reintegration.

I don't believe enough is known about the brain for anyone to answer your second question. Theoretically, if more than one set of cells could be made to function to produce awareness (neuroplasticity may allow this), then a single brain could contain multiple functioning awarenesses. I doubt a lobotomy would produce this effect; more likely the procedure would damage or disable the function of the existing awareness. Corpus callosotomy would be the more likely candidate, but, again, neuroscience is far from giving us the answer. If my brain holds another awareness, I (the one aware of this typing) value myself over the other. That it is inside me rather than elsewhere is irrelevant.

Comment author: Kyre 07 January 2016 05:09:29PM 2 points [-]

Would a slow, cell-by-cell or thought-by-thought / byte-by-byte transfer of my mind to another medium preserve me: one at a time, every new neural action potential is received by a parallel processing medium, which takes over? I want to say the resulting transfer would be the same consciousness as is typing this, but then what if the same slow process were done to make a copy and not a transfer? Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy, and therefore representing a death of the originating version?

I would follow this line of questioning. For example, say someone performs an incremental copy process on you, but the consciousness generated does not know whether or not the original biological consciousness has been destroyed, and has to choose which one to keep. If it chooses the biological one and the biology has been destroyed, bad luck: you are definitely gone. What does your consciousness, running either just on the silicon, or identically on the silicon and in the biology, choose?

Let's say you are informed that there is a 1% chance that the biological version has been destroyed. Well, you're almost certainly fine then: you keep the biological version, the silicon version is destroyed, and you live happily ever after until you become senile and die.

On the other hand, say you are informed that the biological version has definitely been destroyed. On your current theory, this means that the consciousness realises that it has been mistaken about its identity, and is actually only a few minutes old. It's sad that the progenitor person is gone, but it is not suicidal, so it chooses the silicon version.

At what point on the 1% to 100% slider would your consciousness choose the silicon version?

(Hearing the thought experiment of incremental transfer (or, alternatively, duplication) was one of the things that changed my mind from some sort of continuity-identity theory to pattern-identity theory. I remember hearing an interview with Marvin Minsky on a radio program where he described an incremental transfer.)

Comment author: Usul 08 January 2016 02:31:37AM 1 point [-]

I definitely agree that incremental change (which gets stickier with incremental non-destructive duplication) is a sticky point. What I find most problematic to my thesis is a process where every new datum is saved on a new medium, rather than the traditionally-cited cell-by-cell scenario. It's problematic, but nothing in it convinces me to step up to Dr Bowie-Tesla's machine under any circumstances. Would you? How about if, instead of a drowning pool, there were a team of South America's most skilled private and public sector torture experts, who could keep the meat that falls through alive and attentive for decades? Whatever the other implications, the very eyes seeing these words would be the ones pierced by needles. I don't care if the copy gets techno-heaven/infinite utility.

Your thought experiment doesn't really hit the point at issue for me. My answer is always "I want to stay where I am". For silicon to choose meat is for the silicon to cease to exist; for meat to choose silicon is for the meat to cease to exist. I only value the meat right now because that is where I am right now. My only concern is for ME, that is, the one you are talking to, to continue existing. Talk to a being that was copied from me a split second ago and that guy will throw me under the bus just as quickly (allowing for some altruistic puzzles where I do allow that I might care slightly more about him than about a stranger, mostly because I know the guy, he's alright, and I can truly empathize with what he must be going through, e.g. if I'm dying tomorrow anyway and he gets a long happy life; but I may do the same for a stranger). The scenario is simply Russian roulette if you won't accept my "I want to stay put" answer.

Shit, if I came to realize that I was a freshly-minted silicon copy living in a non-maleficent digital playground I would be eternally grateful to Omega, my new God whether It likes it or not, and that meat shmuck who chose to drown his monkey ass just before he realized he'd taken the Devil's Bargain.

Not that "meat" has any meaning other than "separate entity" here. If I am sim-meat I want to stay this piece of sim meat.

Comment author: Risto_Saarelma 07 January 2016 07:18:04AM *  5 points [-]

You're still mostly just arguing for your personal intuition for the continuity theory, though. People have been doing that pretty much as long as we've had fiction about uploads or destructive teleportation, without much progress in the arguments. How would you convince someone sympathetic to the pattern theory that the pattern theory isn't viable?

FWIW, after some earlier discussions about this, I've been meaning to look into Husserl's phenomenology to see if there are some more interesting arguments to be found there. That stuff gets pretty weird and tricky fast though, and might be a dead end anyway.

Comment author: Usul 07 January 2016 08:26:52AM 2 points [-]

Honestly, I'm not sure what other than intuition and subjective experience we have to go on in discussing consciousness. Even the heavy hitters in the philosophy of consciousness don't 100% agree that it exists. I will be the first to admit I don't have the background in pattern theory, or the inclination, to get into a head-to-head with someone who does. If pressed, right now I'm leaning towards the matter-based argument, that if consciousness is not magical then it is tied to specific sets of matter. And that a set of matter can not exist in multiple locations. Therefore a single consciousness can not exist in multiple locations. The consciousness A that I am now is in matter A. If a copy consciousness B is made in matter B and matter A continues to exist, then it is reasonable to state that consciousness A remains in matter A. If matter A is destroyed, there is no reason to assume consciousness A has entered matter B simply because of this. You are in A now. You will never get to B.

So, if it exists, and it is you, you're stuck in the meat. And undeniably, someone gets stuck in the meat.

I imagine differing definitions of You, self, consciousness, etc would queer the deal before we even got started.

Comment author: dxu 07 January 2016 06:43:15AM -2 points [-]

Same question and they're not copies of me? Same answer.

As I'm sure you're aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren't copies of you shows that your reason for saying "no" has nothing to do with the purpose of the question. In particular, telling me that "it's a dick move" does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:

Would someone who shares your views on consciousness but doesn't give a crap about other people say "yes" or "no" to my deal?

Comment author: Usul 07 January 2016 07:50:08AM *  3 points [-]

Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.

Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum, to provide you with further information, perhaps presuming a future question: Neither would I murder trillions of sentient not-me copies.

New question: Yes, an amoral dick who shares my views on consciousness would say yes.


Comment author: Usul 07 January 2016 07:33:20AM 5 points [-]

I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?

Sounds more sinister my way.
