If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with "yes" or "no". You're free to expand on your answer, but first please make sure you give an answer.)
No. It's a dick move. Same question and they're not copies of me? Same answer.
I don't believe that this "I" is particularly changeable
I don't know what you mean by that.
Why can't a perfect copy be you? Doesn't that involve epiphenomenalism? Even if I gave the entire state of the world at some time X in the future, I'd also need to specify which of the identical beings is "you".
It's a sticky topic, consciousness. I edited my post to clarify further:
I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding.
Which I recognize might sound a bit mystical to some, but I believe it is a real thing which is a function of brain activity.
As a function of the brain (or whatever processing medium), consciousness or self is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I'm not sure if any transfer to alternate media is possible. The same matter can't be in two different places. Therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience, but that won't have any intrinsic value to You or me.
If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul), and if both are me, then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about, but I stand by my original position.
The original post stipulated that I was "forced" to terminate all the copies but one; that was the nature of the hypothetical I chose to examine. A hypothetical where the copies aren't deleted would be a totally different situation.
I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical. Sorry if it came off that way.
As you'd probably assume they would based on my OP, my copies, if I'd been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.
Assuming everything I said in my previous comment is true
Sorry, I missed that you were the copier. Sure, I'm the copy. I do not care one bit. My life goes on totally unaffected (assuming the original and I live in unconnected universes). Do I get transhuman immortality? Because that would be awesome for me. If so, I got the long end of the stick. It would have no value to poor old original, nor does anything which happens to him have intrinsic value for me. If you had asked his permission he would have said no.
Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won't make any copies until after you agree. Since the copies haven't been made yet, this ensures that you must be the original, and since you don't care about identical copies (they're technically different people from you), you happily agree. I nod, pull out a gun, and shoot you.
(In the real universe--or at least the universe one level up on the simulation hierarchy--a Mark Friedenbach receives a dollar. This isn't of much comfort to you, of course, seeing as you're dead.)
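(To spell out the arithmetic implicit in that trick, with illustrative numbers of my own: once the trillion copies exist, there are 10^12 + 1 beings who each remember agreeing while being "certain" they were the original. Conditional on having those memories, the chance of actually being the original is 1/(10^12 + 1), roughly 10^-12. The guarantee of being the original held only at the moment of agreement, not afterwards.)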
You shouldn't murder sentient beings or cause them to be murdered by the trillion. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or to the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether or not I even care if I'm a copy depends on various aspects of my personality.
Follow-up question:
Assuming everything I said in my previous comment is true and that I have no incentive to lie to you (but no incentive to tell you the truth, either), would you believe me if I then said you were the copy?
Based on your status as some-guy-on-the-internet and my estimate of the probability that this exact situation could come to be, no, I do not believe you.
To clarify: I do not privilege the original self. I privilege the current self.
To your first points: at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made, its existence is irrelevant; even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.
Can you explain how you know that you're the meat space one? If every observation you make is made by the other one, and both are conscious, how do you know you're not a sim? This is a purely epistemic question.
I'm perfectly happy saying "if I am meat, then fuck sim-me, and if I am sim, fuck meat-me" (assuming selfishness). But if you don't know which one you are, you need to act to benefit both, because you might be both.
On the other hand, if you see the other one, there's no problem fighting it, because the one you're fighting is surely not you. (But even so, if you expect them to do the same as you, then you're in a perfect prisoner's dilemma and should cooperate.)
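A minimal sketch of that twin-dilemma logic, with payoff numbers that are purely illustrative assumptions on my part:

# Twin prisoner's dilemma: both players run the same decision procedure,
# so whatever you choose, your copy necessarily chooses too.
# Payoff numbers below are illustrative assumptions, not from the thread.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def twin_outcome(choice):
    # Your payoff when an identical copy makes the same choice you do.
    return PAYOFFS[(choice, choice)][0]

# Perfectly correlated choices make only the diagonal reachable:
# cooperating yields 3, defecting yields 1, so cooperation wins.
assert twin_outcome("cooperate") > twin_outcome("defect")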
On the other hand, I think that if I clone myself, then do stuff my clone doesn't do, I'd still be less worried about dying than if I had no clone. I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple of days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?
Meat or sim or both-meat aren't issues as I see it. I am self; fuck non-self. Or, not to be a dick: value non-self less than self, certainly on existential issues. "I" am the awareness within this mind. "I" am not memory, thought, feeling, personality, etc. I know I am me, ipso facto, as the observer of the knower. I don't care if I was cloned yesterday or one second ago, and there are many theoretical circumstances where this could be the case. I value the "I" that I currently am, just exactly now. I don't believe that this "I" is particularly changeable. I fear senility because "I" am the entity which will be aware of the unpleasant thoughts and feelings associated with the memory loss, the fear of worsening status, and the eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels, but it is me, whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.
Usul, I just made a virtual copy of you and placed it in a virtual environment that is identical to that of the real you. Now, presumably, you believe that despite the copy being identical to yourself, you are still in some way the privileged "real" Usul. Unfortunately, the copy believes the exact same thing. My question for you is this:
Is there anything you could possibly say to the copy that could convince it that it is, in fact, a copy, and not the real Usul?
Great question. Usul and his copy do not care one bit which is which. But perhaps you could put together a convincing chain of evidence. At which point copy-Usul will still not care.
This sums up some of the problems of mind cloning nicely and readably. It also adds your specific data point that you do not care about the other selves as much as about yourself. I most liked this point about the practical consequences:
Personally, I don't know that I care about that copy. I suppose he could be my ally in life. He could work to achieve any altruistic goals I think I have, perhaps better than I think that you could. He might want to fuck my wife, though. And might be jealous of the time she spends with me rather than him, and he'd probably feel entitled to all my stuff, as would I be vice versa.
But your post falls short in not making some clear distinctions. It doesn't differentiate between copies you do and don't interact with (the above quote vs. Omega-style simulations). It also mixes philosophical aspects with individual and practical aspects, and it is not clear to me how these are meant to explain or inform each other.
Thanks for the reply. Yeah, I think I just threw a bunch of thoughts at the wall to see what would stick. I'm not really thinking too much about the practical so-I've-got-a-copy-now-what? sort of issues. I'm thinking more of the philosophical, perhaps even best categorized as Zen, implications the concept of mind-cloning has for "Who am I?" in the context of changing thoughts, feelings, memories, unconscious conditioned responses, and the hard-to-get-at thing inside (I first typed "observes": bad term, too active) which is aware of all of these without thinking, feeling, remembering, or responding. Because if "I" don't come along for the ride, I don't think it counts as "me", which is especially important for promises of immortality.
If I'm being honest with myself, perhaps I'm doing a bit of pissing on the parades of people who think they have hope for immortality outside of meat, out of jealousy for their self-soothing convictions, however deluded I believe they are. See also "Trolling of Christians by Atheists, Motivations Behind". Cheers.
Edit: And if I'm being entirely honest with myself, I think that shying away from acknowledging that last motivation is the reason why I titled this "Your transhuman self..." and not "Your transhuman immortality...", which would sum up my argument more accurately.
In other words, I could make you believe that you were either the original or the copy simply by telling you you were the original/the copy. This means that before I told you which one you were, you would have been equally comfortable with the prospect of being either one (here I'm using "comfortable" in an epistemic sense--you don't feel as though one possibility is "privileged" over the other). I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!
The pattern theorists answer this by denying this so-called "simple" fact's existence: they say, "There is no fact of the matter as to which one I am, because until our experiences diverge, I am both." You, on the other hand, have no such recourse, because you claim there is a fact of the matter. Why, then, is the information necessary to determine this fact seemingly unavailable to you and available to me, even though it's a fact about your consciousness, not mine?
The genesis of my brain is of no concern as to whether or not I am the consciousness within it. I am, ipso facto. When I say it doesn't matter if I am an original or a copy or a copy of a copy, I mean to say just exactly that. To whom are you speaking when you ask the question "Who are you?" If it is to me, the answer is "Me." I'm sorry that I don't know whether or not I am a copy, but I was UNconscious at the time.
If the copy is B and the original is A, the question of whether I am A or B is irrelevant to the question of whether I am ME, which I am. Ask HIM the same question and HE will say the same, and it will be true coming from his mouth.
If I drug you and place you in a room with two doors, only I would know which of those doors you entered through. This means that before I told you which one you entered, you would have been equally comfortable with the prospect of being either one. I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!