caffemacchiavelli comments on A rational approach to the issue of permanent death-prevention - Less Wrong

-4 Post author: Nanashi 11 February 2015 12:22PM


Comment author: caffemacchiavelli 11 February 2015 02:30:37PM 3 points

Even if you, personally, happen to die, you've still got a copy of yourself in backup that some future generation will hopefully be able to reconstruct.

Is there a consensus on the whole brain backup identity issue?

I can't say that trying to come up with intuition pumps about life extension has made me less confused about consciousness, but it does seem fairly obvious to me that if I'm backing up my brain, I'm just creating a second version who shares my values and capacities, not actually extending the life of version A. Being able to have both versions alive at the same time seems a clear indicator that they're not the same, and that when source A dies, copy B just goes on with their life and doesn't suddenly become A.

Unfortunately, I'm not sure the same argument doesn't apply to one brain at different points in time, too. If you atomize my brain now and put it back together later, am I still A, or is A dead? What about a coma, sleep, or any other interruption of consciousness?

It's all kind of a blur to me.

Comment author: SodaPopinski 11 February 2015 04:36:51PM 3 points

The idea of a persistent personal identity has no physical basis. I am not questioning consciousness; I am only saying that the mental construct that there is ownership of some particular sequence of conscious feelings over time is inconsistent with reality (as I would argue all the teleporter-type thought experiments show). So in my view, all that matters is how much a certain entity X decides (or instinctually feels) it should care about some similar-seeming later entity Y.

Comment author: Nanashi 11 February 2015 02:57:44PM 1 point

Is there a consensus on the whole brain backup identity issue?

No, and thank you for pointing out the potential for confusion in this post. I have edited some key wording: "results in the continuation of the perception of consciousness" has now been changed to "results in a perception of consciousness functionally indistinguishable to an outside observer," which much more closely reflects my intent.

So in other words, if John Doe went into a locked room, created a copy of himself, incinerated the original version, disposed of all the ashes, and then walked out of the room, the copy would be indistinguishable from the original John Doe from your perspective as an outside observer.

How John Doe himself perceives that interaction is an extremely difficult question to answer (or even to really formulate scientifically).

Comment author: [deleted] 12 February 2015 03:56:15PM 0 points

How John Doe himself perceives that interaction is an extremely difficult question to answer (or even to really formulate scientifically).

But that does not make it any less relevant a question.

Comment author: Lumifer 11 February 2015 05:03:29PM 0 points

from your perspective as an outside observer

"Outside observers" can be very different. You probably need to define that observer a bit better.

Comment author: [deleted] 12 February 2015 03:54:39PM 0 points

Is there a consensus on the whole brain backup identity issue?

NO.

There are many like me who see what the OP advocates as a gigantic holocaust. "Murder the entire population of the world and replace them with artificial copies" is a terrifying outcome.