army1987 comments on Timeless Identity - Less Wrong
Eliezer...the main issue that keeps me from cryonics is not whether the "real me" wakes up on the other side.
The first question is about how accurate the reconstruction will be. When you wipe a hard drive with a magnet, you can recover some of the content, but usually not all of it. Recovering "some" of a human, but not all of it, could easily create a mentally handicapped, broken consciousness.
But let's set that aside, as it is a technical problem. There is a second issue. If and when immortality and AI are achieved, what value would my revived consciousness contribute to such a society?
You've thus far established that death isn't a bad thing when a copy of the information is preserved and later revived. You've explained that you are willing to treat consciousness much like you would a computer file - you've explained that you would be willing to destroy one of two redundant duplicates of yourself.
Tell me, why exactly is it okay to destroy a redundant duplicate of yourself? You can't say that it's okay to destroy it simply because it is redundant, because that also destroys the point of cryonics. There will be countless humans and AIs that will come into existence, and each of those minds will require resources to maintain. Why is it so important that your, or my, consciousness be one among this swarm? Is that not similarly redundant?
For the same reasons that you would be willing to destroy one of two identical copies of yourself because having two copies is redundant, I am wondering just how much I care that my own consciousness survives forever. My mind is not exceptional among all the possible consciousnesses that resources could be devoted to. Keeping my mind preserved through the ages seems to me just as redundant as making twenty copies of yourself and carefully preserving each one.
I'm not saying I don't want to live forever...I do want to. I'm saying that I feel one ought to have a reason for preserving one's consciousness that goes beyond the simple desire for at least one copy of one's consciousness to continue existing.
When we deconstruct the notion of consciousness as thoroughly as we are doing in this discussion, the concepts of "life" and "death" become meaningless over-approximations, much like "free will". Once society reaches that point, we are going to have to deconstruct those ideas and ask ourselves why it is so important that certain information never be deleted. Otherwise, it's going to get a little silly...a "21st century human brain maximizer" is not that much different from a paperclip maximizer, in the grand scheme of things.
How do you go to sleep at night, not knowing if it is the "real you" that wakes up on the other side of consciousness?
Your comment would make more sense to me if I removed the word "not" from the sentence you quote. (Also, if I don't read past that sentence of someonewrongonthenet's comment.)
That said, I agree completely that the kinds of vague identity concerns about cryonics that the quoted sentence (with "not" removed) would be raising would, were one consistent, also arise about routine continuation of existence over time.
There are things that are very nearly preserved when I go to bed and wake up eight hours later, but that wouldn't be if I woke up sixty years later, e.g. other people's memories of me (see I Am a Strange Loop) or the culture of the place where I live (see Good Bye, Lenin!).
(I'm not saying whether this is one of the main reasons why I'm not signed up for cryonics.)
Point.