someonewrongonthenet comments on Timeless Identity - Less Wrong

Post author: Eliezer_Yudkowsky 03 June 2008 08:16AM


Comment author: someonewrongonthenet 15 August 2012 07:21:23PM *  2 points

Eliezer...the main issue that keeps me from cryonics is not whether the "real me" wakes up on the other side.

The first question is how accurate the reconstruction will be. When you wipe a hard drive with a magnet, you can sometimes recover part of the content, but usually not all of it. Recovering "some" of a human, but not all of them, could easily create a mentally handicapped, broken consciousness.

But let's set that aside, as it is a technical problem. There is a second issue: if and when immortality and AI are achieved, what value would my revived consciousness contribute to such a society?

You've thus far established that death isn't a bad thing when a copy of the information is preserved and later revived. You've explained that you are willing to treat consciousness much like you would a computer file - you've explained that you would be willing to destroy one of two redundant duplicates of yourself.

Tell me, why exactly is it okay to destroy a redundant duplicate of yourself? You can't say that it's okay to destroy it simply because it is redundant, because that also destroys the point of cryonics. There will be countless humans and AIs that will come into existence, and each of those minds will require resources to maintain. Why is it so important that your, or my, consciousness be one among this swarm? Is that not similarly redundant?

For the same reasons that you would be willing to destroy one of two identical copies of yourself because having two copies is redundant, I am wondering just how much I care that my own consciousness survives forever. My mind is not exceptional among all the possible consciousnesses that resources could be devoted to. Keeping my mind preserved through the ages seems to me just as redundant as making twenty copies of yourself and carefully preserving each one.

I'm not saying I don't want to live forever...I do want to. I'm saying that I feel one ought to have a reason for preserving one's consciousness that goes beyond the simple desire for at least one copy of one's consciousness to continue existing.

When we deconstruct the notion of consciousness as thoroughly as we are doing in this discussion, the concepts of "life" and "death" become meaningless over-approximations, much like "free will". Once society reaches that point, we are going to have to deconstruct those ideas and ask ourselves why it is so important that certain information never be deleted. Otherwise, it's going to get a little silly...a "21st century human brain maximizer" is not that much different from a paperclip maximizer, in the grand scheme of things.

Comment author: hairyfigment 30 September 2013 04:55:39PM 0 points

It seems you place less value on your life than I do on mine. I'm glad we've reached agreement.

Comment author: someonewrongonthenet 01 October 2013 03:14:40AM *  0 points

I agree, it's quite possible that someone might deconstruct "me" and "life" and "death" and "subjective experience" to the same extent that I have, and still value never deleting certain information computationally descended from themselves more than all the other things that could be done with the resources used to maintain it.

Hell, I might value it to that extent. This isn't something I'm certain about. I'm still exploring this. My default answer is to live forever - I just want to make sure that this is really what I want after consideration, and not just a kicking, screaming survival instinct (AKA a first-order preference).