Sebastian_Hagen2 comments on Timeless Identity - Less Wrong

23 Post author: Eliezer_Yudkowsky 03 June 2008 08:16AM

Comment author: Sebastian_Hagen2 04 June 2008 07:28:00AM 0 points

What if cryonics were phrased as the ability to create an identical twin from your brain at some point in the future, rather than 'you' waking up? If all versions of people are the same, this distinction should be immaterial. But do you think it would have the same appeal to people?

I don't know, and unless you're trying to market it, I don't think it matters. People make silly judgements on many subjects; blindly copying the majority in this society isn't particularly good advice.

Each twin might feel strong regard for the other, but there's no way they would actually be completely indifferent between pain for themselves and pain for their twin.

Any reaction of this kind is either irrational, based on divergence that has already taken place, or based on value systems very different from my own. In real life, you'd probably get a mix of the first two, and possibly also the last, from most people.

If another 'me' were created on Mars and then got a bullet in the head, this would be sad, but no more so than any other death. It wouldn't feel like a life-extending boon when he was created, nor a horrible blow to my immortality when he was destroyed.

For me, this would be a quantitative judgement: it depends on how much both instances have changed since the split. If the time lived before the split is significantly longer than the time after it, I would consider the other instance a near-backup, and judge the relevance of its destruction accordingly. Aside from valuing the other person as a human like any other who also happens to share most of your values, it's effectively like losing the only (and somewhat out-of-date) backup of a very important file: no terrible loss if you can keep the original intact until you can make a new backup, but an increased danger in the meantime.

If you truly believe that 'the same atoms' means it's 'you' in every sense, suppose I'm going to scan you and create an identical copy of you on Mars. Would you immediately transfer half your life savings to a bank account only accessible from Mars? What if I did this a hundred times?

Maybe, maybe not; it depends on the exact strategy I'd mapped out beforehand for what each of the copies will do after the split. If I didn't have enough foresight to do that beforehand, all of my instances would have to agree on a strategy (including the allocation of initial resources) over IRC or a wiki or something, which could get messy with a hundred of them - so please, if you ever do this, give me a week of advance warning. Splitting it up evenly might be ok in the case of two copies (assuming they both have comparable expected financial load and income in the near term), but would fail horribly for a hundred; there just wouldn't be enough money left for any of them to matter at all. (I'm a poor university student, currently; I don't really have "life savings" in transferable format.)