AdeleneDawner comments on Welcome to Heaven - Less Wrong

Post author: denisbider 25 January 2010 11:22PM




Comment author: AdeleneDawner 28 January 2010 06:27:04PM 0 points

I don't think those kinds of situations will need to be spelled out in advance, actually. Coming up with a plan that's acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that's acceptable to just one version, once you're using the right kind of framework to think about it. (You should be about equally willing to take either role, in other words, otherwise your clone is likely to rebel, and since they're considered independent from the get-go (and not bound by any contracts they didn't sign, I assume), there's not much you can do about that.)

Setting up four-way relationships would definitely be interesting. Another scenario that I like is one where you make a clone to pursue an alternate life-path that you suspect might be better but think is too risky - after a year (or whatever), whichever of you is less happy could suicide and give their wealth to the other one, or both could decide that their respective paths are good and continue with half-wealth.

Comment author: Blueberry 28 January 2010 06:46:16PM 0 points

The more I think about this, the more I want to make a bunch of clones of myself. I don't even see why I'd need to destroy them. I shouldn't have to pay for them; they can get their own jobs, so wealth isn't that much of a concern.

Coming up with a plan that's acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that's acceptable to just one version, once you're using the right kind of framework to think about it.

The concern is that immediately after you clone, both copies agree that Copy 1 should live and Copy 2 should die, but afterwards, Copy 2 doesn't want to lose those experiences. If you decide beforehand that you only want one of you around, and Copy 2 is created specifically to be destroyed, there should be a way to bind Copy 2 to suicide.

Comment author: AdeleneDawner 28 January 2010 06:50:38PM 0 points

there should be a way to bind Copy 2 to suicide.

Disagree. I would class that as murder, not suicide, and consider creating a clone who would be subject to such binding to be unethical.

Comment author: Blueberry 28 January 2010 06:56:50PM 0 points

Calling it murder seems extreme, since you end up surviving. What's the difference between binding a copy to suicide and binding yourself to take a sleep-amnesia pill?

Comment author: AdeleneDawner 28 January 2010 07:19:00PM 0 points

If it's not utterly voluntary when committed, I don't class it as suicide. (I also consider 'driving someone to suicide' to actually be murder.)

My solution to the ethical dilemma, to restate it, is to give the clone full human rights from the moment it's created (actually a slightly expanded version of current human rights, since we're currently prohibited from suiciding). I assume that it's not currently possible to enforce a contract that will directly cause one party's death; that aspect of inter-human interaction should remain. The wealth-split serves as a balance in two ways: suddenly having your wealth halved would be traumatic for almost anyone, which gives a clone that had planned to suicide extra impetus to follow through, and it should also strongly discourage people from taking unnecessary risks when making clones. In other words, that's not a bug, it's a feature.

The difference between what you proposed and the sleeping pill scenario is that in the latter, there's never a situation where an individual is deprived of rights.

Comment author: Blueberry 28 January 2010 07:50:56PM 0 points

If it's not utterly voluntary when committed, I don't class it as suicide.

I'm still unclear why you classify it as death at all. You end up surviving it.

I think you're thinking of each copy as an individual. I'm thinking of the copies collectively as a tool used by an individual.

The difference between what you proposed and the sleeping pill scenario is that in the latter, there's never a situation where an individual is deprived of rights.

Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. You have someone there to enforce it if necessary. The next day, you change your mind, and the person forces you to take the pill anyway. Have you been deprived of rights? (If it helps, substitute eating dessert, or gambling, or doing heroin for taking the pill.)

Comment author: AdeleneDawner 28 January 2010 08:02:21PM 0 points

I think you're thinking of each copy as an individual. I'm thinking of the copies collectively as a tool used by an individual.

Yes, I am, and as far as I can tell mine's the accurate model. Each copy is separately alive and conscious; they should no more be treated as the same individual than twins are treated as the same individual. (Otherwise, why is there any ethical question at all?)

Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. ... Have you been deprived of rights?

This kind of question comes up every so often here, and I still haven't heard or thought of an answer that satisfies me. I don't see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn't be coerced.

Comment author: Blueberry 28 January 2010 08:14:43PM 0 points

Yes, I am, and as far as I can tell mine's the accurate model.

But if my copies and I don't think that way, is it still accurate for us? We agree to be bound by any original agreement, and we think any of us are still alive as long as one of us is, so there's no death involved. Well, death of a living organism, but not death of a person.

I don't see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn't be coerced.

It's the same question, because I'm assuming both copy A and copy B agree to be bound by the agreement immediately after copying (which is the same as the original making a plan immediately before copying). Both copies share a past, so if you can be bound by your past agreements, so can each copy. Even if the copies are separate individuals, they don't have separate pasts.

Comment author: AdeleneDawner 28 January 2010 08:30:36PM 1 point

If you and all your copies think that way, then you shouldn't have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that's what you really believe, though? Sure enough to bet 1/2 your wealth?

My concern with having specific copies be bound to past agreements is that I don't trust that people won't abuse that: It's easy not to see the clone as 'yourself', but as an easily exploitable other. Here's a possible solution to that problem (though one that I don't like as well as not having the clone bound by prior agreements at all): Clones can only be bound by prior agreements that randomly determine which one acts as the 'new' clone and which acts as the 'old' clone. So, if you split off a clone to go review a movie for you, and pre-bind the clone to die after reporting back, there's a 50% chance - determined by a coin flip - that it's you, the original, who will review the movie, and the clone who will continue with your life.

Comment author: Blueberry 28 January 2010 09:12:47PM -1 points

There isn't an "original". After the copying, there's Copy A and Copy B. Both are me. I'm fine with randomly selecting whether Copy A or Copy B goes to see the movie, but it doesn't matter, since they're identical (until one sees the movie). In fact, there is no way to not randomly select which copy sees the movie.

From the point of view of the clone who sees the movie (say it's bad), "suiciding" is the same as him going back in time and not seeing the movie. So I'd always stick to a prior agreement in a case like that.

If you and all your copies think that way, then you shouldn't have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that's what you really believe, though? Sure enough to bet 1/2 your wealth?

I don't really have any wealth to speak of. But they're all me. If I won't defect, then they won't. The question is just whether or not we might disagree on what's best for me. In which case, we can either go by prior agreement, or just let them all live. If the other mes really wanted to live, I'd let them. For instance, say I made 5 copies and all 5 of us went out to try different approaches to a career, agreeing the best one would survive. If a year later more than one claimed to have the best result for Blueberry, I might as well let more than one live.

ETA: However, there might be situations where I can only have one copy survive. For instance, I'm in a grad program now that I'd like to finish, and more than one of me can't be enrolled for administrative reasons. So if I really need only one of me, I guess we could decide randomly which one would survive. I'm all right with forcing a copy to suicide if he changes his mind, since I'm making that decision for all the clones ahead of time to lead to the best outcome for Blueberry.

Comment author: pdf23ds 28 January 2010 08:17:25PM 0 points

Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow.

I don't think any such agreement could be legally binding under current law, which is relevant since we're talking about rights.