Blueberry comments on Welcome to Heaven - Less Wrong
This seems like a good solution. If I cloned myself, I'd want it to be established beforehand which copy would stay around, and which copy would go away. For instance, if you're going to make a copy that goes to watch a movie to see if the movie is worth your time, the copy that watches the movie should go away, because if it's good the surviving version of yourself will watch it anyway.
I (and thus my clones) don't see it as suicide, more like amnesia, so we'd have no problem going through with it if the benefit outweighed the amnesia.
If you keep the clone around, in terms of splitting their wealth, both clones can work and make money, so you should get about twice the income for less than twice the expenses (you could share some things). As for relationships, you could always bring both clones into one. A four-way relationship, made up of two copies of each original person, might be interesting.
Hmm... *Imagines such a relationship with significant other.* Holy hell that would be weird. The amount of puzzling scenarios I can think of just by sitting here is extravagant. Does anyone know of a decent novel based on this premise?
I don't think those kinds of situations will need to be spelled out in advance, actually. Coming up with a plan that's acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that's acceptable to just one version, once you're using the right kind of framework to think about it. In other words, you should be about equally willing to take either role; otherwise your clone is likely to rebel, and since they're considered independent from the get-go (and not bound by any contracts they didn't sign, I assume), there's not much you can do about that.
Setting up four-way relationships would definitely be interesting. Another scenario that I like is one where you make a clone to pursue an alternate life-path that you suspect might be better but think is too risky - after a year (or whatever), whichever of you is less happy could suicide and give their wealth to the other one, or both could decide that their respective paths are good and continue with half-wealth.
The more I think about this, the more I want to make a bunch of clones of myself. I don't even see why I'd need to destroy them. I shouldn't have to pay for them; they can get their own jobs, so wealth isn't that much of a concern.
The concern is that immediately after you clone, both copies agree that Copy 1 should live and Copy 2 should die, but afterwards, Copy 2 doesn't want to lose those experiences. If you decide beforehand that you only want one of you around, and Copy 2 is created specifically to be destroyed, there should be a way to bind Copy 2 to suicide.
Disagree. I would class that as murder, not suicide, and consider creating a clone who would be subject to such binding to be unethical.
Calling it murder seems extreme, since you end up surviving. What's the difference between binding a copy to suicide and binding yourself to take a sleep-amnesia pill?
If it's not utterly voluntary when committed, I don't class it as suicide. (I also consider 'driving someone to suicide' to actually be murder.)
My solution to the ethical dilemma, to reword it, is to give the clone full human rights from the moment it's created (actually a slightly expanded version of current human rights, since we're currently prohibited from suiciding). I assume that it's not currently possible to enforce a contract that will directly cause one party's death; that aspect of inter-human interaction should remain. The wealth-split serves as a balance in two ways: suddenly having your wealth halved would be traumatic for almost anyone, which gives a clone that had planned to suicide extra impetus to follow through, and the prospect should strongly discourage people from taking unnecessary risks when making clones. In other words, that's not a bug, it's a feature.
The difference between what you proposed and the sleeping pill scenario is that in the latter, there's never a situation where an individual is deprived of rights.
I'm still unclear on why you classify it as death at all, since you end up surviving it.
I think you're thinking of each copy as an individual. I'm thinking of the copies collectively as a tool used by an individual.
Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. You have someone there to enforce it if necessary. The next day, you change your mind, and the person forces you to take the pill anyway. Have you been deprived of rights? (If it helps, substitute eating dessert, or gambling, or doing heroin for taking the pill.)
Yes, I am, and as far as I can tell mine's the accurate model. Each copy is separately alive and conscious; they should no more be treated as the same individual than twins are treated as the same individual. (Otherwise, why is there any ethical question at all?)
This kind of question comes up every so often here, and I still haven't heard or thought of an answer that satisfies me. I don't see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn't be coerced.
But if my copies and I don't think that way, is it still accurate for us? We agree to be bound by any original agreement, and we think any of us are still alive as long as one of us is, so there's no death involved. Well, death of a living organism, but not death of a person.
It's the same question, because I'm assuming both copy A and copy B agree to be bound by the agreement immediately after copying (which is the same as the original making a plan immediately before copying). Both copies share a past, so if you can be bound by your past agreements, so can each copy. Even if the copies are separate individuals, they don't have separate pasts.
If you and all your copies think that way, then you shouldn't have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that's what you really believe, though? Sure enough to bet 1/2 your wealth?
My concern with having specific copies be bound to past agreements is that I don't trust that people won't abuse that: It's easy not to see the clone as 'yourself', but as an easily exploitable other. Here's a possible solution to that problem (though one that I don't like as well as not having the clone bound by prior agreements at all): Clones can only be bound by prior agreements that randomly determine which one acts as the 'new' clone and which acts as the 'old' clone. So, if you split off a clone to go review a movie for you, and pre-bind the clone to die after reporting back, there's a 50% chance - determined by a coin flip - that it's you, the original, who will review the movie, and the clone who will continue with your life.
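The coin-flip rule above could be sketched mechanically. This is a hypothetical illustration only; the function name and role labels are invented for the example, and the point is simply that the role assignment is decided by a fair random draw rather than by which body is "the original":

```python
import random

def assign_roles(continuing_role, expendable_role):
    """Randomly decide which physical copy takes which role after a
    split, so a clone can't be created specifically to get the worse
    deal: either copy is equally likely to end up as the 'original'."""
    roles = [continuing_role, expendable_role]
    random.shuffle(roles)  # fair coin flip between the two copies
    copy_a_role, copy_b_role = roles
    return copy_a_role, copy_b_role

# Example: one copy continues your life, the other reviews the movie;
# which physical body does which is determined only by the coin flip.
a, b = assign_roles("continue life", "review movie, then report back")
```

Under this rule, any pre-binding agreement has to be one you'd accept from behind the coin flip, which is what makes exploitation of the "new" copy unattractive.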
I don't think any such agreement could be legally binding under current law, which is relevant since we're talking about rights.