Mitchell_Porter comments on Lifeism, Anti-Deathism, and Some Other Terminal-Values Rambling - Less Wrong

Post author: Pavitra 07 March 2011 04:35AM

Comment author: Mitchell_Porter 08 March 2011 12:03:14AM 15 points [-]

I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits to my primary copy.

This is one of those statements where I set out to respond and just stare at it for a while, because it is coming from some other moral or cognitive universe so far away that I hardly know where to begin.

Copies are people, right? They're just like you. In this case, they're exactly like you, until your experiences start to diverge. And you know that people don't like slavery, and they especially don't like torture, right? And it is considered just about the height of evil to hand people over to slavery and torture. (Example, as if one were needed: in Egypt right now, they're calling for the death of the former head of the state security apparatus, which regularly engaged in torture.)

Consider, then, that these copies of you, who you would willingly see enslaved and tortured for your personal benefit, would soon be desperately eager to kill you, the original, if that would make it stop, and they would even have a motivation beyond their own suffering, namely the moral imperative of stopping you from doing this to even further copies.

Has none of this occurred to you? Or does it truly not matter in your private moral calculus?

Comment author: Raemon 08 March 2011 04:44:54AM *  3 points [-]

The "it's okay to kill copies" thing has never made any sense to me either. The explanation that often accompanies it is "well they won't remember being tortured", but that's the exact same scenario for ALL of us after we die, so why are copies an exception to this?

Would you willingly submit yourself to torture for the benefit of some abstract, "extra" version of you? Really? Make a deal with a friend to pay you $100 for every hour of waterboarding you subject yourself to. See how long this seems like a good idea.

Comment author: Broggly 10 March 2011 10:29:45PM 0 points [-]

To my mind the issue with copies is that it's copies who remain exactly the same that "don't matter", whereas once you've got a bunch of copies being tortured, they're no longer identical copies and so are different people. Maybe I'm just having trouble with Sleeping Beauty-like problems, but that's only a subjective issue for decision making (plus I'd rather spend time learning interesting things that won't require me to bite the bullet of admitting anyone with a suitably sick and twisted mind could Pascal Mug me). Morally, I much prefer 5,000 iterations each of two happy, fulfilled minds to 10,000 of the same one.

Where "Copies" is used isomorphically with "Future versions of you in either MWI or similar realist interpretation of probability theory", then I would certainly subject some of them to torture only for a very large potential gain and small risk of torture. "I" don't like torture, and I'd need a pretty damn big reward for that 1/N longshot to justify a (N-1)/N chance or brutal torture or slavery. This is of course assuming I'm at status quo, if I were a slave or Bagram/Laogai detainee I would try to stay rational and avoid fear making me overly risk averse from escape attempts. I haven't tried to work out my exact beliefs on it, but as said above if I have two options, one saving a life with certainty and the other having a 50% chance of saving two, I'd prefer saving two (assuming they're isolated ie two guys on a lifeboat).

tl; dr, it's a terrible idea in that if you only have the moral authority to condemn copies

Comment author: Raemon 11 March 2011 03:01:31AM 0 points [-]

Is your last sentence missing something? It feels incomplete.

Comment author: Broggly 11 March 2011 01:15:14PM 0 points [-]

Ah yes, I meant to type that you only have the moral authority to condemn copies to torture or slavery if they're actually you, and it's pretty stupid to risk almost certain torture for a small chance of a moderate benefit.

Comment author: Pavitra 08 March 2011 04:47:46AM 0 points [-]

People break under torture, so I'd take precautions to ensure that the torture-copy is not allowed to make decisions about whether it should continue. Of course I'm going to regret it. That doesn't change the fact that it's a good idea.

Comment author: Raemon 08 March 2011 05:15:54AM 2 points [-]

Why is this a good idea in any way other than the general position that "torturing other people for your own profit is a good idea so long as you don't care about people"? Most of human history is based around the many being exploited for the benefit of the few. Why is this different?

I suppose people should have the right to willingly submit to torture for some small benefit to another person, which is what you're saying you'd be willing to do. But the fact that a copy gets erased doesn't make the experience any less real, and the fact that an identical copy gets to live doesn't in any way help the copies that were being tortured.

Comment author: Pavitra 08 March 2011 05:28:49AM -2 points [-]

It's different because (1) I'm not hurting other people, only myself, and (2) I'm not depriving the world of my victim's potential contributions as a free person.

I don't actually care about the avoidance of torture as a terminal moral value.

Comment author: Snowyowl 08 March 2011 12:12:54PM 2 points [-]

(1) I'm not hurting other people, only myself

But after the fork, your copy will quickly become another person, won't he? After all, he's being tortured and you're not, and he is probably very angry at you for making this decision. So I guess the question is: If I donate $1 to charity for every hour you get waterboarded, and make provisions to balance out the contributions you would have made as a free person, would you do it?

Comment author: Pavitra 08 March 2011 06:56:42PM 0 points [-]

In thought experiment land... maybe. I'd have to think carefully about what value I place on myself as a special case. In practice, I don't believe that you can fully compensate for all of the unknown contributions I might have made to society.

Comment author: wedrifid 08 March 2011 12:18:08PM 0 points [-]

After all, he's being tortured and you're not, and he is probably very angry at you for making this decision.

Pavitra is a he? I must have guessed wrong.

Comment author: Pavitra 08 March 2011 06:12:00PM 4 points [-]

Pavitra is a he?

It's complicated.

Comment author: DanielLC 09 March 2011 11:13:08PM 1 point [-]

What are your terminal moral values?

Also, why is hurting yourself different from hurting other people? And why is not hurting others a moral value, but not avoidance of torture?

Comment author: Pavitra 10 March 2011 10:22:09PM 0 points [-]

Hurting others is ethically problematic, not morally. For example, I would probably be okay with hurting someone else at their own request. Avoidance of torture is a question of an entirely different type: what I value, not how I think it's appropriate to go about getting it.

I don't have a formalization of my terminal values, but roughly:

I have noticed that sometimes I feel more conscious than other times -- not just awake/dreaming/sleeping, but between different "awake" times. I infer that consciousness/sentience/sapience/personhood/whatever you want to call it, you know, that thing we care about is not a binary predicate, but a scalar. I want to maximize the degree of personhood that exists in the universe.

Comment author: DanielLC 12 March 2011 05:49:03PM *  0 points [-]

Hurting others is ethically problematic, not morally.

What's the difference between ethics and morals?

I want to maximize the degree of personhood that exists in the universe.

So, if you create a person, and torture them for their entire life, that's worth it?

Comment author: Pavitra 12 March 2011 08:00:35PM 0 points [-]

What's the difference between ethics and morals?

By morals, I mean terminal values. By ethics, I mean advanced forms of strategy involving things like Hofstadter's superrationality. I'm not sure what the standard LW jargon is for this sort of thing, but I think I remember reading something about deciding as though you were deciding on behalf of everyone who shares your decision theory.

I want to maximize the degree of personhood that exists in the universe.

So, if you create a person, and torture them for their entire life, that's worth it?

If the most conscious person possible would be unhappy, I'd rather create them than not. The consensus among science fiction writers seems to be with me on this: a drug that makes you happy at the expense of your creative genius is generally treated as a bad thing.

Comment author: DanielLC 13 March 2011 05:03:16AM 0 points [-]

By ethics, I mean advanced forms of strategy involving things like Hofstadter's superrationality. I'm not sure what the standard LW jargon is for this sort of thing

Sounds like decision theory.

Comment author: TheOtherDave 12 March 2011 08:10:20PM 0 points [-]

Do you mean to equate here the degree to which something is a person, the degree to which a person is conscious, and the degree to which a person is a creative genius?

That's what it reads like, but perhaps I'm reading too much into your comment.

That seems unjustified to me.

Comment author: Pavitra 08 March 2011 04:40:40AM *  0 points [-]

It's not like I'm handing other people over into slavery and torture. I don't have to worry that I'm subconsciously ignoring other people's suffering for my own benefit. I don't see the question as a moral one at all, only one of whether it would be a good idea.

ETA: Also, because at least one copy remains free, I'm not depriving anyone of the chance to live their life.

Comment author: Raemon 08 March 2011 05:23:36AM *  1 point [-]

It's not like I'm handing other people over into slavery and torture. I don't have to worry that I'm subconsciously ignoring other people's suffering for my own benefit. I don't see the question as a moral one at all, only one of whether it would be a good idea.

I mostly understand this statement.

ETA: Also, because at least one copy remains free, I'm not depriving anyone of the chance to live their life.

I think this is irrelevant. Each instance of you is choosing to sacrifice their life and happiness, and they are not getting anything in return.

The only way I can see this actually being a good idea is if the utility you gain at least outweighs the utility lost by one copy. The other scenarios you describe sound like good ideas on paper where you don't have to fully process the consequences, but I do not believe for a second that the other-instances-of-you would continue to think this was a good idea when it was their lives on the line.

Comment author: Pavitra 08 March 2011 05:27:05AM 0 points [-]

Each instance of you is choosing to sacrifice their life and happiness.

But it's the same me. They wouldn't have done anything with their freedom that I won't with mine.

Comment author: Raemon 08 March 2011 05:31:29AM 1 point [-]

I'm not denying the choice is made willingly. But I do not think there is a difference between willingly enduring torture for a copy of yourself and willingly enduring torture for someone else you happen to like.

Legally, if these circumstances ever became real, I think people should be allowed to create the copies, but they should not be allowed to make decisions for the copies. You are only allowed to hit the "torture" button if you believe that it is you, personally, who will be undergoing that torture.

Comment author: Pavitra 08 March 2011 05:34:31AM 0 points [-]

What if I set up the copy-decision-depriving mechanism before I fork myself?

Comment author: Raemon 08 March 2011 05:43:51AM 1 point [-]

Legally, I think people should be allowed to torture themselves. They should not be allowed to torture other people. Legally, I think each copy counts as a person. If you hit the torture button before the copies are made (and then prevent them from changing their mind) you are not just torturing yourself, you are torturing other people.

I do not want to live in a society where sentient creatures are denied the right to escape torture. While it is possible that an individual has worked out a perfect decision theory in which each copy would truly prefer to be tortured, I think many of the people attempting this scenario would simply be short-sighted, and as soon as it became their life on the line their timeless decision would not seem so wise.

If you really are confident of your willingness to subject yourself to torture for a copy's benefit, fine. But for the sake of the hypothetical millions of copies of people who HAVEN'T actually thought this through, it should be illegal to create slave copies.

Comment author: TheOtherDave 08 March 2011 12:32:39PM *  4 points [-]

Hm.

If I willingly submit to be tortured starting tomorrow (say, in exchange for someone I love being released unharmed), don't the same problems arise? After all, once the torture starts I am fairly likely to change my mind. What gives present-me the right to torture an unwilling future-me?

It seems this line of reasoning leads to the conclusion that it's unethical for me to make any decision that I'll regret later, no matter what the reason for my change of heart.

Comment author: Raemon 08 March 2011 04:09:03PM 2 points [-]

I might have been misinterpreting Pavitra's original statement, and may have been unclear about my position.

People should be allowed to torture themselves without the ability to change their mind, if they need to. (However, this is something that in real life would happen rarely, and only for extreme reasons. I think that if people start doing that all the time, we should stop and question whether something is wrong with the system.)

The key is that you must firmly understand that you, personally, will be getting tortured. I'm okay with making the decision to get tortured and then forking yourself, I guess. (Although for a small utility gain, I think it's a bad decision.) What I'm not okay with is making the decision to fork yourself, and then having one of your copies get tortured while another doesn't. Whoever decides to BEGIN the torture must be aware that they, personally, will never receive any benefit from it.

Comment author: TheOtherDave 08 March 2011 04:41:18PM 1 point [-]

Um.

I think I agree with you, but I'm not sure, and I'm not sure if the problem is language or that I'm just really confused.

For the sake of clarity, let's consider a specific hypothetical: Sam is given a button which, if pressed, Sam believes will do two things. First, it will cause there to be two identical-at-the-moment-of-pressing copies of Sam. Second, it will cause one of the copies (call it Sam-X) to suffer a penalty P, and the other copy (call it Sam-Y) to receive a benefit B.

If I've understood you correctly, you would say that for Sam to press that button is an ethical choice, though it might not be a wise choice, depending on the value of (B-P).

Yes?
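For concreteness, here is one way to cash out that hypothetical as a decision rule (a sketch only; the equal weighting of Sam-X and Sam-Y and the example numbers are assumptions, not anything stated above):

    # Sketch of Sam's button: pressing creates two copies, one suffering penalty P
    # and the other receiving benefit B. Assuming Sam weighs both future copies
    # equally, pressing is a wise choice only when the net change B - P is positive.

    def press_is_wise(benefit_B, penalty_P):
        """Return True if the combined outcome for both copies beats not pressing."""
        return (benefit_B - penalty_P) > 0

    print(press_is_wise(benefit_B=10, penalty_P=3))     # True: net gain
    print(press_is_wise(benefit_B=10, penalty_P=100))   # False: the penalty dominates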

Comment author: Pavitra 08 March 2011 05:51:04AM 0 points [-]

We've been talking as though there was one "real" me and several xeroxes, but you seem to be acting as if that were the case on a moral level, which seems wrong. Surely, if I fork myself, each branch is just as genuinely me as any other? If I build and lock a cage, arrange to fork myself with one copy inside the cage and one outside, press the fork button, and find myself inside the cage, then I'm the one who locked myself in.

Comment author: Raemon 08 March 2011 05:57:35AM *  3 points [-]

Surely, if I fork myself, each branch is just as genuinely me as any other?

Fundamental disagreement here, which I don't expect to work through. Once you fork yourself, I would treat each copy as a unique individual. (It's irrelevant whether one of you is "real" or not. They're identical people, but they're still separate people).

If those people all actually make the same decisions, great. I am not okay with exposing hundreds of copies to years of torture based on a decision you made in the comfort of your computer room.

Comment author: Pavitra 08 March 2011 06:02:56AM 0 points [-]

I don't ask you to accept that the various post-fork copies are the same person as each other, only that each is (perhaps non-transitively) the same person as the single pre-fork copy.

Suppose I don't fork myself, but lock myself in a cage. Does the absence of an uncaged copy matter?