cousin_it comments on Deontological Decision Theory and The Solution to Morality - Less Wrong


Comment author: cousin_it 10 January 2011 06:15:15PM 5 points

How about the original form of the dilemma? Would you flip a switch to divert the trolley to a track with 1 person tied to it instead of 5?

Comment author: Alicorn 10 January 2011 06:28:31PM 3 points

No.

(However, if there are 5 people total, and I can arrange for the train to run over only one of those same people instead of all five, then I'll flip the switch on the grounds that the one person is unsalvageable.)

Comment author: JGWeissman 10 January 2011 06:42:10PM 5 points

I would predict that if the switch were initially set to send the trolley down the track with one person, you also would not flip it.

But suppose that you first see the two paths with people tied to the track, and you have not yet observed the position of the switch. As you look towards it, is there any particular position that you hope the switch is in?

Comment author: Alicorn 10 January 2011 06:47:06PM 1 point

I might have such hopes, if I had a way to differentiate between the people.

(And above, when I make statements about what I would do in trolley problems, I'm just phrasing normative principles in the first person. Sufficiently powerful prudential considerations could impel me to act wrongly. For instance, I might switch a trolley away from my sister and towards a stranger just because I care about my sister more.)

Comment author: Vladimir_Nesov 10 January 2011 07:02:20PM * 5 points

Find the point of balance, where the decision swings. What about your sister vs. 2 people? Your sister vs. a million people? Say the balance falls at N people, so you value N+1 strangers more than your sister and N strangers less. Then N+1 people can stand in for your sister in the variant with 1 person on the other track: just as you'd reroute the train away from your sister and toward a random stranger, you'd reroute it away from N+1 strangers (whom you value even more) and toward one stranger.

Then work back from that. If you'd reroute away from N+1 people and toward 1 person, there must be some threshold M such that you won't reroute away from M people but would for any k > M. And right at that threshold you have a weak trolley problem, closer to the original formulation.

(This is not the strongest problem with your argument, but an easy one, and a step towards seeing the central problem.)
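
A minimal sketch of this threshold argument, in Python. The toy value model (one unit per stranger, a fixed value for the sister) and the search bound are illustrative assumptions, not anything specified above:

    def prefers_strangers(n_strangers, sister_value=1000.0):
        # Toy model: each stranger counts for 1 unit; the sister counts for
        # sister_value units. True iff n_strangers outweigh the sister.
        return n_strangers * 1.0 > sister_value

    def find_balance_point(upper=10**9):
        # Binary search for the balance N: N+1 strangers are preferred to
        # the sister, N strangers are not.
        lo, hi = 0, upper
        while lo < hi:
            mid = (lo + hi) // 2
            if prefers_strangers(mid + 1):
                hi = mid
            else:
                lo = mid + 1
        return lo

    N = find_balance_point()  # 1000 under this toy model
    # By the substitution step above, you'd then reroute the train away
    # from N+1 strangers and toward 1 stranger.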

Comment author: Alicorn 10 January 2011 07:17:19PM 4 points

Um, my prudential considerations do indeed work more or less consequentialistically. That's not news to me. They just aren't morality.

Comment author: jimrandomh 10 January 2011 07:25:26PM 6 points

Wait a second - is there a difference of definitions here? That sounds a lot like what you'd get if you started with a mixed consequentialist and deontological morality, drew a boundary around the consequentialist parts and relabeled them not-morality, but didn't actually stop following them.

Comment author: shokwave 10 January 2011 07:29:20PM 2 points

I presume prudential concerns are non-moral concerns. Just as maintaining an entertainment budget alongside your charity budget, while kids are starving in poorer countries, is not often considered a gross moral failure, I would consider the desire for entertainment to be a prudential concern that overrides or outweighs morality.

Comment author: Alicorn 10 January 2011 07:28:23PM 1 point

I guess that would yield something similar. It usually looks to me like consequentialists just care about the thing I call "prudence" and not at all about the thing I call "morality".

Comment author: TheOtherDave 10 January 2011 08:35:18PM 1 point

That seems like a reasonable summary to me. Does it seem to you that we ought to? (Care about morality, that is.)

Comment author: Alicorn 10 January 2011 09:10:39PM 1 point

I think you ought to do morally right things; caring per se doesn't seem necessary.

Comment author: jimrandomh 10 January 2011 07:41:10PM 0 points

Does the importance of prudence ever scale without bound, such that it dominates all moral concerns if the stakes get high enough?

Comment author: Alicorn 10 January 2011 07:50:24PM 0 points

I don't know about all moral concerns. A subset of moral concerns is duplicated and folded into my prudential ones.

Comment author: Vladimir_Nesov 10 January 2011 07:44:35PM 3 points

Can't parse.

Comment author: Alicorn 10 January 2011 07:49:39PM 0 points

Easy reader version for consequentialists: I'm like a consequentialist with a cherry on top. I think this cherry on top is very, very important, and like to borrow moralistic terminology to talk about it. Its presence makes me a very bad consequentialist sometimes, but I think that's fine.

Comment author: Vladimir_Nesov 10 January 2011 08:06:13PM * 4 points

"Its presence makes me a very bad consequentialist sometimes, but I think that's fine."

If this cherry on top costs people their lives, it's not "fine"; it's evil incarnate. You should cut this part of yourself out without mercy.

(Compare with your Luminosity vampires, who are sometimes good, nice people, even though they eat people.)

Comment author: jimrandomh 10 January 2011 08:36:14PM 3 points

I don't think cutting out deontology entirely would be a good thing. I do think that the relative weights of deontological and consequentialist rules need to be considered, and that choosing inaction in a 5 lives:1 life trolley problem strongly suggests misweighting. But that's just a thought experiment, and I wouldn't consider it wrong to choose inaction in, say, a 1.2 lives:1 life trolley problem.

Comment author: Armok_GoB 13 January 2011 10:49:50PM -1 points

I interpret this as saying that he currently acts consequentialist, but feels guilty after breaking a deontological principle, would behave in a more deontological fashion if he had more willpower, and would self-modify to be purely deontological if he had the chance. Is this correct?

Comment author: Alicorn 13 January 2011 11:20:34PM 1 point

Who are you talking about?

Comment author: jimrandomh 10 January 2011 07:00:11PM * 3 points

What if it were 50 people? 500? 5*10^6? The remainder of all humanity?

My own position is that morality should incorporate both deontological and consequentialist terms, but they scale at different rates, so that deontology dominates when the stakes are very small and consequentialism dominates when the stakes are very large.
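
A toy numeric illustration of this scaling claim, in Python; the functional forms and the flat penalty of 100 are assumptions for the sake of example, not a worked-out proposal:

    def action_score(net_lives_saved, violates_rule):
        # The consequentialist term scales linearly with the stakes; the
        # deontological penalty is a constant, so it dominates when the
        # stakes are small and is dominated when they are large.
        consequentialist = float(net_lives_saved)
        deontological = -100.0 if violates_rule else 0.0
        return consequentialist + deontological

    for net_saved in (4, 49, 499, 5 * 10**6 - 1):
        flip = action_score(net_saved, violates_rule=True)
        stay = action_score(0, violates_rule=False)
        print(net_saved, "flip" if flip > stay else "don't flip")
    # prints: 4 don't flip, 49 don't flip, 499 flip, 4999999 flip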

Comment author: Alicorn 10 January 2011 07:13:38PM * 4 points

I am obliged to act based on my best information about the situation. If that best information tells me that:

  • I have no special positive obligations to anyone involved,

  • The one person is neither willing to be run over to save the others, nor simply willing to be run over (e.g. because ey is suicidal), and

  • The one person is not morally responsible for the situation at hand or for any other wrong act such that ey has waived eir right to life,

then I am obliged to let the trolley go. However, I have low priors on most humans being so very uninterested in helping others (or at least in having an infrastructure to live in) that they wouldn't be willing to die to save the entire rest of the human species. So if that were really the stake at hand, the lone person tied to the track would have to be loudly announcing "I am a selfish bastard and I'd rather be the last human alive than die to save everyone else in the world!".

And, again, prudential concerns would probably kick in, most likely well before there were hundreds of people on the line.

Comment author: Yoreth 10 January 2011 09:42:16PM * 0 points

Would it be correct to say that, insofar as you would hope that the one person would be willing to sacrifice his/her life to save the 5*10^6 others, you yourself would pull the switch and then willingly submit to the death penalty (or whatever penalty there is for murder) for the same cause?

Comment author: Alicorn 10 January 2011 09:46:36PM * 2 points

I'd be willing to die (including as part of a legal sentence) to save that many people. (Not that I wouldn't avoid dying if I could, but if that were a necessary part of the saving-people process I'd still enact said process.) I wouldn't kill someone I believed unwilling, even for the same purpose, including via trolley.

Comment author: shokwave 10 January 2011 06:47:10PM 2 points

I feel like the difference between "No matter what, this person will die" and "No matter what, one person will die" is very subtle. It seems like you could arrange thought experiments that trample this distinction. Would that pose a problem?

Comment author: Alicorn 10 January 2011 06:56:09PM * 6 points

I don't remember the details, but while I was at the SIAI house I was presented with some very elaborate thought experiments that attempted something like this. I derived the answer my system gives and announced it, and everyone made outraged noises; but they also made outraged noises when I answered standard trolley problems, so I'm not sure to what extent I should consider that a remarkable feature of those thought experiments. Do you have one in mind you'd like me to reply to?

Comment author: shokwave 10 January 2011 07:16:50PM * 3 points

Not really. I am mildly opposed to asking trolley problem questions. I mostly just observed that, in my brain, there wasn't much difference between:

Set of 5 people where either 1 dies or 5 die.
Set of 6 people where either 1 dies or 5 die.

I wasn't sure exactly what work the word 'unsalvageable' was doing: was it that this person cannot in principle be saved, so eir life is 'not counted', and really you have

Set of 4 people where either none die or 4 die?

Comment author: Alicorn 10 January 2011 07:18:13PM * 3 points

Yes, that's the idea.

Comment author: shokwave 10 January 2011 07:21:44PM 3 points

I see. My brain automatically does the math for me and sees "1 or 5" as equivalent to "none or 4". I think it assumes that human lives are fungible or something.

Comment author: Will_Sawin 12 January 2011 02:41:15AM 4 points

That's a good brain. Pat it or something.