
jkaufman comments on Does utilitarianism "require" extreme self sacrifice? If not why do people commonly say it does? - Less Wrong Discussion

7 Post author: Princess_Stargirl 09 December 2014 08:32AM


Comments (99)


Comment author: SilentCal 09 December 2014 06:05:24PM 22 points

My view, and a lot of other people here seem to also be getting at this, is that the demandingness objection comes from a misuse of utilitarianism. People want their morality to label things 'permissible' and 'impermissible', and utilitarianism doesn't natively do that. That is, we want boolean-valued morality. The trouble is, Bentham went and gave us a real-valued one. The most common way to get a bool out of that is to label the maximum 'true' and everything else 'false', but that doesn't give a realistically human-followable result. Some philosophers have worked on 'satisficing consequentialism', which is a project to design a better real-to-bool conversion, but I think the correct answer is to learn to use real-valued morality.

There's some oversimplification above (I suspect people have always understood non-boolean morality in some cases), but I think it captures the essential problem.
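The "real-to-bool conversion" framing can be made concrete with a toy sketch. The option names, utility values, and threshold below are invented for illustration and are not from the comment; the point is only the contrast between the maximizing conversion and a satisficing one:

```python
# Hypothetical sketch: two ways to turn a real-valued morality into a
# boolean 'permissible'/'impermissible' label. All numbers are made up.

def utility(option):
    """Toy real-valued moral score for an option (higher is better)."""
    scores = {"donate_all": 100.0, "donate_10_percent": 60.0, "do_nothing": 0.0}
    return scores[option]

options = ["donate_all", "donate_10_percent", "do_nothing"]

# Maximizing conversion: only the utility-maximizing option is 'permissible'.
best = max(options, key=utility)
maximizing = {o: (o == best) for o in options}

# Satisficing conversion: anything above a threshold is 'permissible'.
THRESHOLD = 50.0  # arbitrary cutoff for illustration
satisficing = {o: (utility(o) >= THRESHOLD) for o in options}

print(maximizing)   # only 'donate_all' is True
print(satisficing)  # 'donate_all' and 'donate_10_percent' are True
```

The demandingness objection targets the maximizing conversion, under which `donate_10_percent` is flatly impermissible; the satisficing version is one attempt at a more human-followable cutoff.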

Comment author: Dagon 09 December 2014 07:52:46PM 4 points

Huh? So your view of a moral theory is that it ranks your options, but there's no implication that a moral agent should pick the best known option?

What purpose does such a theory serve? Why would you classify it as a "moral theory" rather than "an interesting numeric exercise"?

Comment author: jkaufman 09 December 2014 08:51:48PM *  1 point

An agent should pick the best options they can get themselves to pick. In practice these will not be the ones that maximize utility as they understand it, but they will have higher utility than if they just did whatever they felt like. And, more strongly, this gives higher utility than if they tried to do as many good things as possible without prioritizing the really important ones.