Nornagest comments on 'Effective Altruism' as utilitarian equivocation. - Less Wrong Discussion

1 Post author: Dias 24 November 2013 06:35PM

Comment author: Nornagest 26 November 2013 07:17:01PM, 0 points

Utilitarianism isn't extreme altruism. It's just a way of trying to quantify morality; it doesn't decide what you care about. I'm pretty tired of people reacting to the concept of utilitarianism with "Oh shit, does that mean I need to give away all my money and live subsistence-style to be a good person!?" A selfish utilitarian is just as possible as an extremely altruistic one, or one who's moderately altruistic.

There's enough ambiguity here that I'm not totally sure, but it sounds like you're describing consequentialist ethics, not utilitarianism as such. Utilitarianisms vary in their details, but they all imply that people's utility is fungible, including that of their adherents: a change in happiness, fulfillment, or preference satisfaction is just as significant whether it applies to you or to, say, a bricklayer's son living in a malarial part of Burkina Faso.

It's certainly possible to espouse utilitarian ethics and still prioritize your own utility in practice. But that's inconsistent -- aside from a few quibbles regarding asymmetric information -- with being a good person by that standard, if the standard means anything at all.

Comment author: drethelin 26 November 2013 07:27:24PM, 0 points

I've always thought of utilitarianism as an effort to quantify "good" and as a framework for making moral decisions rather than an imperative. E.g., the concept of a utility function is a subset of utilitarian theory but does not presuppose utilitarian base motivations: someone's utility function can consist of their desire to maximize welfare as well as their desires for hope and honor and whatnot.

It's become increasingly clear that very few people think about it this way.

Comment author: Nisan 27 November 2013 12:13:23AM, 2 points

Yep, see the SEP on Utilitarianism and the LW wiki on utility functions.