
Yet more "stupid" questions - Less Wrong Discussion

7 Post author: NancyLebovitz 28 August 2013 03:58PM




Comment author: [deleted] 28 August 2013 06:50:44PM 6 points

I am confused by discussions about utilitarianism on LessWrong. My understanding, which comes mostly from the SEP article, was that pretty much all variants of utilitarianism are based on the idea that each person's quality of life can be quantified--i.e., that person's "utility"--and that these utilities can be aggregated. Under preference utilitarianism, a person's utility is determined by the degree to which their preferences are satisfied. Under all of the classical formulations of utilitarianism, everyone's utility has the same weight when the aggregation is performed, hence the catchy phrase "greatest good for the greatest number".

However, I have also seen LW posts and comments discuss utilitarianism in relation to how much you should value the lives of people close to you compared to the lives of strangers, and how much you should value abstract things like "freedom" relative to people's lives. This comment thread is one example. These discussions about valuing the lives of others and quantifying abstract values sound a lot like utility maximization under rational choice theory rather than utilitarianism.

So are people conflating utility maximization and utilitarianism, am I getting confused and misunderstanding the distinction, or is something else going on?

Comment author: Kaj_Sotala 28 August 2013 08:14:14PM 10 points

So are people conflating utility maximization and utilitarianism

Often, yes.

Comment author: Douglas_Knight 28 August 2013 10:17:32PM 0 points

It's true that people often conflate utilitarianism with consequentialism, but I don't think that's what's going on here. I think it is quite reasonable to include under "utilitarianism" moral theories that come pretty close, such as ones that weight people differently when aggregating. If people think that raw utilitarianism doesn't describe human morality, isn't it more useful for the term to cover theories that depart from that starting point, rather than only the single theory? Abstract values that are not per-person are harder to fit under the umbrella, but searching for "free" in that post doesn't turn up an example. If your definition is so narrow that you reject Nozick's utility monster as having anything to do with utilitarianism, then your definition is too narrow. Also, the lack of a normalization means that giving everyone "the same weight" does not clearly pin the theory down.
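The normalization point above can be made concrete with a small sketch (illustrative only, not from the thread): a von Neumann-Morgenstern utility function is defined only up to positive affine transformation, so an "equal-weight" sum can flip depending on which equally valid representation each person's utility happens to use.

```python
def total_utility(utilities):
    """Aggregate by simple unweighted summation -- 'the same weight' for all."""
    return sum(utilities)

# Two outcomes, two people; each inner value is one person's utility.
outcome_a = [1.0, 0.0]   # person 1 prefers A
outcome_b = [0.0, 2.0]   # person 2 prefers B

# With these representations, B has the higher total.
print(total_utility(outcome_b) > total_utility(outcome_a))  # True

# Rescale person 2's utility function by 0.25 -- an equally valid
# representation of the very same preferences -- and the verdict flips.
outcome_b_rescaled = [0.0, 0.5]
print(total_utility(outcome_a) > total_utility(outcome_b_rescaled))  # True
```

Without some extra normalization convention, the "equal weight" summation is not a well-defined rule.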

Comment author: blacktrance 28 August 2013 07:55:07PM 0 points

This confused me for a long time too. I ultimately came to the conclusion that "utilitarianism", as that word is usually used by LessWrongers, doesn't have the standard meaning of "an ethical theory that holds some kind of maximization of utils in the world to be the good", and is instead used as something largely synonymous with "consequentialism".

Comment author: ciphergoth 28 August 2013 08:40:54PM 6 points

"Consequentialism" is too broad, "utilitarianism" is too narrow, and "VNM rationality" is too clumsy and not generally thought of as a school of ethical thought.

Comment author: blacktrance 28 August 2013 09:32:16PM 1 point

It sounds like certain forms of egoism.