
Coscott comments on Open Thread for February 11 - 17 - Less Wrong Discussion

3 Post author: Coscott 11 February 2014 06:08PM




Comment author: Coscott 11 February 2014 07:47:06PM 0 points

I don't know how to parse that question.

I am claiming that people with no empathy at all can agree to work towards utilitarianism, for the same reason they can agree to cooperate in the repeated prisoner's dilemma.
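The iterated-prisoner's-dilemma claim above can be illustrated with a toy simulation. This is a sketch, not anything from the thread: the payoff values and the strategies (tit-for-tat versus unconditional defection) are standard textbook assumptions chosen for illustration.

```python
# Toy iterated prisoner's dilemma: two purely self-interested agents
# still score higher by sustaining cooperation than by always defecting.
# Payoffs per round (row, column): CC -> (3, 3), DD -> (1, 1),
# defecting against a cooperator -> (5, 0).
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def tit_for_tat(opponent_history):
    # Cooperate first, then copy the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strat_a, strat_b, rounds=100):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(hist_b)  # each strategy sees only the opponent's moves
        b = strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b
```

Two tit-for-tat players lock into mutual cooperation and earn 300 points each over 100 rounds, versus 100 each for two unconditional defectors: cooperation emerges from self-interest alone, with no empathy anywhere in the model.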

Comment author: Lumifer 11 February 2014 08:04:53PM *  2 points

I am claiming that people with no empathy at all can agree to work towards utilitarianism, for the same reason they can agree to cooperate in the repeated prisoner's dilemma.

I don't understand why this is an argument in favor of utilitarianism.

A bunch of people can agree to work towards pretty much anything, for example getting rid of the unclean/heretics/untermenschen/etc.

Comment author: Coscott 11 February 2014 08:09:49PM 0 points

I think you are taking this sentence out of context. I am not trying to present an argument in favor of utilitarianism. I was trying to explain why empathy is not necessary for utilitarianism.

I interpreted the question as "Why (other than my empathy) should I try to maximize other people's utility?"

Comment author: Lumifer 11 February 2014 08:24:46PM 2 points

I interpreted the question as "Why (other than my empathy) should I try to maximize other people's utility?"

Right, and here is your answer:

You can band together lots of people to work together towards the same utilitarianism.

I don't understand why this is a reason "to maximize other people's utility".

Comment author: Coscott 11 February 2014 08:28:34PM 0 points

You can entangle your own utility with others' utility, so that what maximizes your utility also maximizes theirs and vice versa. Your terminal value does not change to maximizing other people's utility, but it becomes a side effect.
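The entanglement idea can be sketched numerically. This is a toy model of my own construction, not anything stated in the thread: the payoff numbers and the "stake" mechanism (each agent holding a contractual share of the other's payoff) are illustrative assumptions.

```python
# Toy model of "entangled" utilities: the agent's own utility includes
# a stake in the other party's payoff, e.g. via a contract.
# Hypothetical outcomes, as (my_payoff, your_payoff) pairs.
OUTCOMES = {"selfish": (5, 0), "cooperative": (4, 4)}

def entangled_utility(own_payoff, other_payoff, stake):
    # Own utility = own payoff plus a contractual share of the other's.
    return own_payoff + stake * other_payoff

def best_action(stake):
    # Pick the action maximizing the agent's OWN (entangled) utility.
    return max(OUTCOMES, key=lambda a: entangled_utility(*OUTCOMES[a], stake))
```

With no stake the agent grabs the selfish outcome (utility 5 beats 4); with a 50% stake, the cooperative outcome maximizes its own utility (4 + 0.5·4 = 6 beats 5), and the other party's payoff rises purely as a side effect of self-interested maximization.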

Comment author: Lumifer 11 February 2014 08:31:36PM 2 points

So you are basically saying that sometimes it is in your own self-interest ("own utility") to cooperate with other people. Sure, that's a pretty obvious observation. I still don't see how it leads to utilitarianism.

If your terminal value is still self-interest but it so happens that there is a side effect of increasing other people's utility -- that doesn't look like utilitarianism to me.

Comment author: Coscott 11 February 2014 08:54:12PM 0 points

I was only trying to make the obvious observation.

Just trying to satisfy your empathy does not really look like pure utilitarianism either.

Comment author: [deleted] 11 February 2014 08:16:02PM 0 points

There's no need to parse it anymore; I didn't get your comment initially.

for the same reason they can agree to cooperate in the repeated prisoner's dilemma.

I agree theoretically, but I doubt that utilitarianism can bring more value to an egoistic agent than being egoistic without regard to other humans' happiness.

Comment author: Coscott 11 February 2014 08:22:16PM 2 points

I agree in the short term, but many of my long term goals (e.g. not dying) require lots of cooperation.