Viliam_Bur comments on Open Thread for February 11 - 17 - Less Wrong Discussion

3 Post author: Coscott 11 February 2014 06:08PM

Comment author: Viliam_Bur 11 February 2014 08:16:18PM -2 points

I guess the reason is maximizing one's utility function, in general. Empathy is just one component of the utility function (for those agents who feel it).

If multiple agents share the same utility function, and they know it, it should make their cooperation easier, because they only have to agree on facts and models of the world; they don't have to "fight" against each other.

Comment author: [deleted] 12 February 2014 09:17:01PM 1 point

Apparently, we mean different things by "utilitarianism". I meant a moral system whose terminal goal is to maximize pleasure and minimize suffering in the whole world, while you're talking about an agent's utility function, which may have no regard for pleasure and suffering.

I agree, though, that it makes sense to try to maximize one's utility function, but to me that's just egoism.