timtyler comments on Model Uncertainty, Pascalian Reasoning and Utilitarianism - Less Wrong

23 Post author: multifoliaterose 14 June 2011 03:19AM




Comment author: XiXiDu 15 June 2011 10:46:53AM 4 points

I don't see how you can get people to stop talking about human utility functions unless you close LW off from newcomers.

I was pretty happy before LW, until I learnt about utility maximization. It tells me that I ought to do what I don't want to do on anything other than a highly abstract intellectual level. I don't even get the smallest bit of satisfaction out of it, just depression.

Saving galactic civilizations from superhuman monsters burning the cosmic commons, walking into death camps so as to reduce the likelihood of being blackmailed, discounting people by the length of their address in the multiverse...taking all that seriously and keeping one's sanity, that's difficult for some people.

What LW means by 'rationality' is to win in a hard-to-grasp sense that is often completely detached from the happiness and desires of the individual.

Comment author: timtyler 15 June 2011 12:22:14PM 3 points

I was pretty happy before LW, until I learnt about utility maximization. It tells me that I ought to do what I don't want to do on anything other than a highly abstract intellectual level. I don't even get the smallest bit of satisfaction out of it, just depression.

That's not utility maximisation, that's utilitarianism. A separate idea, though confusingly named.
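The distinction can be sketched in code. This is a minimal toy illustration, not anything from the thread: the agent names, actions, and utility numbers below are hypothetical. Utility maximisation picks the action that scores best under one agent's own utility function; utilitarianism picks the action that scores best under the sum of everyone's utilities.

```python
# Hypothetical toy utilities: action -> {agent: utility}.
utilities = {
    "keep":   {"me": 10, "others": 0},
    "donate": {"me": 2,  "others": 20},
}

def utility_maximizing_action(agent):
    """Utility maximisation: best action by ONE agent's own utility."""
    return max(utilities, key=lambda action: utilities[action][agent])

def utilitarian_action():
    """Utilitarianism: best action by TOTAL utility across all agents."""
    return max(utilities, key=lambda action: sum(utilities[action].values()))

print(utility_maximizing_action("me"))  # "keep"
print(utilitarian_action())             # "donate"
```

With these numbers the two criteria disagree, which is the point of the distinction: an agent can be a perfectly coherent utility maximiser without adopting the aggregate-welfare criterion.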

IMHO, utilitarianism is a major screw-up for a human being. It is an unnatural philosophy which lacks family values and seems to be used mostly by human beings for purposes of signalling and manipulation.