timtyler comments on Model Uncertainty, Pascalian Reasoning and Utilitarianism - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (154)
I was pretty happy before LW, until I learnt about utility maximization. It tells me that I ought to do what I don't want to do on any other than some highly abstract intellectual level. I don't even get the smallest bit of satisfaction out of it, just depression.
Saving galactic civilizations from superhuman monsters burning the cosmic commons, walking into death camps so as to reduce the likelihood of being blackmailed, discounting people by the length of their address in the multiverse... taking all that seriously while keeping one's sanity is difficult for some people.
What LW means by 'rationality' is to win in a hard-to-grasp sense that is often completely detached from the happiness and desires of the individual.
That's not utility maximisation, that's utilitarianism - a separate idea, though confusingly named.
IMHO, utilitarianism is a major screw-up for a human being. It is an unnatural philosophy which lacks family values and seems to be used by human beings mostly for purposes of signalling and manipulation.