Psychohistorian comments on Rationalists lose when others choose - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (55)
This is a convenient word swap. Simplifying slightly, and playing a little taboo, we get:
"If you have a strictly selfish utility function, and you have a system of thinking that is especially good at satisfying this function, people will never trust you where your interests may conflict."
Well, yes. Duh.
But if people actually liked your utility function, they'd want you to be more, not less, rational. That is, if both my lover and I value each other's utility about as much as our own, we both want each other to be rational, because we'd be maximizing a very similar utility function. If, as your example requires, my coefficient for my lover's utility is zero, they'd want me to be irrational precisely because they want my behaviour to maximize a term that has no weight in my utility function (unless of course their utility function also has a zero coefficient for their own utility, which would be unusual).
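To make the coefficient point concrete, here is a minimal sketch (all action names and payoffs are hypothetical): an agent that rationally maximizes U = own_payoff + c × partner_payoff benefits its partner when c is high, and ignores them when c = 0.

```python
# Toy model: a rational agent picks the action maximizing
# U = own_payoff + c * partner_payoff, where c is the weight
# the agent places on its partner's utility.
def best_action(actions, c):
    # actions: list of (name, own_payoff, partner_payoff)
    return max(actions, key=lambda a: a[1] + c * a[2])[0]

actions = [
    ("selfish", 10, 0),  # great for me, nothing for my partner
    ("shared",   8, 8),  # good for both of us
]

# With c = 1, being a better maximizer serves us both:
print(best_action(actions, c=1.0))  # -> "shared"  (8 + 8 > 10 + 0)
# With c = 0, the same rationality serves only me:
print(best_action(actions, c=0.0))  # -> "selfish" (10 > 8)
```

So whether your partner wants you more rational depends entirely on c, not on the rationality itself, which is the comment's point.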
Rationality, as generally used on this site, refers to a method of understanding the world rather than to a specific utility function. Because the post redefines the term, this seems neither insightful nor a serious problem for rationality.