Nick_Tarleton comments on Rationalists lose when others choose - Less Wrong
...but it doesn't (except in the trivial sense that says any action I take to achieve my values is thus "selfish").
And it may be perfectly rational (of high instrumental value) to be significantly altruistic (in your behavior), even if you place no terminal value whatsoever on helping other people, if it's what it takes to live comfortably in society, and you value your own comfort...
Yes, thank you. I think Eliezer, Nick, and the others complaining about this are confusing "acting selfishly" with "acting in a way that society judges as selfish".
You are not helping by being imprecise.
This is a convenient word swap. Simplifying slightly, and playing a little taboo, we get:
"If you have a strictly selfish utility function, and you have a system of thinking that is especially good at satisfying this function, people will never trust you except where your interests coincide."
Well, yes. Duh.
But if people actually liked your utility function, they'd want you to be more, not less, rational. That is, if my lover and I each value the other's utility about as much as our own, we both want each other to be rational, because we'd be maximizing very similar utility functions. If, as your example requires, my coefficient on my lover's utility is zero, they'd want me to be irrational precisely because they want my behaviour to maximize a term that has no weight in my utility function (unless, of course, their utility function also has a zero coefficient on their own utility, which would be unusual).
Rationality, as generally used on this site, refers to a method of understanding the world rather than a specific utility function. Because it has been redefined here, this seems neither insightful nor a serious problem for rationality.
That was pretty close to what "instrumental rationality" means. Utility functions are not /necessarily/ selfish - but the ones biology usually makes are.
Yes, but also: If they're not selfish, then you're not looking at an independent rational agent.
Definitions of "instrumental rationality" make no mention of selfishness. The term seems like a distraction.
Yes. It's a distraction. I greatly regret using it.
Yes, it's trivial. That doesn't make it untrue. "Selfish" = trying to achieve your values, rather than a blend of your values and other people's values.
'Selfish', as it's used in ethics and ordinary speech, names a vice: too much concern for oneself relative to others. If virtue theory is correct, acting selfishly is bad for oneself.