drethelin comments on 'Effective Altruism' as utilitarian equivocation. - Less Wrong
I agree. Every non-sentientist value that you add to your pool of intrinsic values needs an exchange rate (which can be non-linear and complex and whatever) that implies you'd be willing to let people suffer in exchange for said value. This seems egoistic rather than altruistic because you'd be valuing your own preference for tradition more than you value the well-being of others for their own sake. If other people value tradition intrinsically, then preference utilitarianism will output that tradition counts to the extent that it satisfies people's preferences for it. This would be the utilitarian way to include "complexity of value".
If other people value tradition instead of helping other people, then the utilitarian thing to do is to get them to value helping other people more and tradition less. And on it goes, until you've tiled the universe with altruistic robots who only care about helping other altruistic robots (who help other altruistic robots (who help other altruistic robots (…))).
Utilitarianism is fundamentally incompatible with value complexity.
If you value something, the correct thing to do is to convince others to value it. Obviously, and whatever your value is. This is not a problem with utilitarianism; it's a problem with values. If you value tradition, it helps your values to convince other people to value tradition until the universe is tiled with traditional robots.
It's a problem with simple values, not with values in general. If you have a complex value system, it might contain detailed, not-concisely-summarizable specifications about exactly when it does and does not help to convince other people to value tradition.