RaelwayScot comments on Torture vs. Dust Specks - Less Wrong

39 Post author: Eliezer_Yudkowsky 30 October 2007 02:50AM

Comment author: RaelwayScot 06 January 2016 12:49:18PM 0 points [-]

I think the problem here is how the utility function is chosen. Utilitarianism is essentially a formalization of the reward signals in our heads: a heuristic way of quantifying what we expect a healthy human (one who can grow up and survive in a typical human environment and has an accurate model of reality) to want. All of this converges only roughly to a common utility, because we have evolved to have the same needs, which are necessarily pro-life and pro-social (otherwise our species wouldn't be present today).

Utilitarianism crudely abstracts from the meanings in our heads that we recognize as common goals and assigns numbers to them. We have to be careful about what we assign numbers to in order to get the results we want in all corner cases. Hooking the utility meter up to neurons that detect minor inconveniences is not, I think, a smart way of achieving what we collectively want, because it might contradict our pro-life and pro-social needs. Only when the inconveniences accumulate within an individual, condensing into states of fear or anxiety or noticeably shortening a human life, do they affect human goals and become worth including in utility considerations (which, again, are only a crude approximation of what we have evolved to want).