latanius comments on Morality is not about willpower - Less Wrong

Post author: PhilGoetz 08 October 2011 01:33AM




Comment author: latanius 08 October 2011 12:33:05PM 3 points

Having two utility functions is like having no utility function at all, because you don't have an ordering of preferences.
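To illustrate that point (with entirely made-up options and numbers): two utility functions can rank the same alternatives in opposite orders, so together they give you no single preference ordering to act on.

```python
# Toy illustration with invented utilities: two "utility functions"
# that disagree, so no overall ranking of the options exists.
options = ["stay_home", "go_running"]

u_comfort = {"stay_home": 10, "go_running": 2}  # first utility function
u_health = {"stay_home": 1, "go_running": 8}    # second utility function

# Rank the options by each utility separately (highest first).
rank_comfort = sorted(options, key=lambda o: -u_comfort[o])
rank_health = sorted(options, key=lambda o: -u_health[o])

print(rank_comfort)  # ['stay_home', 'go_running']
print(rank_health)   # ['go_running', 'stay_home']
# The two rankings conflict; neither alone says what the agent prefers.
```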

The only kind of model that needs a global utility function is an optimization process. After considering the alternatives, there has to be some way to decide which one to choose... assuming, that is, that we actually do things like considering alternatives and choosing among them (using an ordering, which is exactly what the single utility function represents).
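A minimal sketch of that idea, with invented options and arbitrary weights: an optimization process scores each alternative with the one global utility function and takes the maximum. Several criteria can feed in, but they must collapse to a single number per option for a choice to come out.

```python
# Minimal sketch of an optimization process: enumerate alternatives,
# score each with ONE global utility function, choose the best.
def choose(alternatives, utility):
    return max(alternatives, key=utility)

# Hypothetical utility: two criteria collapsed into a single score
# per option (the options and weights here are arbitrary assumptions).
def utility(option):
    comfort = {"stay_home": 10, "go_running": 2}[option]
    health = {"stay_home": 1, "go_running": 8}[option]
    return 0.3 * comfort + 0.7 * health  # one number -> total ordering

print(choose(["stay_home", "go_running"], utility))  # go_running
```

The weighting is doing the real work here: once the criteria are merged into one score, a total ordering of the alternatives (and hence a decision) falls out automatically.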

For example, evolution has a global utility function (inclusive genetic fitness). Of course, it may be described in parts (endurance, running speed, attractiveness to mates, etc.), but in the end it all gets summed up (expressed in whether the genes get multiplied or not).
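A hedged toy version of that summing-up, with invented genotypes and component scores: however many parts the description has, selection ends up comparing one number per genotype.

```python
# Toy model (all names and numbers invented): fitness described in
# parts, but compared as a single summed score per genotype.
genotypes = {
    "A": {"endurance": 5, "speed": 3, "attractiveness": 4},
    "B": {"endurance": 2, "speed": 6, "attractiveness": 3},
}

# Sum the components into one fitness value per genotype.
fitness = {g: sum(parts.values()) for g, parts in genotypes.items()}

fittest = max(fitness, key=fitness.get)
print(fitness, fittest)  # {'A': 12, 'B': 11} A
```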

That said, there are things (such as Kaj's centrifugal governor, or human beings) that aren't best modelled as optimization processes. Reflexes, for example, don't optimize; they just work (and a significant proportion of our brains operates the same way). The fact that our conscious thinking is a (slightly buggy) implementation of an optimization process (with a more or less consistent utility function) might suggest that whole humans can also be modelled well that way...
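The governor contrast can be sketched too (parameters invented): a simple proportional controller reacts to its current error and settles near a set point without ever enumerating alternatives or scoring them with a utility function.

```python
# Sketch of a governor-style feedback loop (gain and target are
# assumed values): it corrects toward a set point each step, with
# no search over options and no utility function anywhere.
def governor_step(speed, target=100.0, gain=0.1):
    # proportional correction: react to the error, nothing more
    return speed + gain * (target - speed)

speed = 60.0
for _ in range(50):
    speed = governor_step(speed)
print(round(speed, 2))  # close to the target of 100.0
```

Nothing in the loop represents "preferring" the target; the set-point-seeking behavior is just what the update rule does, which is the sense in which such systems "just work" rather than optimize.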