Emile comments on Average utilitarianism must be correct? - Less Wrong

Post author: PhilGoetz 06 April 2009 05:10PM




Comment author: Emile 06 April 2009 08:59:03PM 4 points

"If your utility function were defined over all possible worlds, you would just say 'maximize utility' instead of 'maximize expected utility'."

I disagree: that's only the case if you have perfect knowledge.

Case A: I'm wondering whether to flip the switch of my machine. The machine causes a chrono-synclastic infundibulum, which is a physical phenomenon that has a 50% chance of causing a lot of awesomeness (+100 utility), and a 50% chance of blowing up my town (-50 utility).

Case B: I'm wondering whether to flip the switch of my machine, a friendly AI I just programmed. I don't know whether I programmed it right: if I did, it will bring forth an awesome future (+100 utility); if I didn't, it will try to enslave mankind (-50 utility). I estimate that my program has a 50% chance of being right.

The two cases are different, and if you have a utility function that's defined over all possible future worlds (one that just takes the probability-weighted average), you could say that flipping the switch in the first case has utility of +25, and in the second case, expected utility of +25 (actually, utility of +100 or -50, but you don't know which).
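A quick sketch of the arithmetic (not part of the original comment; the probabilities and utility values are the ones given above):

```python
def expected_utility(outcomes):
    """Probability-weighted average utility over a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Case A: genuine physical randomness (the chrono-synclastic infundibulum).
case_a = expected_utility([(0.5, 100), (0.5, -50)])

# Case B: uncertainty about which world we are already in (was the AI programmed right?).
# The numbers are identical, so the expected utilities coincide,
# even though in case B the actual utility is already fixed at +100 or -50.
case_b = expected_utility([(0.5, 100), (0.5, -50)])

print(case_a, case_b)  # 25.0 25.0
```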