
D_Malik comments on Open thread, Mar. 2 - Mar. 8, 2015 - Less Wrong Discussion

4 Post author: MrMind 02 March 2015 08:19AM




Comment author: D_Malik 02 March 2015 05:07:02PM *  8 points [-]

VNM utility is basically defined as "that function whose expectation we maximize". There exists such a function as long as you obey some very unobjectionable axioms. So instead of saying "I do not want to maximize the expectation of my utility function U", you should say "U is not my utility function".
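
For reference, the theorem D_Malik is invoking can be stated as follows (standard textbook form, not part of the original comment; notation is mine):

```latex
% Von Neumann--Morgenstern representation theorem: if a preference
% relation \preceq over lotteries satisfies completeness, transitivity,
% continuity, and independence, then there exists a function u, unique
% up to positive affine transformation, that represents \preceq by
% expected utility:
\[
  L \preceq M
  \iff
  \sum_i p_i\, u(x_i) \;\le\; \sum_i q_i\, u(x_i),
\]
% where lottery L gives outcome x_i probability p_i and M gives it q_i.
```

This is why "I maximize expected utility" carries no content beyond the axioms: u is constructed from the preferences, not chosen beforehand.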

Comment author: seer 07 March 2015 05:33:35AM 9 points [-]

The problem with this argument is that it boils down to: if we accept intuitive axioms X, we get counter-intuitive result Y. But why is ~Y any less worthy of being an axiom than X?

Comment author: Houshalter 02 March 2015 07:21:19PM 2 points [-]

You miss my point. I am objecting to those axioms. I don't want to change my utility function. If God is real, perhaps he really could offer infinite reward or infinite punishment. You might really think murdering 3^^^^3 people is just that bad.

However, these events have such low probability that I can safely choose to ignore them, and that's a perfectly valid choice. Maximizing expected utility means you will almost certainly do worse in the real world than an agent that doesn't.
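
Houshalter's closing claim (that an expected-utility maximizer will "almost certainly do worse in the real world") can be illustrated with a quick Monte Carlo sketch. The bet parameters below are invented for illustration, not taken from the thread:

```python
import random

random.seed(0)

def play(take_bet, rounds=100):
    """Total payoff for an agent offered, each round, a positive-EV but
    low-probability bet: pay 1, win 1,000,000 with probability 1e-4
    (expected value per round is roughly +99)."""
    total = 0.0
    for _ in range(rounds):
        if take_bet:
            total -= 1
            if random.random() < 1e-4:
                total += 1_000_000
        # an agent that declines every bet keeps a total of exactly 0
    return total

trials = 10_000
# Fraction of trials where the expected-utility maximizer ends up
# strictly worse off than the decliner (i.e. it never wins the bet).
worse = sum(play(True) < 0 for _ in range(trials)) / trials
print(worse)  # almost always near (1 - 1e-4)**100, about 0.99
```

Despite having far higher *expected* wealth, the bet-taker finishes behind the decliner in roughly 99% of runs; the expectation is carried by the rare jackpot. Whether that constitutes "doing worse" is exactly the point under dispute.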

Comment author: SolveIt 02 March 2015 07:30:01PM 4 points [-]

Which axiom do you reject?

Comment author: MrMind 03 March 2015 08:15:46AM 2 points [-]

Continuity, I would say.

Comment author: hairyfigment 30 March 2015 06:38:36AM 0 points [-]

That makes no sense in context, since continuity is equivalent to saying (roughly) 'If you prefer staying on this side of the street to dying, but prefer something on the other side of the street to staying here, there exists some probability of death which is small enough to make you prefer crossing the street.'
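
hairyfigment's gloss matches the standard formal statement of continuity (notation is mine, not from the thread):

```latex
% Continuity (Archimedean) axiom: for any lotteries with
% A \preceq B \preceq C, there exists a probability p \in [0,1] such
% that B is indifferent to the mixture of A and C:
\[
  B \sim p\,A + (1 - p)\,C .
\]
% With A = "die crossing the street", C = "the thing on the other
% side", and B = "stay here", a small enough death weight p makes the
% mixed gamble acceptable: the street-crossing example above.
```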

This sounds almost exactly like what Houshalter is arguing in the great-grandparent ("these events have such low probability that I can safely choose to ignore them,") so it can't be the axiom s/he objects to.

I could see objecting to Completeness, since in fact our preferences may be ill-defined for some choices. I don't know if rejecting this axiom could produce the desired result in Pascal's Mugging, though, and I'd half expect it to cause all sorts of trouble elsewhere.

Comment author: IlyaShpitser 03 March 2015 12:09:51PM 1 point [-]

That sounds right, actually.

Comment author: Houshalter 02 March 2015 08:18:42PM *  -1 points [-]

That for any bet with an arbitrarily small probability p of paying off, there is a payoff u high enough that I would take it.

Comment author: Douglas_Knight 03 March 2015 05:55:51AM 7 points [-]

That's not one of the axioms. In fact, none of the axioms mention u at all.