Eliezer_Yudkowsky comments on Real-Life Anthropic Weirdness - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
When you said that, it seemed to me that you were saying that you shouldn't play the lottery even if the expected payoff - or even the expected utility - were positive, because the payoff would happen so rarely.
Does that mean you have a formulation for rational behavior that maximizes something other than expected utility? Some nonlinear way of summing the utility from all possible worlds?
If someone suggested that everyone in the world should pool their money together and give it all to one person selected at random (pretend for the sake of argument that utility = money), people would think that was crazy. Yet the idea of maximizing expected utility over all possible worlds assumes that an uneven distribution of utility among your possible future selves is just as good as an equitable distribution among them. So there's something wrong with maximizing expected utility.
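To make the pooling thought experiment concrete, here is a minimal sketch, assuming a hypothetical population of n participants who each hold 1 unit, with utility = money as stipulated above. It just checks that a single participant's expected utility is identical whether everyone keeps their unit or the whole pool goes to one person chosen uniformly at random:

```python
n = 1_000_000  # hypothetical number of participants, each holding 1 unit

# Scheme A: everyone simply keeps their 1 unit.
eu_keep = 1.0

# Scheme B: pool all n units and give them to one person at random.
# For any single participant: receive n units with probability 1/n,
# and 0 units otherwise.
eu_pool = (1 / n) * n + ((n - 1) / n) * 0

# With utility = money, expected utility cannot distinguish the schemes.
assert eu_keep == eu_pool
```

Under linear utility the two schemes come out exactly equal, which is the indifference the comment objects to: expected-utility maximization assigns no disvalue to concentrating all the utility in one possible outcome.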