
Wei_Dai comments on A Thought on Pascal's Mugging - Less Wrong Discussion

Post author: komponisto 10 December 2010 06:08AM


Comment author: Wei_Dai 01 April 2011 12:30:54AM 3 points

The idea is that the Kolmogorov complexity of "3^^^^3 units of disutility" should be much higher than the Kolmogorov complexity of the number 3^^^^3.

Is your utility function such that there is some scenario to which you assign -3^^^^3 utils? If so, then the Kolmogorov complexity of "3^^^^3 units of disutility" can't be greater than K(your brain) + K(3^^^^3), since I can write a program to output such a scenario by iterating through all possible scenarios until I find one to which your brain assigns -3^^^^3 utils.
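The program Wei_Dai describes can be sketched concretely. This is an illustrative toy, not anything from the thread: `find_scenario` and the stand-in `toy_brain` utility function are assumptions made for the example, and a real brain's utility function is of course not available as a Python callable. The point is only that the searcher's description length is roughly K(the utility function) + K(the target value).

```python
from itertools import count

def find_scenario(utility, target):
    """Brute-force search: enumerate all scenarios (encoded here as
    non-negative integers) until one is found that the given utility
    function maps to `target`, then output it. The length of this
    program is about K(utility) + K(target), which is the bound in
    the argument above."""
    for scenario in count():
        if utility(scenario) == target:
            return scenario

# Toy stand-in for a brain's utility function; the target -8 stands
# in for -3^^^^3 (which no physical search could actually reach).
toy_brain = lambda s: -s
print(find_scenario(toy_brain, -8))  # -> 8
```

With a computable utility function the search is guaranteed to halt whenever the target value is actually attained by some scenario, which is exactly the premise of the question.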

A prior of 2^-(K(your brain) + K(3^^^^3)) is not nearly small enough, compared to the utility -3^^^^3, to make this problem go away.
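To see why no such prior can offset the utility, it helps to see how fast up-arrow towers grow. The sketch below (an illustration, not part of the original exchange) implements Knuth's up-arrow notation; 3^^^^3 in the comment is `knuth_up(3, 4, 3)`, and only the smallest cases are feasible to evaluate, which is the point: log2(3^^^^3) dwarfs K(your brain) + K(3^^^^3) for any remotely plausible brain.

```python
def knuth_up(a, n, b):
    """Knuth's up-arrow a ^(n) b: n = 1 is ordinary exponentiation,
    and each higher n iterates the level below b times."""
    if n == 1:
        return a ** b
    result = 1
    for _ in range(b):
        result = knuth_up(a, n - 1, result)
    return result

print(knuth_up(3, 1, 3))  # 3^3 = 27
print(knuth_up(3, 2, 3))  # 3^(3^3) = 3^27 = 7625597484987
# knuth_up(3, 3, 3) is already a power tower of about 7.6 trillion 3s;
# knuth_up(3, 4, 3) = 3^^^^3 is unimaginably larger still.
```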

Comment author: komponisto 03 April 2011 05:43:06PM 0 points

Come to think of it, the problem with this argument is that it assumes that my brain can compute the utility it assigns. But if it's assigning utility according to Kolmogorov complexity (effectively the proposal in the post), that's impossible, since Kolmogorov complexity is itself uncomputable.

The same issue arises with having probability depend on complexity.

Comment author: Wei_Dai 04 April 2011 03:59:22PM 0 points

Ok, I think in that case my argument doesn't work. Let me try another approach.

Suppose some stranger appears to you and says that you're living in a simulated world. Out in the real world there is another simulation that contains 3^^^^3 identical copies of a utopian Earth-like planet plus another 3^^^^3 identical copies of a less utopian (but still pretty good) planet.

Now, if you press this button, you'll turn X of the utopian planets into copies of the less utopian planet, where X is a random number with 10^100 digits. (Note that K(X) is of order 10^100, which is much larger than K(3^^^^3), so pressing the button would increase the Kolmogorov complexity of that simulated world by about 10^100.)
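The claim that K(X) is of order 10^100 rests on a standard fact: a uniformly random digit string is, with overwhelming probability, incompressible, whereas a number like 3^^^^3 has a description far shorter than its digit expansion. A small sketch of that contrast (illustrative only, using a general-purpose compressor as a crude upper bound on description length, with far shorter strings than 10^100 digits):

```python
import os
import zlib

N = 10_000  # stand-in length; the thought experiment uses 10^100 digits

# A random digit string: no compressor can reliably shrink it much,
# since each digit carries about 3.32 bits of entropy.
random_digits = ''.join(str(b % 10) for b in os.urandom(N))

# A highly regular string, standing in for a number (like 3^^^^3's
# expansion) that has a tiny description despite its enormous length.
regular_digits = '3' * N

ratio_random = len(zlib.compress(random_digits.encode(), 9)) / N
ratio_regular = len(zlib.compress(regular_digits.encode(), 9)) / N
print(ratio_random, ratio_regular)  # roughly 0.4+ vs under 0.01
```

zlib is only an upper bound on Kolmogorov complexity, but the asymmetry it exhibits is the one the argument needs: a random X of 10^100 digits needs about 10^100 digits to describe, while 3^^^^3 needs only a handful of symbols.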

What does your proposed utility function say you should do (how much would you pay to either press the button or prevent it being pressed), and why?

Comment author: komponisto 04 April 2011 11:29:15PM 1 point

Utility is monotonic, even though complexity isn't. (Thus X downgrades out of the 3^^^^3 wouldn't be as bad as, say, 3^^^3 downgrades.) However, utility is bounded by complexity: the complexity of a scenario with utility N must be at least N. (Asymptotically, of course.)

Comment author: komponisto 01 April 2011 04:54:50AM 0 points

Is your utility function such that there is some scenario for which you assign -3^^^^3 utils?

Probably not, if "you" is interpreted strictly to refer to my current human brain, as opposed to including more complex "enhancements" of the latter.