Nick_Tarleton comments on Pascal's Mugging: Tiny Probabilities of Vast Utilities - Less Wrong

39 Post author: Eliezer_Yudkowsky 19 October 2007 11:37PM


Comment author: Nick_Tarleton 20 October 2007 01:34:55AM 11 points

Tom and Andrew, it seems very implausible that someone saying "I will kill 3^^^^3 people unless X" is literally zero Bayesian evidence that they will kill 3^^^^3 people unless X. Though I guess it could plausibly be weak enough to take much of the force out of the problem.

Andrew, if we're in a simulation, the world containing the simulation could be able to support 3^^^^3 people. If you knew (magically) that it couldn't, you could substitute something on the order of 10^50, which is vastly less forceful but may still lead to the same problem.

Andrew and Steve, you could replace "kill 3^^^^3 people" with "create 3^^^^3 units of disutility according to your utility function". (I respectfully suggest that we all start using this form of the problem.)

Michael Vassar has suggested that we should consider any number of identical lives to have the same utility as one life. That could be a solution, as it's impossible to create 3^^^^3 distinct humans. But this is also irrelevant to the create-3^^^^3-disutility-units form.

IIRC, Peter de Blanc told me that any consistent utility function must have an upper bound (meaning that we must discount lives like Steve suggests). The problem disappears if your upper bound is low enough. Hopefully any realistic utility function has such a low upper bound, but it'd still be a good idea to solve the general problem.
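A toy sketch of why an upper bound on utility defuses the mugging. The saturating form and the constant U_MAX below are my own illustrative choices, not anything de Blanc specified; the point is only that with any bound, a threat of arbitrary size can contribute at most p * U_MAX in expected disutility:

```python
import math

# Hypothetical bounded utility: lives are mapped through a saturating
# function with upper bound U_MAX (an illustrative choice, not a claim
# about any particular agent's actual utility function).
U_MAX = 1e6

def bounded_utility(lives):
    # Increases with lives but approaches U_MAX as lives -> infinity.
    return U_MAX * (1 - math.exp(-lives / U_MAX))

# However large the threat, its expected-utility contribution is capped
# at p * U_MAX, so a small probability p caps the stakes.
p = 1e-9
threat = 10**50  # stand-in for 3^^^^3, which overflows any float
expected_loss = p * bounded_utility(threat)
assert expected_loss <= p * U_MAX
```

With an unbounded utility function, by contrast, the mugger can always name a number large enough to swamp any fixed probability.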

I see a similarity to the police chief example. Adopting a policy of paying attention to any Pascalian muggings would encourage others to manipulate you using them. At first it doesn't seem like this would have nearly enough disutility to justify ignoring muggings, but it might when you consider that it would interfere with responding to any real threat (unlikely as it is) of 3^^^^3 deaths.

Comment author: SforSingularity 04 September 2009 02:58:28PM 11 points

create 3^^^^3 units of disutility according to your utility function

For all X:

If your utility function assigns values to outcomes that differ by a factor of X, then you are vulnerable to becoming a fanatic who banks on scenarios that only occur with probability 1/X. As simple as that.

If you think that banking on scenarios that only occur with probability 1/X is silly, then you have implicitly revealed that your utility function only assigns values in the range [1,Y], where Y<X, and where 1 is the lowest utility you assign.
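The break-even point in this argument can be made concrete with numbers of my own choosing (a power of two so the float arithmetic is exact): if the utility range spans a factor of X, a scenario with probability exactly 1/X ties with a sure thing, and anything wider tips the maximizer into fanaticism:

```python
# Illustrative numbers (mine, not from the comment): the utility range
# spans a factor of X, and the large-stakes scenario has probability 1/X.
X = 2**20                      # stand-in factor; exact in binary floats
u_small, u_large = 1.0, float(X)
p_large = 1.0 / X

ev_sure = 1.0 * u_small        # the certain small outcome
ev_gamble = p_large * u_large  # 1/X chance of an X-times-larger outcome

# The expected utilities tie exactly; any utility ratio greater than X
# would make the improbable gamble strictly preferred.
assert ev_sure == ev_gamble == 1.0
```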

Comment author: Nick_Tarleton 04 September 2009 03:54:17PM 3 points

If you think that banking on scenarios that only occur with probability 1/X is silly, then you have implicitly revealed that your utility function only assigns values in the range [1,Y], where Y<X, and where 1 is the lowest utility you assign.

... or your judgments of silliness are out of line with your utility function.

Comment author: SforSingularity 04 September 2009 03:56:42PM 2 points

When I said "silly" I meant from an axiological point of view, i.e. after thinking the scenario over, you still conclude that you would be doing something that made you win less.

Of course, in any such case there are likely to be conflicting intuitions: one to behave as an aggregative consequentialist, and another to behave like a sane human being.