Eliezer_Yudkowsky comments on Uncritical Supercriticality - Less Wrong

47 Post author: Eliezer_Yudkowsky 04 December 2007 04:40PM


Comment author: Eliezer_Yudkowsky 05 December 2007 05:24:59AM 4 points [-]

Isn't the probability of ending up in a real world situation where the entire world is in terrible danger and only you can save it vastly smaller than that of falsely perceiving such a situation? Despite that, I'm glad Petrov made his decision.

Fair enough. s/probability of/expected utilities associated with/

But you can still end up with a "flat" rule for the human art of rationality when the expected negative utility of biased decisions that "the end justifies the means, in just this one case here" exceeds the expected positive utility from cases where the universe really does end up a better place from shooting someone who makes an argument you don't like, after taking all side effects into account, including the encouragement of similar behavior by others.
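The comparison above can be sketched as a toy expected-utility calculation. The numbers below are illustrative assumptions, not anything from the post: they simply encode the claim that false perceptions of justified exceptions are far more common than real ones.

```python
# Toy expected-utility comparison: a "flat" rule forbidding
# ends-justify-means violence vs. case-by-case judgment by a mind
# biased toward seeing exceptions. All numbers are made up.

p_real_emergency = 0.001    # worlds where the violation genuinely helps
p_false_alarm    = 0.05     # worlds where a biased mind merely thinks so

gain_if_justified = 100.0   # utility when the rare exception pays off
loss_if_misfired  = -500.0  # harm done, plus imitators it encourages

# Expected utility of allowing case-by-case exceptions:
eu_exceptions = (p_real_emergency * gain_if_justified
                 + p_false_alarm * loss_if_misfired)

# Expected utility of the flat rule (forgo both the rare gain
# and the far more frequent losses):
eu_flat_rule = 0.0

print(eu_exceptions)  # -24.9
print(eu_flat_rule)   # 0.0
```

Under these assumptions the flat rule dominates: the rare genuine emergency contributes +0.1 in expectation, while the much more frequent false alarms contribute -25.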

Remember, human targets shoot back. Since bullets are not even probabilistically more likely to hit when fired at a human target who has just made false statements as opposed to true statements, it's very difficult to see how a social decision process can be made more rational by introducing bullets into it.