Unknown comments on Anthropomorphic Optimism - Less Wrong

Post author: Eliezer_Yudkowsky 04 August 2008 08:17PM


Comment author: Unknown 05 August 2008 12:00:34PM 1 point

The "mistake" Michael is talking about it the belief that utility maximization can lead to counter intuitive actions, in particular actions that humanly speaking are bound to be useless, such as accepting a Wager or a Mugging.

This is in fact not a mistake at all, but a simple fact (as Carl Shulman and Nick Tarleton suspect). The belief that it is a mistake is simply a result of Anthropomorphic Optimism as Eliezer describes it; i.e., "This particular optimization process, especially because it satisfies certain criteria of rationality, must come to the same conclusions I do." Have you ever considered the possibility that your conclusions do not satisfy those criteria of rationality?