Eliezer_Yudkowsky comments on Two-Tier Rationalism - Less Wrong

40 Post author: Alicorn 17 April 2009 07:44PM


Comment author: Eliezer_Yudkowsky 17 April 2009 10:13:40PM 12 points

"And here is the reason I linked to 'Bayesians vs. Barbarians', above: what Eliezer is proposing as the best course of action for a rationalist society that is attacked from without sounds like a second-tier rationalism."

Not exactly. Since I intend to work with self-modifying AIs, any decision theory I care to spend much time thinking about should be reflectively consistent and immediately so. This excludes e.g. both causal decision theory and evidential decision theory as usually formulated.

The idea of sacrificing your life after being selected in a draft lottery that maximized your expectation of survival if all other agents behaved the same way you did, is not meant to be second-tier.
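The ex-ante expectation argument can be illustrated with a toy model. All the numbers below are hypothetical, chosen only to make the structure visible; nothing in the comment specifies them:

```python
# Toy model (hypothetical numbers): N agents face an invasion. If enough
# draftees fight and die, the society survives; if the draft collapses,
# every agent faces the conquerors instead.

N = 100                      # population size (assumed)
DRAFT_SIZE = 10              # soldiers needed to repel the attack (assumed)
P_SURVIVE_CONQUERED = 0.2    # survival odds if the society falls (assumed)

def expected_survival(everyone_honors_draft: bool) -> float:
    """Ex-ante survival probability for one agent, evaluated before the lottery."""
    if everyone_honors_draft:
        # You die iff your number comes up; otherwise the defense succeeds.
        return 1 - DRAFT_SIZE / N
    # Nobody fights, so the society is conquered.
    return P_SURVIVE_CONQUERED

# Before the lottery, each agent prefers the policy in which all agents
# (including their own future selves) honor the draft: 0.9 > 0.2.
assert expected_survival(True) > expected_survival(False)
```

Ex post, the drafted agent would of course prefer to defect; the point of a reflectively consistent decision theory is that the policy chosen ex ante is the one still executed when the number comes up.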

But if humans cannot live up to such stern rationality in the face of Newcomblike decision problems, then after taking their own weakness into account, they may have cause to resort to enforcement mechanisms. This is second-tier-ish in a way, but still pretty strongly interpretable as maximizing, to the extent that you vote on the decision before the lottery.