Eliezer_Yudkowsky comments on Bayesians vs. Barbarians - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Yeah, that's a more complex issue - coordination among agents with different risk-bearing efficiencies. If you have an agent known to be fair, or rules of reasoning rigorous enough that fairness can be verified, then it's possible for everyone to know that they're taking "equal risk" in the sense of being at-risk for being recruited as a teenager. (But is that the same sort of equal risk as being recruited because you have genes for combat effectiveness?)
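One way to make "verifiable fairness" concrete (a hypothetical sketch, not anything from the original discussion) is a commit-reveal lottery: each participant commits to a random value up front, then all values are revealed and combined, so no single party can bias the draw and everyone can check they bore equal risk. The names `commit` and `fair_draft` below are illustrative inventions:

```python
import hashlib
import secrets

def commit(value: bytes) -> str:
    """Publish a hash commitment to a secret random value."""
    return hashlib.sha256(value).hexdigest()

def fair_draft(participants, revealed):
    """Pick a draftee by XOR-combining everyone's revealed randomness.

    Because each party committed before anyone revealed, no one can
    choose their value to steer the result: each participant faces
    the same 1/n chance, and anyone can re-run this computation.
    """
    combined = 0
    for value in revealed:
        combined ^= int.from_bytes(value, "big")
    return participants[combined % len(participants)]

# Usage: three parties each generate randomness, publish commitments,
# then reveal; the draw is checkable by all of them.
people = ["alice", "bob", "carol"]
vals = [secrets.token_bytes(16) for _ in people]
commitments = [commit(v) for v in vals]  # published first
draftee = fair_draft(people, vals)       # computed after reveal
assert all(commit(v) == c for v, c in zip(vals, commitments))
```

The point of the commitments is exactly the "agent known to be fair" condition: fairness is enforced by the protocol rather than by trusting any individual.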
A society of rationalists would work it out, but it might be more complicated. And as Lawliet observes, you shouldn't assume you've got nukes and the Soviets don't.