Eliezer_Yudkowsky comments on Two Truths and a Lie - Less Wrong
A rationalist ends up being wrong sometimes, and can only hope for well-calibrated probabilities. I think that, in the absence of observation, this is the sort of prediction that most human-level intelligences would end up getting wrong, and I wouldn't necessarily assume they were making any errors of rationality in doing so, but rather that they were hitting one of the 1-in-20 occasions on which a 5% probability comes true.
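The "1 out of 20" point can be sketched with a quick simulation (my illustration, not from the comment): a perfectly calibrated forecaster who assigns 5% to an event will still see that event occur about once in every twenty trials, with no failure of rationality involved.

```python
import random

random.seed(0)

trials = 100_000
p = 0.05  # the forecaster's stated probability

# Count how often the 5%-probability event actually occurs.
hits = sum(random.random() < p for _ in range(trials))
frequency = hits / trials

# A well-calibrated 5% estimate is "wrong" (the event happens) ~5% of the time.
print(f"observed frequency: {frequency:.3f}")
```

Under these assumptions, the observed frequency converges to roughly 0.05: being wrong on those occasions is exactly what good calibration predicts.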