anonym comments on Making your explicit reasoning trustworthy - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I've been thinking about this lately. Specifically, I've been considering the following question:
If you were somehow obliged to pick which of your current beliefs you'd disagree with in eight years time, with real and serious consequences for picking correctly or incorrectly, what criteria would you use to pick them?
I'm pretty sure that difficulty in answering this question is a good sign.
It seems to me that the problem splits into two parts: changes in belief that you have no way of predicting (because they're based on information or thinking you don't have yet), and changes in belief that are happening slowly because you don't like their implications.
As Nancy said for the second class of problems, but a little more generally: I'd preferentially pick the beliefs that I have rational reasons to suspect at the moment, that seem to persist for reasons that aren't obvious to me (or aren't rational), and that feel like they're surviving because they exploit my cognitive biases and other undesirable habits like akrasia.