Will_Sawin comments on Confidence levels inside and outside an argument - Less Wrong

Post author: Yvain 16 December 2010 03:06AM




Comment author: Perplexed 17 December 2010 04:49:20PM 1 point

Are you 100% sure about that?

Comment author: Will_Sawin 18 December 2010 05:28:07AM 0 points

I don't know how to compute beliefs conditional on its being false.

Comment author: Perplexed 18 December 2010 05:42:50AM 0 points

My point is that there are some propositions - for instance the epistemic perfection of Bayesianism - to which you attach a probability of exactly 1.0. Yet you want to remain free to reject some of those "100% sure" beliefs at some future time, should evidence or argument convince you to do so. So, I am advising you to have one Bayesian in your head who believes the 'obvious', and at least one who doubts it. And then if the obvious ever becomes falsified, you will still have one Bayesian you can trust.
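Perplexed's point about probability-1.0 beliefs has a simple arithmetic basis: under Bayes' rule, a prior of exactly 1.0 can never be moved by any evidence, however strongly that evidence favors the alternative. A quick illustrative sketch (not from the thread; the likelihood values are made up for demonstration):

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H | E) via Bayes' rule.

    prior            -- P(H)
    likelihood_h     -- P(E | H)
    likelihood_not_h -- P(E | not-H)
    """
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# Evidence 100x more likely under not-H than under H...
# ...cannot budge a prior of exactly 1.0:
print(bayes_update(1.0, 0.001, 0.1))   # 1.0

# ...but drags a prior of 0.99 down to roughly 0.5:
print(bayes_update(0.99, 0.001, 0.1))
```

The term `likelihood_not_h * (1 - prior)` vanishes when the prior is 1.0, which is exactly why a "100% sure" belief leaves no room for the future rejection Perplexed describes.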

Comment author: Will_Sawin 18 December 2010 05:59:57AM 0 points

I don't think the other guy counts as a Bayesian.

That's definitely a good approximation of the organizational structure of an imperfect Bayesian's mind. You have a human consciousness simulating a Bayesian probability-computer, but the human contains heuristics powerful enough to overrule the Bayesian in some situations.

This has nothing to do with arguments, though.