
Comment author: Navanen 18 December 2012 06:15:19AM 1 point

The following was a comment from the discussion of the original post, and I thought it was great. Does anyone know how I might go about designing and practicing such calibration tests well?

“I'd suggest that there is a relatively straightforward and unproblematic place to apply humility: to overcome the universal overconfidence bias. Many studies have found that when asked to give estimates with a confidence interval, error rates are far higher than would be expected if the confidence interval were accurate. Many of these find errors an order of magnitude or more than subjects expected. You could take self-tests and find out what your overconfidence level is, then develop a calibration scale to correct your estimates. You could then use this to modify your confidence levels on future guesses and approach an unbiased estimate. One risk is that knowing that you are going to modify your intuitive or even logically-deduced confidence level may interfere with your initial guess. This might go in either direction, depending on your personality. It could be that knowing you are going to increase your error estimate will motivate you to subconsciously decrease your initial error estimate, so as to neutralize the anticipated adjustment. Or in the other direction, it could be that knowing that you always guess too low an error will cause you to raise your error guesses, so that your correction factor is too high. However both of these could be dealt with in time by re-taking tests while applying your error calibration, adjusting it as needed.”
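A minimal sketch of how such a self-test might look in practice, assuming a small hand-picked set of interval questions (the specific questions, answers, and the 90% target below are my own illustrative choices, not from the quoted comment):

    # Rough self-calibration quiz: for each question, give a range you are
    # 90% confident contains the true value, then compare your hit rate to 90%.
    # The questions and true values are illustrative placeholders.

    QUESTIONS = [
        ("Length of the Nile, in km?", 6650),
        ("Year Gutenberg's printing press was introduced?", 1440),
        ("Boiling point of ethanol, in degrees C?", 78),
    ]

    TARGET_CONFIDENCE = 0.90

    def run_quiz():
        hits = 0
        for prompt, true_value in QUESTIONS:
            low = float(input(prompt + "\n  lower bound: "))
            high = float(input("  upper bound: "))
            if low <= true_value <= high:
                hits += 1
        hit_rate = hits / len(QUESTIONS)
        print("Stated confidence: {:.0%}, actual hit rate: {:.0%}".format(
            TARGET_CONFIDENCE, hit_rate))
        if hit_rate < TARGET_CONFIDENCE:
            print("Intervals were too narrow (overconfidence); widen them next time.")
        else:
            print("Hit rate met the target; re-test later to check for overcorrection.")

    if __name__ == "__main__":
        run_quiz()

With more questions, the gap between stated confidence and observed hit rate gives a rough correction factor of the kind the comment describes; re-taking the quiz after applying it checks whether the correction itself introduced a new bias.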

Comment author: TimFreeman 11 August 2011 01:04:49PM 2 points

If "humility" can be used to justify both activities and their opposites so easily, perhaps it's a useless concept and should be tabooed.

Comment author: Navanen 18 December 2012 05:50:09AM 0 points

Where people have vague mental models that can be used to argue anything, they usually end up believing whatever they started out wanting to believe.

"Humility" is a virtue that is often misunderstood. This doesn't mean we should discard the concept of humility, but we should be careful using it.

It seems to me that when confronting rationalists, people who are motivated to keep holding a belief will try to manipulate rationalists into withdrawing skepticism or else risk social disapproval. For example, when creationists ask something like "how can you be sure you're absolutely right about evolution?", I believe the actual intention is not to induce humility in the evolutionist, but to appeal to and warn the evolutionist not to risk the creationist's disapproval.

So, it's crucial to identify the difference between when someone else wants you to be humble, and when someone wants you to be socially modest so you don't frustrate them by challenging their beliefs.

There's better discussion than I can produce on when humility is and isn't useful in the comments of the SEQ RERUN of this post.

NOTE: edited for simplicity and grammar.

Comment author: AspiringRationalist 21 June 2012 06:46:34AM 0 points

If a particular impediment to accurate belief is a bias, what actionable consequences does that have relative to it not being a bias? EY seems to hint at biases being more difficult to correct than other impediments, and intuitively that makes sense, but is the actual approach any different?

Comment author: Navanen 18 December 2012 05:11:27AM 0 points

I second this question.

Comment author: thomblake 17 April 2012 08:21:25PM 1 point

This definition of bias seems problematic. If a putative bias is caused by absorbed cultural mores, then supposedly it is not a bias. But that causal chain can be tricky to track down; we go on thinking something is a 'bias' until we find the black swan culture where the bias doesn't exist, and then realize that the problem was not inherent in our mental machinery. But is that distinction even worth making, if we don't know what caused the bias?

Comment author: Navanen 18 December 2012 05:06:48AM 0 points

I suspect the distinction is worth making because even if we don't know what caused the bias, we can use the label of a bias "not inherent in our mental machinery" as a marker for future study of its cause.

For example, I read in a contemporary undergraduate social psychology textbook that experimental results found that a common bias affected subjects from Western cultures more strongly than it affected subjects from more interdependent cultures such as China and Japan.

[Obviously, my example is useless. I just don't have access to that book at the current moment. I will update this comment with more detail when I'm able.]

Comment author: Navanen 18 December 2012 04:16:13AM 1 point

Is there any decent literature on the extent to which the fact of knowing that my shoelaces are untied is a real property of the universe? Clearly it has measurable consequences - it will result in a predictable action taking place with high probability. Saying "I predict that someone will tie his shoelaces when he sees they're undone" is based not on the shoelaces being untied, nor on the photons bouncing, but on the abstract concept of his knowing. Is there a mathematical basis for stating that the universe has measurably changed in a nonrandom way once those photons' effects are analysed? I'd love to read more on this.

This was a comment from the original post, and I agree with this sentiment. There was no response, so I'm wondering if anyone can answer this question now.

Note: I don't know who the author is because I have the anti-kibitzer on in Chrome and for some reason I cannot turn it off right now.

Comment author: chatquitevoit 18 July 2011 04:08:46PM 0 points

I wonder how much of a correlation there is between people who put effort into self-training in rationality (or communal training, a la Less Wrong) and those who actually train a martial art. And I don't mean the "Now I'll be able to beat up people Hoo-AH" three-week-course training - I mean real, long-term, long-rewards-curve training. I've done aikido on and off for years (my life's been too hectic to settle down to a single dojo, sadly), and it takes a similar sort of dedication, determination, and self-reflection as a serious foray into training your mind to rationality. And, I'd go so far as to say, a similar 'predilection of mind and preference' (and I'll let you LWers go to town on that one).

Comment author: Navanen 08 December 2012 04:52:16AM 0 points

What do you mean by correlation? Do you mean how similar the thinking of those who mindfully approach rationality training is to the thinking of people who seriously dedicate themselves to practicing a martial art? Or do you mean something else?