Comments

Navanen · 11y · 10

The following was a comment from the discussion of the original post, and I thought it was great. Does anyone know how I might go about designing and practicing such calibration tests well?

“I'd suggest that there is a relatively straightforward and unproblematic place to apply humility: to overcome the universal overconfidence bias. Many studies have found that when asked to give estimates with a confidence interval, error rates are far higher than would be expected if the confidence interval were accurate. Many of these find errors an order of magnitude or more than subjects expected. You could take self-tests and find out what your overconfidence level is, then develop a calibration scale to correct your estimates. You could then use this to modify your confidence levels on future guesses and approach an unbiased estimate. One risk is that knowing that you are going to modify your intuitive or even logically-deduced confidence level may interfere with your initial guess. This might go in either direction, depending on your personality. It could be that knowing you are going to increase your error estimate will motivate you to subconsciously decrease your initial error estimate, so as to neutralize the anticipated adjustment. Or in the other direction, it could be that knowing that you always guess too low an error will cause you to raise your error guesses, so that your correction factor is too high. However both of these could be dealt with in time by re-taking tests while applying your error calibration, adjusting it as needed.”
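As a starting point, here is a minimal sketch in Python of the test-and-correct loop the comment describes. The question format (90% confidence intervals on numeric questions), the sample data, and the multiplicative interval-widening rule are my own illustrative assumptions, not anything the comment specifies:

```python
# A sketch of the self-calibration loop described above, assuming 90%
# confidence intervals on numeric trivia questions. The sample data and
# the widening rule are illustrative choices, not part of the comment.

def hit_rate(answers, widen=1.0):
    """Fraction of true values inside each (low, high) interval after
    widening every interval about its midpoint by the factor `widen`."""
    hits = 0
    for low, high, truth in answers:
        mid = (low + high) / 2
        half = (high - low) / 2 * widen
        if mid - half <= truth <= mid + half:
            hits += 1
    return hits / len(answers)

def calibration_factor(answers, target=0.90, step=0.1, max_widen=20.0):
    """Smallest widening factor that raises observed coverage to the
    target. Apply it to future intervals, then re-test: as the comment
    notes, knowing the correction can shift your raw guesses."""
    widen = 1.0
    while hit_rate(answers, widen) < target and widen < max_widen:
        widen += step
    return widen

# (low, high, truth) triples from a hypothetical 90%-interval self-test.
test = [(10, 20, 25), (100, 150, 140), (1, 3, 7), (40, 60, 55), (5, 8, 2)]
print(hit_rate(test))            # 0.4 -- well below the stated 90%
print(calibration_factor(test))  # ~5.0 -- widen future intervals 5x
```

Re-taking the test with the correction applied, as the comment suggests, is what guards against the two feedback effects it describes.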

Navanen · 11y · 00

“Where people have vague mental models that can be used to argue anything, they usually end up believing whatever they started out wanting to believe.”

"Humility" is a virtue that is often misunderstood. This doesn't mean we should discard the concept of humility, but we should be careful using it.

It seems to me that, when confronting rationalists, those with a belief they're motivated to keep holding will try to manipulate the rationalists into withdrawing skepticism, under threat of social disapproval. For example, when creationists ask something like "how can you be sure you're absolutely right about evolution?", I believe the actual intention is not to induce humility in the evolutionist, but to appeal to, and warn, the evolutionist not to risk the creationist's disapproval.

So, it's crucial to distinguish between when someone wants you to be genuinely humble and when someone wants you to be socially modest so that you don't frustrate them by challenging their beliefs.

The comments of the SEQ RERUN of this post contain better discussion of when humility is and isn't useful than I can produce.

NOTE: edited for simplicity and grammar.

Navanen · 11y · 00

I suspect the definition is worth making because even if we don't know what caused the bias, we can use the label of a bias "not inherent in our mental machinery" as a marker for future study of its cause.

For example, I read in a contemporary undergraduate social psychology textbook that experiments found a common bias affected subjects from Western cultures more strongly than subjects from more interdependent cultures such as China and Japan.

[Obviously, my example is useless without the specifics. I just don't have access to that book at the moment. I will update this comment with more detail when I'm able.]

Navanen · 11y · 10

“Is there any decent literature on the extent to which the fact of knowing that my shoelaces are untied is a real property of the universe? Clearly it has measurable consequences - it will result in a predictable action taking place with a high probability. Saying 'I predict that someone will tie his shoelaces when he sees they're undone' is based not on the shoelaces being untied, nor on the photons bouncing, but on this abstract concept of them knowing. Is there a mathematical basis for stating that the universe has measurably changed in a nonrandom way once those photons' effects are analysed? I'd love to read more on this.”

This was a comment from the original post, and I agree with this sentiment. There was no response, so I'm wondering if anyone can answer this question now.
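One standard way to make the question precise is information-theoretic: model "knowing" as mutual information between a belief variable B and the world state S, which is nonzero exactly when the brain state is nonrandomly correlated with the shoelaces. A minimal sketch, using an invented joint distribution for an observer whose belief tracks the world 99% of the time:

```python
# A sketch of one way to formalize the question: treat "knowing" as
# mutual information between a belief variable B and the world state S
# (shoelaces tied/untied). The joint distribution below is an invented
# example, not data from the original discussion.

from math import log2

# P(S, B) for an observer whose belief matches the world 99% of the time.
joint = {
    ("untied", "believes-untied"): 0.495,
    ("untied", "believes-tied"):   0.005,
    ("tied",   "believes-untied"): 0.005,
    ("tied",   "believes-tied"):   0.495,
}

def mutual_information(joint):
    """I(S; B) = sum over (s, b) of p(s,b) * log2(p(s,b) / (p(s)p(b))),
    in bits."""
    p_s, p_b = {}, {}
    for (s, b), p in joint.items():
        p_s[s] = p_s.get(s, 0) + p
        p_b[b] = p_b.get(b, 0) + p
    return sum(p * log2(p / (p_s[s] * p_b[b]))
               for (s, b), p in joint.items() if p > 0)

# A nonzero value means the observer's brain state is measurably,
# nonrandomly correlated with the shoelaces: ~0.92 bits here, versus
# 0 bits for an observer who never looked.
print(mutual_information(joint))
```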

Note: I don't know who the author is because I have the anti-kibitzer on in Chrome, and for some reason I can't turn it off right now.

Navanen · 11y · 10

What do you mean by correlation? Do you mean how similar the thinking of those who mindfully approach rationality training is to the thinking of those who seriously dedicate themselves to practicing a martial art? Or do you mean something else?