If "humility" can be used to justify both activities and their opposites so easily, perhaps it's a useless concept and should be tabooed.
Where people have vague mental models that can be used to argue anything, they usually end up believing whatever they started out wanting to believe.
"Humility" is a virtue that is often misunderstood. This doesn't mean we should discard the concept of humility, but we should be careful using it.
It seems to me that when people with a belief they're motivated to keep holding confront rationalists, they often try to manipulate the rationalists into withdrawing their skepticism on pain of social disapproval. For example, when creationists ask something like "how can you be sure you're absolutely right about evolution?", I believe the actual intention is not to induce humility in the evolutionist, but to warn the evolutionist not to risk the creationist's disapproval.
So it's crucial to distinguish between someone who wants you to be epistemically humble and someone who wants you to be socially modest so you don't frustrate them by challenging their beliefs.
There's better discussion than I can produce of when humility is and isn't useful in the comments of the SEQ RERUN of this post.
NOTE: edited for simplicity and grammar.
If a particular impediment to accurate belief is a bias, what actionable consequences does that have relative to it not being a bias? EY seems to hint at biases being more difficult to correct than other impediments, and intuitively that makes sense, but is the actual approach any different?
This definition of bias seems problematic. If a putative bias is caused by absorbed cultural mores, then supposedly it is not a bias. But that causal chain can be tricky to track down; we go on thinking something is a 'bias' until we find the black swan culture where the bias doesn't exist, and then realize that the problem was not inherent in our mental machinery. But is that distinction even worth making, if we don't know what caused the bias?
I suspect the distinction is worth making because even if we don't know what caused the bias, we can use the label "not inherent in our mental machinery" as a marker for future study of its cause.
For example, I read in a contemporary undergraduate social psychology textbook that experiments found a common bias affected subjects from Western cultures more strongly than subjects from more interdependent cultures such as China and Japan.
[Obviously, my example is useless without the citation. I just don't have access to that book at the moment. I will update this comment with more detail when I'm able.]
Is there any decent literature on the extent to which the fact of knowing that my shoelaces are untied is a real property of the universe? Clearly it has measurable consequences - it will result in a predictable action taking place with high probability. Saying "I predict that someone will tie his shoelaces when he sees they're undone" is based not on the shoelaces being untied, nor on the photons bouncing, but on the abstract concept of his knowing. Is there a mathematical basis for stating that the universe has measurably changed in a nonrandom way once those photons' effects are analysed? I'd love to read more on this.
This was a comment from the original post, and I agree with this sentiment. There was no response, so I'm wondering if anyone can answer this question now.
Note: I don't know who the author is because I have Kibitzing on in Chrome and for some reason I cannot turn it off right now.
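One standard way to make this precise (my own sketch, not anything from the original discussion) is mutual information: let S be the state of the shoelaces and B the state of the observer's brain. Before the photons arrive, B and S are (nearly) independent; after the observation they are correlated:

$$I(B;S) = H(S) - H(S \mid B) > 0$$

That is, learning the brain state now reduces your uncertainty about the shoelaces. The correlation is a physical, measurable, nonrandom change in the joint configuration of the universe, even though neither the shoelaces alone nor the photons alone record the "knowing". Eliezer's later post "The Second Law of Thermodynamics, and Engines of Cognition" explores this angle.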
I wonder how much of a correlation there is between people who put effort into self-training in rationality (or communal training, a la Less Wrong) and those who actually train a martial art. And I don't mean the "Now I'll be able to beat up people Hoo-AH" three-week-course kind of training - I mean real, long-term, long-rewards-curve training. I've done aikido on and off for years (my life's been too hectic to settle down to a single dojo, sadly), and it takes a similar sort of dedication, determination, and self-reflection as a serious foray into training your mind to rationality. And, I'd go so far as to say, a similar 'predilection of mind and preference' (and I'll let you LWers go to town on that one).
What do you mean by correlation? Do you mean how similar the thinking of those who mindfully approach rationality training is to the thinking of people who seriously dedicate themselves to practicing a martial art? Or do you mean something else?
The following was a comment from the discussion of the original post, and I thought it was great. Does anyone know how I might go about designing and practicing such calibration tests well? A sketch of how such a test might be scored is below.
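Not anything from the original thread - just my own guess at a minimal structure: answer a batch of factual questions, attach a probability to each answer, and then check whether your stated confidences match your actual hit rates. A sketch in Python (the function names and the choice of Brier scoring are mine, not anything prescribed by the post):

```python
from collections import defaultdict

def calibration_report(results):
    """results: list of (confidence, correct) pairs, where (0.7, True)
    means 'I was 70% sure of my answer, and it was right'."""
    buckets = defaultdict(list)
    for confidence, correct in results:
        # Group stated confidences into 10%-wide bins.
        buckets[round(confidence, 1)].append(correct)
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        accuracy = sum(outcomes) / len(outcomes)  # True counts as 1
        print(f"stated {confidence:.0%}: right {accuracy:.0%} of the time "
              f"(n={len(outcomes)})")

def brier_score(results):
    """Mean squared error between stated confidence and the 0/1 outcome;
    lower is better, and 0.25 is what always answering 50% would get."""
    return sum((c - (1.0 if ok else 0.0)) ** 2
               for c, ok in results) / len(results)

# Example: three answers given at 90%, 70%, and 60% confidence.
sample = [(0.9, True), (0.7, False), (0.6, True)]
calibration_report(sample)
print("Brier score:", brier_score(sample))
```

Run on a long enough batch of questions, the report shows, for each confidence level you like to claim, how often you're actually right - well-calibrated answers given at "70%" should come out correct about 70% of the time.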