I am a newbie, so today I read Eliezer Yudkowsky's article "Your Strength as a Rationalist," which helped me understand the focus of LessWrong, but I respectfully disagree with a line in the last paragraph:
It is a design flaw in human cognition...
So this was my comment in the article's comment section, which I bring here for discussion:
Since I think evolution has made us quite fit for our current environment, I don't think cognitive biases are design flaws. In the above example you imply that even though you had the information available to guess the truth, your guess was something else and it was false, and that you therefore experienced a flaw in your cognition.
My hypothesis is that reaching the truth, or communicating it on IRC, may not have been the end objective of your cognitive process. In this case, dismissing the issue as something that was not important anyway ("so move on and stop wasting resources on this discussion") may have been the "biological" objective, and as such the outcome should count as correct, not as a flaw.
If the above is true, then all cognitive biases, simplistic heuristics, fallacies, and dark arts are good, since we have conducted our lives according to them for 200,000 years and we are alive and kicking.
Rationality and our quest to be LessWrong, which I support, may be tools we are developing to improve our competitive ability within our species, not a "correction" of something that is wrong in our design.
Edit 1: I realize the environment changes, and that change may make some of our cognitive biases, which were useful in the past, obsolete. If the word "flaw" also applies to something that is obsolete, then I was wrong above. If not, I prefer the word "obsolete" to describe cognitive biases that are no longer functional for our preservation.
Depends on how you define success, actually.
Most people here like science and everything around it, so eliminating cognitive biases is EXTREMELY important for reaching their goals.
Most people on the outside, however, are more obsessed with money or status, and so will probably benefit from some degree of rationality, but anything beyond that probably has diminishing returns for one reason or another.
I'd say Eliezer sets a higher standard for "human" than what your average clubgoer has in mind.
Again, it depends on how you define success.
In other words, epistemic rationality is not instrumental rationality.
The potential rewards of epistemic rationality for a society are very high.
However, it doesn't follow that everyone needs to be an epistemic rationalist, and it also doesn't necessarily follow that anyone has to remove all their biases individually, since biases can be allowed to cancel out in collective rationality.