Razib summarized my entire cognitive biases talk at the Singularity Summit 2009 as saying: "Most people are stupid."
Hey! That's a bit unfair. I never said during my talk that most people are stupid. In fact, I was very careful not to say, at any point, that people are stupid, because that's explicitly not what I believe.
I don't think that people who believe in single-world quantum mechanics are stupid. John von Neumann believed in a collapse postulate.
I don't think that philosophers who believe in the "possibility" of zombies are stupid. David Chalmers believes in zombies.
I don't even think that theists are stupid. Robert Aumann believes in Orthodox Judaism.
And in the closing sentence of my talk on cognitive biases and existential risk, I did not say that humanity was devoting more resources to football than existential risk prevention because we were stupid.
There's an old joke that runs as follows:
A motorist is driving past a mental hospital when he gets a flat tire.
He goes out to change the tire, and sees that one of the patients is watching him through the fence.
Nervous, trying to work quickly, he jacks up the car, takes off the wheel, puts the lugnuts into the hubcap -
And steps on the hubcap, sending the lugnuts clattering into a storm drain.
The mental patient is still watching him through the fence.
The motorist desperately looks into the storm drain, but the lugnuts are gone.
The patient is still watching.
The motorist paces back and forth, trying to think of what to do -
And the patient says,
"Take one lugnut off each of the other tires, and you'll have three lugnuts on each."
"That's brilliant!" says the motorist. "What's someone like you doing in an asylum?"
"I'm here because I'm crazy," says the patient, "not because I'm stupid."
Excellent distinction, Yasser.
I would add one more case:
Wrongness is when the output of the "program" doesn't correlate reliably with reality. But this can happen not only because the algorithm is flawed (wrong because crazy), but also because the input is insufficient or incorrect. I think this is an important distinction: a person can be smart (not stupid) and rational (non-irrational, i.e. not crazy) and still end up wrong, and those around him will undeservedly call him "crazy" or "stupid".
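A toy sketch of the distinction, in code (the functions and numbers here are my own illustration, not anything from the post): a sound algorithm fed bad input produces wrong output, just as a broken algorithm fed good input does — but only the second is "crazy".

```python
import statistics

def estimate_mean(samples):
    """A sound algorithm: the sample mean tracks the true mean,
    provided the samples are representative."""
    return statistics.mean(samples)

def crazy_estimate_mean(samples):
    """A flawed algorithm: sums the data but forgets to divide by n."""
    return sum(samples)

TRUE_MEAN = 10.0
good_input = [9.8, 10.1, 10.0, 9.9, 10.2]  # representative measurements
bad_input = [3.0, 2.9, 3.1]                # a broken instrument's readings

# Sound algorithm, good input: output correlates with reality.
assert abs(estimate_mean(good_input) - TRUE_MEAN) < 0.5

# Sound algorithm, bad input: wrong output, but the reasoner isn't crazy.
assert abs(estimate_mean(bad_input) - TRUE_MEAN) > 0.5

# Flawed algorithm, good input: wrong output because the algorithm is broken.
assert abs(crazy_estimate_mean(good_input) - TRUE_MEAN) > 0.5
```

An outside observer who only sees the wrong outputs can't tell the second case from the third, which is exactly why the smart-but-unlucky get called "crazy" or "stupid".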
Example: a CEO takes a calculated risk but is fired because the company, under his guidance, flipped the coin and it came up heads instead of the desired tails. The stakeholders expected him to be omniscient.
Those CEOs who get it right will be perceived as omniscient gurus. Hindsight bias will make them write books on how to be successful; survivorship bias will lure people into buying them.
Not being crazy makes your output less wrong, but it doesn't guarantee that the output is right, either.
If I haven't gotten it wrong in my analysis above (puns intended), would it be fair to say that this community, whose mission is to fix the biases in our algorithms, would be even more appropriately called Less Crazy instead?