A Taxonomy of Bias: Mindware Problems

14 Kaj_Sotala 07 July 2010 09:53PM

This is the third part in a mini-sequence presenting content from Keith E. Stanovich's excellent book What Intelligence Tests Miss: The psychology of rational thought. It will culminate in a review of the book itself.

Noting that there are many different kinds of bias, Keith Stanovich proposes a classification scheme for bias that has two primary categories: the Cognitive Miser and Mindware Problems. Last time, I discussed the Cognitive Miser category. Today, I will discuss Mindware Problems, which has the subcategories of Mindware Gaps and Corrupted Mindware.

Mindware Problems

Stanovich defines "mindware" as "a generic label for the rules, knowledge, procedures, and strategies that a person can retrieve from memory in order to aid decision making and problem solving".

Mindware Gaps

Previously, I mentioned two tragic cases. In one, a pediatrician incorrectly testified that the odds of two children in the same family dying of sudden infant death syndrome were 73 million to 1. In the other, people bought into a story of "facilitated communication" helping previously non-verbal children to communicate, without examining it critically. Stanovich uses these two as examples of a mindware gap. The people involved were lacking critical mindware: in one case, that of probabilistic reasoning; in the other, that of scientific thinking. One of the reasons why so many intelligent people act irrationally is that they're simply missing the mindware necessary for rational decision-making.
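The probabilistic mindware gap in the first case can be made concrete. A figure like 73 million to 1 arises from treating the two deaths as statistically independent and simply squaring a per-family rate, when deaths in the same family plausibly share genetic and environmental risk factors. Here's a minimal sketch; the 1-in-8,543 rate is chosen to reproduce the quoted figure, and the dependence factor is a purely hypothetical stand-in:

```python
# Illustrative sketch of the independence error.
p_single = 1 / 8543          # assumed rate of one such death per family

# Flawed calculation: treat the two deaths as independent events
# and multiply the probabilities.
p_naive = p_single ** 2      # ~1 in 73 million

# More defensible: conditional on one death, a second death in the
# same family is far more likely, because risk factors are shared.
# The relative-risk factor below is a made-up number for illustration.
relative_risk = 10
p_dependent = p_single * (relative_risk * p_single)

print(f"naive 'odds': 1 in {1 / p_naive:,.0f}")        # ~1 in 73 million
print(f"with dependence: 1 in {1 / p_dependent:,.0f}")  # 10x more probable
```

The point is not the particular numbers but the missing mindware: knowing that multiplying probabilities is only valid when the events are independent.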

Much of the useful mindware is a matter of knowledge: knowledge of Bayes' theorem, taking into account alternative hypotheses and falsifiability, awareness of the conjunction fallacy, and so on. Stanovich also mentions something he calls strategic mindware, which refers to the disposition towards engaging the reflective mind in problem solving. These were previously mentioned as thinking dispositions, and some of them can be measured by performance-based tasks. For instance, in the Matching Familiar Figures Test (MFFT), participants are presented with a picture of an object, and told to find the correct match from an array of six other similar pictures. Reflective people have long response times and few errors, while impulsive people have short response times and numerous errors. These types of mindware are closer to strategies, tendencies, procedures, and dispositions than to knowledge structures.

Stanovich identifies mindware gaps as being involved in at least conjunction errors and base-rate neglect (missing probability knowledge), as well as in failures on the Wason selection task and in confirmation bias (not considering alternative hypotheses). Incorrect lay psychological theories are identified as a combination of a mindware gap and corrupted mindware (see below). For instance, people are often blind to their own biases because they incorrectly think that biased thinking on their part would be detectable by conscious introspection. In addition to the bias blind spot, lay psychological theory is likely to be involved in errors of affective forecasting (predicting one's future emotional states).
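Base-rate neglect, in particular, is easy to demonstrate with Bayes' theorem. As a sketch (the numbers here are illustrative, not from Stanovich): consider a diagnostic test with 99% sensitivity and a 5% false-positive rate for a condition affecting 1 person in 1,000. Intuition anchored on the test's accuracy suggests a positive result means near-certainty, but the low base rate drags the true posterior down to about 2%:

```python
# Bayes' theorem on a classic base-rate problem (illustrative numbers).
base_rate = 0.001      # P(condition)
sensitivity = 0.99     # P(positive | condition)
false_pos = 0.05       # P(positive | no condition)

# Total probability of a positive result, over both hypotheses.
p_positive = sensitivity * base_rate + false_pos * (1 - base_rate)

# Posterior probability of the condition given a positive result.
p_condition_given_pos = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_pos:.3f}")
# Ignoring the base rate suggests ~99%; the true posterior is about 2%.
```

Someone without this mindware has no procedure for combining the test's accuracy with the prior, and so anchors on the 99% figure.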

continue reading »

The persuasive power of false confessions

10 matt 11 December 2009 01:54AM

First paragraph from a Mind Hacks post:

The APS Observer magazine has a fantastic article on the power of false confessions to warp our perception of other evidence in a criminal case to the point where expert witnesses will change their judgements of unrelated evidence to make it fit the false admission of guilt.

The post and linked article are worth reading… and I don't have much to add.

The New Nostradamus

13 Kaj_Sotala 12 September 2009 02:42PM

I stumbled upon an article called The New Nostradamus, reporting on a game-theoretic model that predicts political outcomes with startling effectiveness. The results are very impressive. However, the site hosting the article is unfamiliar to me, so I'm not certain of its veracity, but a quick Google search seems to support the claims, at least on a superficial skimming. Here's his TED talk. The model seems almost too good to be true, though. Anybody know more?

Some choice bits from the article:

The claim:

In fact, the professor says that a computer model he built and has perfected over the last 25 years can predict the outcome of virtually any international conflict, provided the basic input is accurate. What’s more, his predictions are alarmingly specific. His fans include at least one current presidential hopeful, a gaggle of Fortune 500 companies, the CIA, and the Department of Defense.

The results:

The criticism rankles him, because, to his mind, the proof is right there on the page. “I’ve published a lot of forecasting papers over the years,” he says. “Papers that are about things that had not yet happened when the paper was published but would happen within some reasonable amount of time. There’s a track record that I can point to.” And indeed there is. Bueno de Mesquita has made a slew of uncannily accurate predictions—more than 2,000, on subjects ranging from the terrorist threat to America to the peace process in Northern Ireland—that would seem to prove him right.

[...]

To verify the accuracy of his model, the CIA set up a kind of forecasting face-off that pit predictions from his model against those of Langley’s more traditional in-house intelligence analysts and area specialists. “We tested Bueno de Mesquita’s model on scores of issues that were conducted in real time—that is, the forecasts were made before the events actually happened,” says Stanley Feder, a former high-level CIA analyst. “We found the model to be accurate 90 percent of the time,” he wrote. Another study evaluating Bueno de Mesquita’s real-time forecasts of 21 policy decisions in the European community concluded that “the probability that the predicted outcome was what indeed occurred was an astounding 97 percent.” What’s more, Bueno de Mesquita’s forecasts were much more detailed than those of the more traditional analysts. “The real issue is the specificity of the accuracy,” says Feder. “We found that DI (Directorate of Intelligence) analyses, even when they were right, were vague compared to the model’s forecasts. To use an archery metaphor, if you hit the target, that’s great. But if you hit the bull’s eye—that’s amazing.”

continue reading »

Would Your Real Preferences Please Stand Up?

42 Yvain 08 August 2009 10:57PM

Related to: Cynicism in Ev Psych and Econ

In Finding the Source, a commenter says:

I have begun wondering whether claiming to be victim of 'akrasia' might just be a way of admitting that your real preferences, as revealed in your actions, don't match the preferences you want to signal (believing what you want to signal, even if untrue, makes the signals more effective).

I think I've seen Robin put forth something like this argument [EDIT: Something related, but very different], and TGGP points out that Bryan Caplan explicitly believes pretty much the same thing:

I've previously argued that much - perhaps most - talk about "self-control" problems reflects social desirability bias rather than genuine inner conflict.

Part of the reason why people who spend a lot of time and money on socially disapproved behaviors say they "want to change" is that that's what they're supposed to say.

Think of it this way: A guy loses his wife and kids because he's a drunk. Suppose he sincerely prefers alcohol to his wife and kids. He still probably won't admit it, because people judge a sinner even more harshly if he is unrepentant. The drunk who says "I was such a fool!" gets some pity; the drunk who says "I like Jack Daniels better than my wife and kids" gets horrified looks. And either way, he can keep drinking.

I'll call this the Cynic's Theory of Akrasia, as opposed to the Naive Theory. I used to think it was plausible. Now that I think about it a little more, I find it meaningless. Here's what changed my mind.

continue reading »

Zwicky's Trifecta of Illusions

18 thomblake 17 July 2009 04:59PM

Linguist Arnold Zwicky has named three linguistic 'illusions' which seem relevant to cognitive bias. They are:

  1. Frequency Illusion - Once you've noticed a phenomenon, it seems to happen a lot.
  2. Recency Illusion - The belief that something is a recent phenomenon, when it has actually existed a long time.
  3. Adolescent Illusion - The belief that adolescents are the cause of undesirable language trends.

Zwicky talks about them here, and in not so many words links them to the standard bias of selective perception.

As an example, here is an excerpt via Jerz's Literacy Weblog (originally via David Crystal), regarding text messages:

  • Text messages aren't full of abbreviations - typically less than ten percent of the words use them. [Frequency Illusion]
  • These abbreviations aren't a new language - they've been around for decades. [Recency Illusion]
  • They aren't just used by kids - adults of all ages and institutions are the leading texters these days. [Adolescent Illusion]

It is my conjecture that these illusions are notable in areas other than linguistics. For example, history is rife with claims that the younger generation is corrupt, and such speakers are not merely referring to their use of language. Could this be the adolescent illusion in action?

So, are these notable biases to watch out for, or are they merely obvious instances of standard biases?

Can self-help be bad for you?

3 Tom_Talbot 07 July 2009 08:40PM

From the NHS Behind the Headlines blog:

“Self help makes you feel worse,” BBC News has reported. It says that the growing trend of using self-help mantras to boost your spirits may actually have a detrimental effect. The news comes from Canadian research, which found that people with low self-esteem felt worse after repeating positive statements about themselves.

Although positive self-statements are widely believed to boost mood and self-esteem, they have not been widely studied, and their effectiveness has not been demonstrated. This experimental study sought to investigate the contradictory theory that these statements can be harmful.

The researchers had a theory that when a person feels deficient in some way, making positive self-statements to improve that aspect of their life may highlight the discrepancy between their perceived deficiency and the standard they would like to achieve. The researchers carried out three studies in which they manipulated positive self-statements and examined their effects on mood and self-esteem.

Something about the hypothesis sounds familiar:

This experimental research among a group of Canadian university students has found that positive statements may reinforce that positivity among those with high self-esteem, and make them feel even better. But it causes those with low self-esteem to feel worse and to have lower self-esteem.

The researchers say that this theory is based on the idea of ‘latitudes of acceptance’, i.e. messages that reinforce a position close to one’s own are more likely to be persuasive than messages that reinforce a position far from one’s own. As they suggest, if a person believes that they are unlovable and keeps repeating, "I’m a lovable person", they may dismiss this statement and possibly reinforce their conviction that they are unlovable.

What do you think? Is this plausible, or is it an attempt to shoehorn one of those trendy heuristics-and-biases-related hypotheses into a study on self-esteem? If you accept the validity of the study and its conclusion, does it influence LW's Rationalists Should Win self-help philosophy? What if it is literally true that some people are more lovable and some less, and that this has unavoidable effects on self-esteem? Do low self-esteem rationalists need different techniques from those with high self-esteem?

Don't Count Your Chickens...

3 thomblake 17 June 2009 03:21PM

A blog post by Derek Sivers links to evidence that stating one's goals makes one less likely to accomplish them.

Excerpt:

Announcing your plans to others satisfies your self-identity just enough that you're less motivated to do the hard work needed.

Link: Shut up! Announcing your plans makes you less motivated to accomplish them.

Honesty: Beyond Internal Truth

40 Eliezer_Yudkowsky 06 June 2009 02:59AM

When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:

SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES

Honesty toward others, it seems to me, obviously bears some relation to rationality.  In practice, the people I know who seem to make unusual efforts at rationality are unusually honest, or, failing that, at least have unusually bad social skills.

And yet it must be admitted and fully acknowledged, that such morals are encoded nowhere in probability theory.  There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates.  I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:

SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES

I do think I've conducted my life in such fashion, that I can wear the original button without shame.  But I do not always say aloud all my thoughts.  And in fact there are times when my tongue emits a lie.  What I write is true to the best of my knowledge, because I can look it over and check before publishing.  What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion.  Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...

From the inside, it feels a lot like the experience of un-consciously-chosen, perceptual-speed, internal rationalization.  I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.

continue reading »

Instrumental vs. Epistemic -- A Bardic Perspective

66 MBlume 25 April 2009 07:41AM

(This article expands upon my response to a question posed by pjeby here)

I've seen a few back-and-forths lately debating the instrumental use of epistemic irrationality -- to put the matter in very broad strokes, you'll have one commenter claiming that a particular trick for enhancing your effectiveness, your productivity, your attractiveness, demands that you embrace some belief unsupported by the evidence, while another claims that such a compromise is unacceptable, since a true art should use all available true information. As Eliezer put it:

I find it hard to believe that the optimally motivated individual, the strongest entrepreneur a human being can become, is still wrapped up in a blanket of comforting overconfidence. I think they've probably thrown that blanket out the window and organized their mind a little differently. I find it hard to believe that the happiest we can possibly live, even in the realms of human possibility, involves a tiny awareness lurking in the corner of your mind that it's all a lie.

And with this I agree -- the idea that a fully developed rational art of anything would involve pumping yourself with false data seems absurd.

Still, let us say that I am entering a club, in which I would like to pick up an attractive woman. Many people will tell me that I must believe myself to be the most attractive, interesting, desirable man in the room. An outside-view examination of my life thus far, and my success with women in particular, tells me that I most certainly am not. What shall I do?

continue reading »

Maybe theism is wrong

-5 infotropism 11 April 2009 04:53PM

(This is meant as an entirely rewritten version of the original post. It is still long, but hopefully clearer.)

Theism is often bashed. Part of that bashing is gratuitous and undeserved. Some people therefore feel compelled to defend theism. Their defence goes further than just setting the record straight, though: it attempts to show that theism can be a good thing, or right. That is probably going too far.

I would argue several points, and for that I will be using the most idealistic vision of religion I can conjure, keeping in mind that real-world examples may not be as utopian. My intended conclusion is that fairness and tolerance are a necessary and humane means to the end of helping people, but cannot be used to justify as right something that is ultimately wrong.

Theism is indeed a good thing in the short and medium term, both for individuals and society, as it holds certain benefits: helping people stick together in close-knit communities, helping them live more virtuous lives by giving themselves incentives to do so, and helping them feel better when life feels unbearable or meaningless.

Another point is that theism possesses deep similarities with science, and uses optimally rational arguments and induction -- optimally, that is, insofar as the premises of theism allow. Those premises, what we could call its priors, are, in Christianity for instance, to be found in the Bible.

Finally, I also wanted to draw on further similarities between religion and secular groups of people: atheism, humanism, transhumanism, even rationalism as we know it on LW. These similarities lie in the objectives that each of those groups honestly strives to attain -- goals such as truth, the welfare of human beings, and their betterment.

Within its own world view, each of those groups is indeed doing its best to achieve those ends. One of Catholicism's guiding beacons, used to direct people's life paths, can roughly be stated as "what action should I take that will make me more able to love others, and myself?" This involves understanding and following the word of God, as love and morality are understood to emanate from that source.

And so the Bible is supposed to hold those absolute truths -- not so much in a straightforward, explained way, but rather in the same way that the observable universe is supposed to hold absolute truth for secular science. And just as it is possible to misconstrue observations and build flawed theories in the scientific model, so is it possible for a Christian to misunderstand the data presented in the Bible. Rational edifices of thought have therefore been built to derive humanly understandable, cross-checked (inside that edifice), usable-on-a-daily-basis truth from the Bible.

That is about as far as we can go for similarities, purity of purpose, intellectual honesty and adequacy with the real world.

The premise of theism itself is flawed. Theism presupposes the supernatural. Therefore, the priors of theism do not correspond to the real state of the universe as we observe it, and this implies two main consequences.

The first is that an intellectual edifice based upon flawed premises, no matter how carefully crafted, will still be flawed itself.

The second runs deeper: the premises of theism are themselves in part incompatible with rationality, and hence limit the potential use of rational methods. In other words, some methods of rationality, as well as some particular arguments, are forbidden, or unknown, to what we could tentatively call religious science.

From that, my first conclusion is that theism is wrong -- epistemically wrong, but also doing itself a disservice, as the goals it has set for itself cannot be achieved through its program. That program will not be able to hit its targets in optimization space, because of that epistemic flaw. Even though theism possesses short- and medium-term advantages, its whole edifice makes it a dead end, which will at the very least slow down humanity's progress towards nobler objectives like truth or betterment, if not render that progress outright impossible past a certain point.

Yet it seems to me that this mistaken edifice isn't totally insane, far from it, at least at its roots. Hence it should be possible to heal it -- or at least to help the people who are part of it.

But religion cannot honestly be called right, no matter how deeply that idea is rooted in our culture and collective consciousness. In the long term, theism deprives us of our potential; it builds a virtual, unnecessary cage around us.

To conclude, I wanted to point out that religious belief appears to be a human universal, and probably a hard-coded part of human nature. It seems fair to recognize it in ourselves if we have that tendency. I know I do, for instance, and fairly strongly so; the same goes for belief in the supernatural.

This should be part of a more general mental discipline of admitting to our faults and biases, rather than trying to hide and compensate for them. The only way to dissect and correct them is to first thoroughly observe those faults in our reasoning -- publicly so, even. In a community of rationalists, there should be no question that even the most flawed and irrational of us should be treated as a friend in need of help, if he so desires, and if we have enough resources to provide for his needs. The important thing is to have someone with a willingness to learn and to grow past his mistakes. This can indeed be made easier if we are supportive of each other, and unconditionally tolerant.

Yet, at the same time, even for that purpose, we can't yield to falseness. We can and must admit, for instance, that religion has good points, that we may not have a licence to change people against their will, and that if people want to be helped, they should feel relaxed in explaining all the relevant information about what they perceive to be their problem. We can't go so far as saying that such a flaw, or problem, is in itself alright, though.
