
Confabulation Bias

-1 EricHerboso 28 September 2012 01:27AM

(Edit: Gwern points out in the comments that there is previous discussion on this study at New study on choice blindness in moral positions.)

Earlier this month, a group of Swedish scientists published a study that describes a new type of bias that I haven't seen listed in any of the sequences or on the wiki. Their methodology:

We created a self-transforming paper survey of moral opinions, covering both foundational principles, and current dilemmas hotly debated in the media. This survey used a magic trick to expose participants to a reversal of their previously stated attitudes, allowing us to record whether they were prepared to endorse and argue for the opposite view of what they had stated only moments ago.

In other words, people were surveyed on their beliefs and were immediately asked to defend them after finishing the survey. Despite having just written down how they felt, 69% did not even notice that at least one of their answers had been surreptitiously changed. Amazingly, a majority of people actually "argued unequivocally for the opposite of their original attitude".
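To make the aggregate figure concrete, here is a minimal simulation sketch in Python showing how a per-change detection rate translates into the share of participants who miss at least one altered answer. It is an illustration only, not a reproduction of the study: the number of altered answers per survey (two) and the independence of detections are my assumptions, not details reported above.

```python
import random

def simulate_missed_at_least_one(p_detect, changes_per_survey, n_participants, seed=0):
    """Estimate the share of participants who fail to notice at least one
    altered answer, assuming each altered answer is detected independently
    with probability p_detect. Purely illustrative; not the study's model."""
    rng = random.Random(seed)
    missed = 0
    for _ in range(n_participants):
        detected_all = all(rng.random() < p_detect for _ in range(changes_per_survey))
        if not detected_all:  # at least one change slipped by unnoticed
            missed += 1
    return missed / n_participants

# Assuming two altered answers per survey (an assumption, not stated above) and
# a ~56% chance of catching each one, roughly 69% of simulated participants
# miss at least one change, matching the aggregate figure quoted in the post:
print(simulate_missed_at_least_one(p_detect=0.56, changes_per_survey=2,
                                   n_participants=100_000))
```

Under those assumptions, a fairly ordinary per-change detection rate is enough to produce the headline 69%, which is one reason aggregate "failed to notice" numbers can look more dramatic than the per-item rates behind them.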

Perhaps this type of effect is already discussed here on LessWrong, but, if so, I have not yet run across any such discussion. (It is on neither the LessWrong wiki nor the other wiki, for example.) This appears to be some kind of confabulation bias, where invented positions thrust upon people result in confabulated reasons for believing them.

Some people might object to my calling this a bias. (After all, the experimenters themselves did not use that word.) But I'm referring less to the trick involved in the experiment and more to the bias this experiment shows we have toward our own views. This is a fine distinction, but I feel it is important for us to recognize.

When I say we prefer our own opinions, this is obvious on its face. Of course we think our own positions are correct; they're the result of our previously reasoned thought. We have reason to believe they are correct. But this study shows that our preference for our own views goes even further than this. We are biased toward our own positions to such a degree that we will verbally defend them even when we were merely tricked into thinking we held them. This is what I mean when I call it confabulation bias.

Of particular interest to the LessWrong community is that those of us who are more capable of good argumentation appear to be more susceptible to this bias. This puts confabulation bias in the same category as the sophistication effect, in that well-informed people should take special care not to fall for it. (The idea that confabulation bias is more likely to occur in those of us who argue better is not shown in this study, but it seems like a reasonable hypothesis.)

As a final minor point, the effect did not disappear when the changed opinion was extreme. Participants indicated agreement or disagreement on a 1-9 scale; a full 31% of respondents who chose an extreme position (such as 1 or 9) did not notice even when they were shown to have endorsed the opposite extreme.

[Link] “Proxy measures, sunk costs, and Chesterton's fence”, or: the sunk cost heuristic

7 kpreid 08 August 2012 02:39PM

Thought this post might be of interest to LW: Proxy measures, sunk costs, and Chesterton's fence. To summarize: Previous costs are a proxy measure for previous estimates of value, which may have information current estimates of value do not; therefore acting according to the sunk cost fallacy is not necessarily wrong.

This is not an entirely new idea here, but I liked the writeup. Previous discussion: Sunk Costs Fallacy Fallacy; Is Sunk Cost Fallacy a Fallacy?.

Excerpt:

If your evidence may be substantially incomplete you shouldn't just ignore sunk costs — they contain valuable information about decisions you or others made in the past, perhaps after much greater thought or access to evidence than that of which you are currently capable. Even more generally, you should be loss averse — you should tend to prefer avoiding losses over acquiring seemingly equivalent gains, and you should be divestiture averse (i.e. exhibit endowment effects) — you should tend to prefer what you already have to what you might trade it for — in both cases to the extent your ability to measure the value of the two items is incomplete. Since usually in the real world, and to an even greater degree in our ancestors' evolutionary environments, our ability to measure value is and was woefully incomplete, it should come as no surprise that people often value sunk costs, are loss averse, and exhibit endowment effects — and indeed under such circumstances of incomplete value measurement it hardly constitutes "fallacy" or "bias" to do so.
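As a toy illustration of the proxy-measure argument, here is a minimal sketch in Python, my own construction rather than anything from the linked post, of a value estimate that treats past investment as noisy evidence about value instead of discarding it. The inverse-variance weighting and the specific numbers are assumptions chosen purely for illustration.

```python
def combine_value_estimates(current_estimate, current_variance,
                            past_estimate, past_variance):
    """Combine two noisy estimates of an option's value by inverse-variance
    weighting: the current assessment, and the value implied by past investment
    (a proxy for an earlier, possibly better-informed, judgment). Illustrative only."""
    w_current = 1.0 / current_variance
    w_past = 1.0 / past_variance
    return (w_current * current_estimate + w_past * past_estimate) / (w_current + w_past)

# Hypothetical numbers: my current, very noisy read says the project is worth
# 20 units, while past spending of 100 units reflects a well-informed earlier
# estimate, so the combined estimate leans heavily toward the earlier judgment:
print(combine_value_estimates(current_estimate=20, current_variance=400,
                              past_estimate=100, past_variance=100))   # 84.0

# If instead the current estimate is precise and the earlier one was noisy,
# the sunk cost contributes very little:
print(combine_value_estimates(20, 25, 100, 400))                       # ~24.7
```

The only point of the sketch is that the weight given to past investment should scale with how much better-informed the earlier decision was relative to the current estimate, which mirrors the excerpt's claim that ignoring sunk costs is only clearly correct when current value measurements are good.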

Case Study: Testing Confirmation Bias

32 gwern 02 May 2012 02:03PM

Master copy lives on gwern.net

Cashing Out Cognitive Biases as Behavior

13 gwern 02 March 2012 04:57AM

We believe cognitive biases and susceptibility to them lead to bad decisions and suboptimal performance. I'd like to look at 2 interesting studies:

continue reading »

[Transcript] Tyler Cowen on Stories

65 Grognor 17 December 2011 05:42AM

I was shocked, absolutely shocked, to find that Tyler Cowen's excellent TEDxMidAtlantic talk on stories had not yet been transcribed. It generated a lot of discussion in the thread where it was first introduced, so I went ahead and transcribed it. I added hyperlinks to background information where I thought they were due. Here you go:

continue reading »

Recent updates to gwern.net (2011)

33 gwern 26 November 2011 01:58AM

A list of things I have written or researched in 2011 which I put on my personal site.

This has been split out to https://www.gwern.net/Changelog

Number bias

1 [deleted] 17 October 2010 02:03PM

The New York Times ran an editorial about an interesting type of cognitive bias: according to the article, the fact that our system of timekeeping is based on units of 24, 7, and so on, and the fact that we have 10 fingers, profoundly influence our way of thinking. As the article explains, this bias is distinct from scope neglect and misunderstanding of probability. Has anyone else heard of this kind of "number bias" before? Also, is this an issue that deserves further study on LessWrong?