According to this brain-imaging study, volunteers who were presented with negative scenarios (e.g. car crashes, cancer) and asked to estimate the probability of those scenarios happening to them would only update their beliefs if the actual rate of occurrence in the population, given to them afterwards, was lower (i.e. more optimistic) than what they had guessed. The more "optimistic" the subjects were, according to a personality test, the less likely they were to update their beliefs based on more negative information, and the less activity they showed in their frontal lobes, indicating that they weren't "paying attention" to the new information.

Sounds like confirmation bias, except that, interestingly, it's unidirectional in this case. I wonder whether very pessimistic people would show the opposite bias, only updating their estimate if the actual probability was higher, i.e. more negative.
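The asymmetry the study describes can be sketched numerically. A minimal toy model (my own illustration, not the paper's method; the learning rates are made-up placeholders, and only their asymmetry reflects the finding):

```python
def update_estimate(guess, base_rate, lr_good=0.6, lr_bad=0.2):
    """Move the estimate a fraction of the way toward the base rate.

    lr_good and lr_bad are hypothetical learning rates; the key point
    from the abstract is only that lr_good > lr_bad (better-than-expected
    news moves beliefs more than worse-than-expected news).
    """
    lr = lr_good if base_rate < guess else lr_bad  # lower rate = good news
    return guess + lr * (base_rate - guess)

# Good news: guessed a 40% chance, told the actual rate is 10%
# -> estimate moves 60% of the way down, from 0.40 to 0.22.
print(update_estimate(0.40, 0.10))

# Bad news: guessed a 10% chance, told the actual rate is 40%
# -> estimate moves only 20% of the way up, from 0.10 to 0.16.
print(update_estimate(0.10, 0.40))
```

A very pessimistic person with the opposite bias would simply correspond to lr_bad > lr_good.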

Link to article on kurzweilai.

Link to abstract in Nature journal. I can't access the full text.


Sometime I'll have to look up statistics for crashes per mile driven (perhaps adjusted for age, etc.) to see if I am actually badly estimating my driving ability. After my first car crash, there must have been some serious learning done in the unconscious parts of my mind, since my motorway driving speed dropped by more than 10 mph afterwards, almost overnight. I don't consciously remember thinking that driving was any more dangerous, though. I think my judgement of my driving skill might have dropped from 'maybe average' to 'probably below average', which I notice every time I read the 'most people think they're above-average drivers' factlet. Hopefully self-driving vehicles will become common soon enough that reality will not have to injure me further to provide more learning experiences.

Having said that, I'll see if I notice this effect of preferentially updating in one direction in myself.
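The crashes-per-mile comparison above can be turned into a back-of-the-envelope calculation. A sketch (the per-mile rate below is a made-up placeholder, not a real statistic, and modelling crashes as a Poisson process is itself an assumption):

```python
import math

def crash_probability(rate_per_mile, miles):
    """P(at least one crash) over `miles`, assuming crashes arrive
    as a Poisson process with constant per-mile rate."""
    return 1 - math.exp(-rate_per_mile * miles)

# Placeholder rate: one crash per 1,000,000 miles (invented for
# illustration; substitute a real, age-adjusted figure).
rate = 1e-6

# Over 30 years at 10,000 miles/year:
p = crash_probability(rate, 10_000 * 30)
print(round(p, 3))  # roughly a quarter, with these made-up numbers
```

Comparing a number like this against your gut estimate is one way to check whether you're over- or under-optimistic about your own driving.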

Full text is here: http://www.fil.ion.ucl.ac.uk/~tsharot/Sharot_NN_11.pdf

(Search technique: Google search for "how unrealistic optimism is maintained in the face of reality pdf"; notice the classic pattern "educational domain followed by ~username" → "probably the researcher's home page")

Article from Nature here

If one fights another's beliefs, one ends up strengthening the other's beliefs. (Edit: in response to the -1 given.)

Optimism is a belief about the future; unrealistic optimism is what the article is about. "We examined this question and found a marked asymmetry in belief updating. Participants updated their beliefs more in response to information that was better than expected than to information that was worse," the abstract states. So participants updated their beliefs more in response to information that was better than expected (i.e. optimistic) than to information that was worse (realistic). I summarised that with the statement above this edit.

Further, the statement is supported by looking at disconfirmation bias: subjects will spend more time and cognitive resources denigrating contrary arguments than supportive arguments. Hence, if a subject has their argument attacked, they will spend more time and cognitive resources denigrating the attack than on their own supportive arguments. This supports the claim that if one attacks another's belief, one strengthens that belief, because the other will then attack the counter-argument, and the very act of doing so reinforces their original belief.

Now place that in an optimism situation. If one has an optimistic outlook and that outlook is attacked, the person will expend more energy on defending their optimistic outlook than on reconsidering it. That's basically what I understand of the article... hence:

"If one fights another's beliefs, one ends up strengthening the other's beliefs."

Now this is of course seen in the ongoing "battle" between the religious and the atheist: plenty of arguments going both ways, and they tend to be very heated and get even more heated. When we cut through it all, one of the major things people take away from these arguments is that "I'm right, and they're wrong."

If you check out belief perseverance in (Social Psychology, Myers, 2010, p. 84) you'll get a better understanding of what is happening. The way around belief perseverance is to ask the other person to consider for themselves why the opposite of their belief might be true, and to have them explain their reasoning; this is certainly not the same as having them defend their own argument against an attack.