Building on the recent SSC post Why Doctors Think They’re The Best...

| What it feels like for me | How I see others who feel the same |
|---|---|
| There is controversy on the subject, but there shouldn't be, because the side I am on is obviously right. | They have taken one side in a debate that is unresolved for good reason, a reason they are struggling to understand. |
| I have been studying this carefully. | They preferentially seek out confirming evidence. |
| The arguments for my side make obvious sense; they're almost boring. | They're very ready to accept any and all arguments for their side. |
| The arguments for the opposing side are contradictory, superficial, illogical or debunked. | They dismiss arguments for the opposing side at the earliest opportunity. |
| The people on the opposing side believe these arguments mostly because they are uninformed, have not thought about it enough or are being actively misled by people with bad motives. | The flawed way they perceive the opposing side makes them confused about how anyone could be on that side. They resolve that confusion by making strong assumptions that can approach conspiracy theories. |

The scientific term for this mismatch is: confirmation bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| My customers/friends/relationships love me, so I am good for them, so I am probably just generally good. | They neglect the customers/friends/relationships that did not love them and have left, so they overestimate how good they are. |
| When customers/friends/relationships switch to me, they tell horror stories of who I'm replacing for them, so I'm better than those. | They don't see the people who are happy with who they have and therefore never become their customers/friends/relationships. |

The scientific term for this mismatch is: selection bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| Although I am smart and friendly, people don't listen to me. | Although they are smart and friendly, they are hard to understand. |
| I have a deep understanding of the issue that people are too stupid or too disinterested to come to share. | They are failing to communicate their understanding, or to give unambiguous evidence they even have it. |
| This lack of being listened to affects several areas of my life, but it is particularly jarring on topics that are very important to me. | This bad communication affects all areas of their life, but on the unimportant ones they don't even understand that others don't understand them. |

The scientific term for this mismatch is: illusion of transparency

| What it feels like for me | How I see others who feel the same |
|---|---|
| I knew at the time this would not go as planned. | They did not predict what was going to happen. |
| The plan was bad and we should have known it was bad. | They fail to appreciate how hard prediction is, so the mistake seems more obvious to them than it was. |
| I knew it was bad, I just didn't say it, for good reasons (e.g. out of politeness or too much trust in those who made the bad plan), or because it is not my responsibility, or because nobody listens to me anyway. | In order to avoid blame for the seemingly obvious mistake, they are making up excuses. |

The scientific term for this mismatch is: hindsight bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| I have a good intuition; even decisions I make based on insufficient information tend to turn out to be right. | They tend to recall their own successes and forget their own failures, leading to an inflated sense of past success. |
| I know early on how well certain projects are going to go or how well I will get along with certain people. | They make self-fulfilling prophecies that directly influence how much effort they put into a project or relationship. |
| Compared to others, I am unusually successful in my decisions. | They evaluate the decisions of others more level-headedly than their own. |
| I am therefore comfortable relying on my quick decisions. | They therefore overestimate the quality of their decisions. |
| This is more true for life decisions that are very important to me. | Yes, this is more true for life decisions that are very important to them. |

The scientific term for this mismatch is: optimism bias

Why this is better than how we usually talk about biases

Communication in abstracts is very hard. (See: Illusion of Transparency: Why No One Understands You) Therefore, it often fails. (See: Explainers Shoot High. Aim Low!) It is hard to even notice communication has failed. (See: Double Illusion of Transparency) Therefore it is hard to appreciate how rarely communication in abstracts actually succeeds.

Rationalists have noticed this. (Example) Scott Alexander uses a lot of concrete examples, which is probably a major reason why he’s our best communicator. Eliezer’s Sequences work partly because he uses examples and even fiction to illustrate. But when the rest of us talk about rationality, we still mostly talk in abstracts.

For example, this recent video was praised by many for being comparatively approachable. And it does do many things right, such as emphasize and repeat that evidence alone should not generate probabilities, but should only ever update prior probabilities. But it still spends more than half of its runtime displaying mathematical notation that no more than 3% of the population can even read. For the vast majority of people, only the example it uses can possibly “stick”. Yet the video uses its single example as no more than a means for getting to the abstract explanation.

This is a mistake. I believe a video with three to five vivid examples of how to apply Bayes’ Theorem, preferably funny or sexy ones, would leave a much more lasting impression on most people.
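To make the same point concretely: here is a minimal sketch of the kind of worked example such a video could lead with. The numbers (a 1% prior, a 90%-sensitive test, a 5% false-positive rate) are hypothetical and chosen only for illustration; the point is that the evidence updates the prior rather than generating a probability on its own.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' Theorem: the evidence only ever updates the prior."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical numbers: disease with 1% prevalence, test with 90%
# sensitivity and a 5% false-positive rate.
p = posterior(prior=0.01, p_evidence_given_h=0.90, p_evidence_given_not_h=0.05)
print(round(p, 3))  # 0.154 -- even after a positive test, the disease is unlikely
```

The counterintuitive output is exactly the kind of vivid, memorable result an example-first presentation can deliver.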

Our highly demanding style of communication correctly predicts that LessWrongians are, on average, much smarter, much more STEM-educated and much younger than the general population. You have to be that way to even be able to drink the Kool Aid! This makes us homogeneous, which is probably a big part of what makes LW feel tribal, which is emotionally satisfying. But it leaves most of the world with their bad decisions. We need to be Raising the Sanity Waterline and we can’t do that by continuing to communicate largely in abstracts.

The tables above show one way to do better. It does the following:

  • It aims low - merely to help people notice the flaws in their thinking. It will not, and does not need to, enable readers to write scientific papers on the subject.
  • It reduces biases to mismatches between the Inside View and the Outside View. It lists concrete observations from both views and juxtaposes them.
  • These observations are written in a way that is hopefully general enough for most people to find they match their own experiences.
  • It trusts readers to infer from these juxtaposed observations their own understanding of the phenomena. After all, generalizing over particulars is much easier than integrating generalizations and applying them to particulars. The understanding gained this way will be imprecise, but it has the advantage of actually arriving inside the reader’s mind.
  • It is nearly jargon free; it only names the biases for the benefit of that small minority who might want to learn more.

What do you think about this? Should we communicate more concretely? If so, should we do it in this way or what would you do differently?

Would you like to correct these tables? Would you like to propose more analogous observations or other biases?

Thanks to Simon, miniBill and others for helping with the draft of this post.


I would like to encourage this!

Alternative representations for a larger audience could be

  • cartoons explaining a single concept, like XKCD or Dilbert.
  • graphical overviews, like the cognitive bias cheatsheet.

What else would be feasible?

I'm fantasizing about infographics with multiple examples of the same bias, an explanation how they're all biased the same way, and very brief talking points like "we're all biased, try to avoid this mistake, forgive others if they make it, learn more at LessWrong.com".

They could be mass produced with different examples. Like one with a proponent of Minimum Wage and an opponent of it, arguing under intense confirmation bias as described in the table above, with a headline like "Why discussions about Minimum Wage often fail". Another one "Why discussions of Veganism often fail", another one "Why discussions of Gun Control often fail" etc. Each posted to the appropriate subreddits etc. Then evolve new versions based on what got the most upvotes.
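The mass-production idea could be sketched as a simple template: one fixed table of bias observations, many topical headlines. Everything below (the topics, the headline wording, the row labels) is illustrative placeholder text, not finished copy.

```python
# Hypothetical template for mass-producing topical infographic text.
HEADLINE = "Why discussions about {topic} often fail"

# One example row from the confirmation-bias table in the post.
CONFIRMATION_BIAS_ROWS = [
    ("The arguments for my side make obvious sense, they're almost boring.",
     "They're very ready to accept any and all arguments for their side."),
]

def make_infographic_text(topic):
    """Render the headline plus juxtaposed inside/outside observations."""
    lines = [HEADLINE.format(topic=topic)]
    for inside, outside in CONFIRMATION_BIAS_ROWS:
        lines.append(f"How it feels to each side: {inside}")
        lines.append(f"How each side sees the other: {outside}")
    return "\n".join(lines)

for topic in ["Minimum Wage", "Veganism", "Gun Control"]:
    print(make_infographic_text(topic), end="\n\n")
```

Swapping in new topics then costs nothing, which is what would make the evolve-by-upvotes loop cheap to run.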

But I am completely clueless about how to do infographics. I'd love for someone to grab the idea and run with it. But realistically I should probably try to half-ass something and hope it shows enough potential for someone with the skills to take pity.

Or at least get more eyes on it to further improve the concept. Getting feedback from fellow LessWrongers was extremely helpful for development thus far.

No harm done with experimenting a bit I suppose.

Do you have examples of infographics that come close to what you have in mind?

These infographics feature snippets of difficult conversations between pairs of people about the morality of abortion. https://whatsmyprolifeline.com/

I think this is off topic here, except it does sort of the same thing by breaking principles down into concrete statements. That said, I think that site is exceptionally well-written and designed. I wish other persuasion projects adopted that kind of approach.

Love the tables

Dear Chaosmage, thank you for this post. I agree that communicating in abstracts is very hard. There almost always seems to be an element of whether the content of any communication is good or bad, right or wrong, and whether the people involved in the communication are good or bad, right or wrong, as people. Yudkowsky’s statement that arguments are soldiers comes to mind.

I am 63. I spent a lot of my life trying to impose my views on people from a testosterone fueled classic male dominance ego. No one would be wrong to accuse me of arrogance and a belief in my intellectual superiority. But I am older now and I see how often I really was wrong. I wish I had learned collaborative communicating skills so much earlier. I have learned that accepting I may be as wrong as right in a discussion with someone about opinions and feelings is important. Giving someone a way out, realizing and actually accepting they have to have whatever time is needed to consider what I’m trying to communicate or engaging (if possible) in a manner that allows both sides legitimacy works best. And generally, I have found people resist facts until it is comfortable for them to accept them, and always, always people have honor and pride in the game. I learned a long time ago to try and see the other person’s side, and listen/read to learn and understand not just to reply.

We have more technology than ever, but our brains essentially work the same as they did 10,000 years ago.

I think learning to communicate in a clear, concise, “concrete” manner is always a good thing to have in our human toolbox. In my experience, trying to “crack someone’s head open” and pour in a lot of information we want them to accept often fails. That’s irrational in my opinion. Getting people interested and enthused and guiding them to the subject and allowing them to learn at their pace works better in my life experiences. I easily accept that this is not always true, because people are in fact forced to learn, and humans sometimes do change their views/behaviors when being forced to.

Lastly, I intentionally tried to make this post concrete and easy to see my points. I left out all the complex words that go with the thinking behavior genre.

Keep up the good work! Best, Mike

“The real problem of humanity is the following: we have Paleolithic emotions; medieval institutions; and god-like technology.” E. O. Wilson
https://harvardmagazine.com/breaking-news/james-watson-edward-o-wilson-intellectual-entente


What it feels like for me:

Some things have permanent consequences (STDs) so I will never risk them (always condom).

How I see others who feel the same:

They exaggerate tiny risks to certainties because it's easier than doing tradeoffs. They ignore the benefits.


Feels like for me:

The most attractive things are exclusive and competed for (expensive cars, hot women), so it's smarter and friendlier to stick to no-cost alternatives (video games, porn).

How I see others:

By not even trying to get to Harvard or the Olympics, they're not getting any real mastery of anything.


Feels like for me:

I learned basic facts about how the world works as a kid, made the smart choices, and think everyone else's choices were dumb choices.

How I see others:

They feel the exact same way about me, no matter which decisions they made!


Sorry, can't name any of the biases.

I'm not really sure this helps me, though. The intensity of "how it feels for me" feels too strong for any "that's just a bias" to break through. Sometimes I just can't make myself believe the two lines in the illusion are the same length. I've looked at them a hundred times and the one's always longer.

I think it should also be mentioned that, when you're right and the other person is wrong, that also feels, from the inside, like confirmation bias does. That's why confirmation bias is so powerful: from the inside, you can't tell the difference between it and actually being right.

Promoted to curated: I really like the concreteness in this post, and liked it enough that I OCR'd all the pictures and translated them into real tables, to make sure the text is findable in our search, and generally easier to read in different contexts. 

Awesome! Thanks a lot!

Tone issues aside, I'm not even sure that this comment is about this post. Seems to tangentially pick on a single example given and use it as a wedge to talk about something entirely different.

This is fantastic. My first reaction was to share this with my son. It is like a translation between languages and that is great.

Did you share it with your son, and if so what was the result?

"I will read it later, dad"

which is OK, sometimes it sticks, sometimes it does not.

Did he read it later?

Very good. One minor quibble: it's not immediately obvious which table (the one above or the one below) is referred to in the "The scientific term for this mismatch is:" comments. Maybe a space after the comment to show it goes with the table above?

I'm going to assume this is a false flag attack on conflict theory by an insane, terroristic mistake theorist.

I'm going to assume this is a false flag attack on conflict theory by an insane, terroristic mistake theorist.

That's a very conflict-theorist hypothesis.

There is a significant amount to like here, and certainly this ought to serve as a very good first-line check for the possibility that you're strongly entangled in some of these biases with certain decisions.

But I'm not sure it addresses (and maybe it's not intended to address) situations where it's harder to tell.

Take confirmation bias: in the case where you're actually right, it feels almost identical to what is written in the table. Which means the table cannot reliably help decide the question of:

"Am I really correct here, or am I just cheating myself with different standards of evidence depending on whether or not it matches my intuition/preference"

I'm not sure what to do about that, outside of rigorously trying to falsify yourself, in an honest manner, by aggressively pitting the opposing set of ideas against your own. 

That's exactly right. It would be much better to know a simple method for distinguishing overconfidence from being actually right without a lot of work. In the absence of that, maybe tables like this can help people choose more epistemic humility.

FWIW there are lots of graphical resources on cognitive biases here: https://www.yourbias.is/

I did not know this, and I like it. Thank you!

I'm using pictures because I couldn't get either editor to accept a proper table.

Sorry for that! We do have an editor in the works that has proper table support. 

You can do it in latex, with textrm to get your formatting out of the math mode. Not elegant, but it serves:

Code:

$$\begin{array}{|c|c|c|c|}
\hline
\textrm{System}&& SA\textrm{ possible?} & \textrm{Penalty neutralised?} \\
\hline\hline \textrm{20BQ} && \textrm{Yes} & \textrm{No} \\
\hline \textrm{RR} && \textrm{Yes} & \textrm{No}\\
\hline \textrm{AU} && \textrm{Probably} & \textrm{Mostly}\\
\hline
\end{array}$$

Relatedly, there's an awkward cursor line in the top-right box for optimism bias.

I want to make 
1) A general compliment for the post. 
I think the tables are helpful for those who seek to recognize their biases. Bravo!
2) A comment about the 3% of the 3B1B video. 
Thinking about the audience is crucial for communication. I think the video has reached much more than the 3% of its target audience.
3) A meta-comment N1 on communication.
When communicating, it is useful to know why you are doing it (also for other activities). If one wants to make the broad population aware of cognitive biases, one should know why and consider the marginal added value of the educational activity.
4) A meta-comment N2 on communication. 
I believe communication is most efficient when one understands the motives of the target audience and through communication provides support for the achievement of those motives. 
E.g. Situation: my co-authors are angry with the referee who criticized our manuscript as too implicit. I might, step 1) focus on making the best effort to resubmit the paper, thus putting explicitly the motivation for action 
step 2) make them aware of the transparency bias, on the example of how the manuscript can be written more clearly, thus introducing action(rewriting) that is aligned with the motivation (resubmission). 
Note in the example above, the educational part comes as a side-effect. One might be explicit about overcoming the biases to solidify the educational effect.

To recap, we should have the interests of our audience in mind. Moreover, if we do not act in the interest of the audience, we risk triggering their defensive mode and thus making our efforts futile if not harmful.

4-bis) A meta-comment on meta-comment N2. 
One might say that scalability is important (e.g. for Raising the Sanity Waterline) and that knowing one's audience is not scalable.
In other words, it's too costly to "educate" one person at a time.
I think this is not always true, there are motivations that are common across the population. Distilling those motives might be extremely useful for communication. Understanding which biases an "average Joe" is facing in his primary activities might be the most efficient way to "educate" the population on implicit biases.
E.g. I imagine a hypothetical video titled "How to actually get rich instead of fooling yourself" might be an example.

P.S. This is my first comment on LW, so "hello world". It looks longish - if you think multiple comments is a better format, please let me know.

Welcome. You're making good points. I intend to make versions of this geared to various audiences but haven't gotten around to it.

I guess for better memorization these tables could be provided as Anki cards. One card would show either side (how I feel / how I see others who feel that way) of one row of any table and would ask for the other side or the bias name.
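That idea is easy to prototype, since Anki imports plain comma- or tab-separated files of front/back pairs. A minimal sketch, using two rows from the tables in the post (the exact card wording here is my own illustration):

```python
import csv
import io

# (inside view, outside view, bias name) -- rows taken from the post's tables.
ROWS = [
    ("I have been studying this carefully",
     "They preferentially seek out confirming evidence",
     "confirmation bias"),
    ("I knew at the time this would not go as planned.",
     "They did not predict what was going to happen.",
     "hindsight bias"),
]

buf = io.StringIO()
writer = csv.writer(buf)
for feels_like, looks_like, bias in ROWS:
    front = f"What it feels like: {feels_like}"
    back = f"How it looks from outside: {looks_like} ({bias})"
    writer.writerow([front, back])

# The resulting CSV can be imported directly into an Anki deck.
print(buf.getvalue())
```

A second pass could emit the reverse direction too, so the outside view also prompts for the inside view.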

I'm reading this for the first time today. It'd be great if more biases were covered this way. The "illusion of transparency" one is eerily close to what I've thought so many times. Relatedly, sometimes I do succeed at communicating, but people don't signal that they understand (or not in a way I recognize). Thus sometimes I only realize I've been understood after someone (politely) asks that I stop repeating myself, mirroring back to me what I had communicated. This is a little embarrassing, but also a relief - once I know I've been understood, I can finally let go.

Here's the thing though. Sometimes one side IS genuinely correct[/good], and the other side IS genuinely wrong[/evil].

Take the [] out, and this is one of the first things I was thinking upon reading this post. Interestingly, you don't need to bring conflict vs mistake theory into this at all.

I think this comment should be its own post/open thread comment (probably the latter), and then I'd find it reasonable (don't know about others). The tolerance on this site for talking about your pet issue in the context of something vaguely related is very low.