Less Wrong is a community blog devoted to refining the art of human rationality.

Ambiguity in cognitive bias names; a refresher

25 nerfhammer 21 February 2012 04:37AM

This came up on the NYC list; I thought I would adapt it here.

Cognitive biases have names. That's what makes them memetic: it's easier to think about something that has a name. Though I think the benefits outweigh the costs, there is also the risk of a Little Albert: a concept that lives on after the original research has been found to be much more ambiguous than first realized.

Many errors are possible with respect to named ideas, and despite being studied scientifically, cognitive biases are no exception. There is nothing for cognitive biases like what the Académie Française is for French.

Let's describe some:

  • different people in different fields will "discover" virtually the same bias, not be aware of each other, and assign it different names. For example, see the Curse of Knowledge (which I think George Loewenstein came up with) vs. the Historian's Fallacy by David Hackett Fischer, presentist bias, creeping determinism, and probably many others, not all of them scientific. Sometimes researchers in seemingly closely related subfields are remarkably insular to each other.
  • researchers will use one term predominantly while an offshoot will decide they don't like the name and use a different one. For example, the Fundamental Attribution Error has also been called the overattribution effect, the correspondence bias, the attribution bias, and the actor-observer effect. In this case the older term still predominates, and is used in intro textbooks without asterisks. Of the naming errors this is one of the least harmful, since everyone agrees what the FAE is; some just prefer a different name for it.
  • an author will decide he doesn't like the names of some biases and will invent idiosyncratic names of his own. Jonathan Baron has a good textbook on cognitive bias, but he uses names of his own invention half the time.
  • the same term will sometimes have different polysemous meanings. For example, the "Zeigarnik Effect" has been used to refer to a memory bias, superior recall for unfinished tasks, and has also been used to refer to an attentional bias in which unfinished tasks tend to intrude on consciousness; almost, but not quite exactly, the same thing. The term "confirmation bias" has several different but related meanings, for example, to seek out confirming information, to notice confirming information, to ask confirming questions, etc., which are not all quite exactly the same thing. The different meanings may have completely different contexts, boundary conditions, etc., leading to confusion. Furthermore, some of the senses may be at least partially disproven but not others; for example, the tendency to ask confirming questions has turned out to be more complicated than once thought. You might never know from reading about the attentional Zeigarnik effect that there is also a memory Zeigarnik effect that is conceptually somewhat different. I recall seeing even prominent researchers occasionally making mistakes of this category. Of all the naming ambiguities, I think this is the most dangerous.
  • an offshoot of researchers may knowingly use the same term with a conflicting definition. For example, "heuristic" in "Heuristics and Biases" versus "Fast and Frugal Heuristics", the latter of which was an intentional reaction to the former. In this case those involved know there is a disagreement in meaning, but those unfamiliar with the topic might be confused. [This is a point of contention which I'm willing to yield on]
  • the same term may be redefined by researchers who may not be aware of each other. More than one paper has tried to introduce a bias called "the disconfirmation effect". But this only happens with really obscure biases.
  • a bias may have different components which do not have names of their own and/or a bias may overlap partially but not completely with another bias. For instance, hindsight bias has different components one of which has some overlap with the curse of knowledge. 
  • the same bias term will be used as a rough category of experimental effect and also as a singular bias. For example, the term "an actor-observer bias" could refer to any difference in actors and observers, whereas "the actor-observer bias" refers to the Fundamental Attribution Error specifically; the same is true of "an" vs. "the" attribution bias, also referring to the FAE. This could confuse only those who are unfamiliar with the terminology.
  • sometimes authors have tried to enforce strict, distinct meanings for the subterms "bias" vs. "effect" vs. "neglect" vs. "error" or "fallacy"; other times, perhaps more often, these terms are used only by convention. For example, the conjunction fallacy vs. the conjunction error, correspondence bias vs. the fundamental attribution error, base rate neglect vs. base rate error. Sometimes the originators of a bias try to use the terminology precisely while later authors citing it aren't as careful. Sometimes even the originators of a bias do not try to choose a subterm carefully. You might suspect that which permutation of a term catches on depends on which has the better ring to it.

Morality is not about willpower

9 PhilGoetz 08 October 2011 01:33AM

Most people believe the way to lose weight is through willpower.  My successful experience losing weight is that this is not the case.  You will lose weight if you want to, meaning you effectively believe[0] that the utility you will gain from losing weight, even time-discounted, will outweigh the utility from yummy food now.  In LW terms, you will lose weight if your utility function tells you to.  This is the basis of cognitive behavioral therapy (the effective kind of therapy), which tries to change people's behavior by examining their beliefs and changing their thinking habits.

Similarly, most people believe behaving ethically is a matter of willpower; and I believe this even less.  Your ethics is part of your utility function.  Acting morally is, technically, a choice; but not the difficult kind that holds up a stop sign and says "Choose wisely!"  We notice difficult moral choices more than easy moral choices; but most moral choices are easy, like choosing a ten dollar bill over a five.  Immorality is not a continual temptation we must resist; it's just a kind of stupidity.

This post can be summarized as:

  1. Each normal human has an instinctive personal morality.
  2. This morality consists of inputs into that human's decision-making system.  There is no need to propose separate moral and selfish decision-making systems.
  3. Acknowledging that all decisions are made by a single decision-making system, and that the moral elements enter it in the same manner as other preferences, results in many changes to how we encourage social behavior.


30th Soar workshop

18 Johnicholas 23 May 2010 01:33PM

This is a report, from a LessWrong perspective, on the 30th Soar workshop. Soar is a cognitive architecture that has been in continuous development for nearly 30 years, and is in a direct line of descent from some of the earliest AI research (Simon's Logic Theorist and General Problem Solver). Soar is interesting to LessWrong readers for two reasons:

  1. Soar is a cognitive science theory, and has had some success at modeling human reasoning - this is relevant to the central theme of LessWrong, improving human rationality.
  2. Soar is an AGI research project - this is relevant to the AGI risks sub-theme of LessWrong.


The Danger of Stories

9 Matt_Simpson 08 November 2009 02:53AM

Tyler Cowen argues in a TED talk (~15 min) that stories pervade our mental lives.  He thinks they are a major source of cognitive biases and that, on the margin, we should be more suspicious of them - especially simple stories.  Here's an interesting quote about the meta-level:

What story do you take away from Tyler Cowen?  ...Another possibility is you might tell a story of rebirth.  You might say, "I used to think too much in terms of stories, but then I heard Tyler Cowen, and now I think less in terms of stories". ...You could also tell a story of deep tragedy.  "This guy Tyler Cowen came and he told us not to think in terms of stories, but all he could do was tell us stories about how other people think too much in terms of stories."
