Hygienic Anecdotes

10 badger 29 March 2009 05:46AM

Bayesians must condition their beliefs on all available evidence; it is not cheating to use less than ideal sources of information. However, this process also requires conditioning on the evidence for your evidence. Outside of academic journals, evidence is often difficult to trace back to the source and is dependent on our notoriously faulty memory. Given the consequences of low-fidelity copying, should rationalists trust evidence they can't remember the source of, even if they remember reading the primary source themselves? Should community members be expected to produce citations on demand?
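The "evidence for your evidence" step can be made concrete with a toy Bayes update (all numbers invented for illustration, not taken from the post): a half-remembered study should only shift your belief to the degree you trust the memory, because an unreliable channel mixes the study's likelihoods with pure noise.

```python
def posterior(prior, p_e_h, p_e_not_h, reliability):
    """Update P(H) on a remembered report of evidence E.

    With probability `reliability` the memory is accurate and E carries its
    full likelihoods; otherwise the report is pure noise (likelihood 0.5
    under both H and not-H), carrying no information.
    """
    like_h = reliability * p_e_h + (1 - reliability) * 0.5
    like_not_h = reliability * p_e_not_h + (1 - reliability) * 0.5
    return prior * like_h / (prior * like_h + (1 - prior) * like_not_h)

# A study you read yourself: trust the channel fully.
print(round(posterior(0.5, 0.8, 0.2, 1.0), 3))  # 0.8
# The same study, vaguely remembered: the update shrinks toward the prior.
print(round(posterior(0.5, 0.8, 0.2, 0.5), 3))  # 0.65
# A source you can't trace at all moves you nowhere.
print(round(posterior(0.5, 0.8, 0.2, 0.0), 3))  # 0.5
```

Note that the Bayesian answer is never "ignore it": even a 50%-reliable memory still moves the posterior, just less far.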

This issue came to mind while trying to find a study I vaguely remembered, about how the increased happiness of the religious could be explained by increased community involvement, and while trying to factcheck PhilGoetz's now-infamous anecdote about Steve Jobs. I started contemplating the standards for relaying highly relevant but potentially wrong or distorted information.

Luckily, factchecking is much easier in the age of the internet. Wikipedia serves as a universally accessible standard reference, and Google serves well for everything else. But sometimes my google-fu is not strong enough. So I'll put this to the community: how should rationalists balance the tradeoff between neglecting evidence and propagating bad information?

Hygienic practices have been touched on before, but I haven't seen any consensus on this issue. Are the standards for what you personally condition on and what you share in discussion different? What needs a citation and what doesn't? Does anyone have recommendations for ways to better track the sources of evidence, i.e. reference management software?

 

Fight Biases, or Route Around Them?

25 Yvain 25 March 2009 10:23PM

Continuation of: The Implicit Association Test
Response to: 3 Levels of Rationality Verification

I haven't seen it pointed out before that we use "bias" to mean two different things.

Sometimes we use "bias" to mean a hard-coded cognitive process that results in faulty beliefs. Take as examples the in-group bias, the recall bias, the bad guy bias, and various other things discovered by Tversky and Kahneman.

Other times, we use "bias" to mean a specific faulty belief generated by such a process, especially one that itself results in other faulty beliefs. For example, Jews are sometimes accused of having a pro-Israel bias. By this we mean that they have a higher opinion of Israel than the evidence justifies; this is a specific belief created by the in-group bias. This belief may itself generate other faulty beliefs; for example, they may have a more negative opinion of Palestinians than the evidence justifies. It is both the effect of a bias, and the cause of other biases.

Let's be clear about this "more than the evidence justifies" bit. Hating Hitler doesn't mean you're biased against Hitler. Likewise, having a belief about a particular ethnic group doesn't mean you're biased for or against them. My Asian friends hate it when people sheepishly admit in a guilty whisper that they've heard Asians are good at academics. Asians are good at academics. Just say "55% chance an average Asian has a GPA above the American population mean" and leave it at that. This is one of Tetlock's critiques of the Implicit Association Test, and it's a good one. I'd probably link Asians to high achievement on an IAT, but it wouldn't be a bias or anything to get upset about.

And let's also be clear about this faulty belief thing. You don't have to believe something for it to be a belief; consider again the skeptic who flees the haunted house. She claims she doesn't believe in ghosts, and she's telling the truth one hundred percent. She's still going to be influenced by her belief in ghosts. She's not secretly supernaturalist any more than someone who gets "strongly biased" on the IAT is secretly racist. But she needs to know she's still going to run screaming from haunted houses, and IAT-takers should be aware they're still probably going to discriminate against black people in some tiny imperceptible way.

continue reading »

The Implicit Association Test

24 Yvain 25 March 2009 12:11AM

Continuation of: Bogus Pipeline, Bona Fide Pipeline
Related to: The Cluster Structure of Thingspace

If you've never taken the Implicit Association Test before, try it now.

Any will do. The one on race is the "classic", but the one on gender and careers is a bit easier to watch "in action", since the effect is so clear.

The overwhelming feeling I get when taking an Implicit Association Test is that of feeling my cognitive algorithms at work. All this time talking about thingspace and bias and categorization, and all of a sudden I have this feeling to attach the words to...

...which could be completely self-delusional. What is the evidence? Does the Implicit Association Test work?

Let the defense speak first1. The Implicit Association Test correctly picks up control associations. An IAT about attitudes towards insects and flowers found generally positive attitudes to the flowers and generally negative attitudes to the insects (p = .001), just as anyone with their head screwed on properly would expect. People's self-reports were also positively correlated with their IAT results (ie, someone who reported loving flowers and hating insects more than average also had a stronger than average IAT) although these correlations did not meet the 95% significance criterion. The study was repeated with a different subject (musical instruments vs. weapons) and similar results were obtained.

In the next study, the experimenters recruited Japanese-Americans and Korean-Americans. Japan has been threatening, invading, or oppressing  Korea for large chunks of the past five hundred years, and there's no love lost between the two countries. This time, the Japanese-Americans were able to quickly match Japanese names to "good" stimuli and Korean names to "bad" stimuli, but took much longer to perform the opposite matching. The Korean-Americans had precisely the opposite problem, p < .0001.  People's self-reports were also positively correlated with their IAT results (ie, a Korean who expressed especially negative feelings towards the Japanese on average also had a stronger than average IAT result) to a significant level.

continue reading »

In What Ways Have You Become Stronger?

24 Vladimir_Nesov 15 March 2009 08:44PM

Related to: Tsuyoku Naritai! (I Want To Become Stronger), Test Your Rationality, 3 Levels of Rationality Verification.

Robin and Eliezer ask about the ways to test rationality skills, for each of the many important purposes such testing might have. Depending on what's possible, you may want to test yourself to learn how well you are doing at your studies, at least to some extent check the sanity of the teaching that you follow, estimate the effectiveness of specific techniques, or even force a rationality test on a person whose position depends on the outcome.

Verification procedures have various weaknesses, making them admissible for one purpose and not for another. But however rigorous the verification methods are, one must first find the specific properties to test for. These properties or skills may come naturally with the art, or they may be cultivated specifically for the testing, in which case they need to be good signals, hard to demonstrate without also becoming more rational.

continue reading »

3 Levels of Rationality Verification

43 Eliezer_Yudkowsky 15 March 2009 05:19PM

Previously in series: Schools Proliferating Without Evidence
Followup to: A Sense That More Is Possible

I strongly suspect that there is a possible art of rationality (attaining the map that reflects the territory, choosing so as to direct reality into regions high in your preference ordering) which goes beyond the skills that are standard, and beyond what any single practitioner singly knows.  I have a sense that more is possible.

The degree to which a group of people can do anything useful about this will depend overwhelmingly on what methods we can devise to verify our many amazing good ideas.

I suggest stratifying verification methods into 3 levels of usefulness:

  • Reputational
  • Experimental
  • Organizational

If your martial arts master occasionally fights realistic duels (ideally, real duels) against the masters of other schools, and wins or at least doesn't lose too often, then you know that the master's reputation is grounded in reality; you know that your master is not a complete poseur.  The same would go if your school regularly competed against other schools.  You'd be keepin' it real.

Some martial arts fail to compete realistically enough, and their students go down in seconds against real streetfighters.  Other martial arts schools fail to compete at all—except based on charisma and good stories—and their masters decide they have chi powers.  In this latter class we can also place the splintered schools of psychoanalysis.

So even just the basic step of trying to ground reputations in some realistic trial other than charisma and good stories has tremendous positive effects on a whole field of endeavor.

But that doesn't yet get you a science.  A science requires that you be able to test 100 applications of method A against 100 applications of method B and run statistics on the results.  Experiments have to be replicable and replicated.  This requires standard measurements that can be run on students who've been taught using randomly-assigned alternative methods, not just realistic duels fought between masters using all of their accumulated techniques and strength.
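The 100-vs-100 comparison described above is an ordinary two-sample experiment. As a minimal sketch (invented scores, standard library only, and a permutation test chosen here just as one standard analysis), comparing students taught by method A against method B might look like:

```python
import random
import statistics

def permutation_test(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test: estimates the p-value of the observed
    difference in means under the null that A and B are interchangeable."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # relabel students at random
        diff = abs(statistics.mean(pooled[:len(a)]) - statistics.mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / n_iter

# Invented rationality-test scores for students under two teaching methods.
method_a = [72, 85, 78, 90, 81, 77, 88, 83]
method_b = [70, 74, 69, 80, 73, 68, 76, 71]
print(permutation_test(method_a, method_b))  # small p: difference unlikely under the null
```

The point of the replicability requirement is exactly that this computation is cheap and anyone can rerun it on freshly randomized students; the seed is fixed only so the sketch itself is reproducible.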

continue reading »

Schools Proliferating Without Evidence

40 Eliezer_Yudkowsky 15 March 2009 06:43AM

Previously in series: Epistemic Viciousness

Robyn Dawes, author of one of the original papers from Judgment Under Uncertainty and of the book Rational Choice in an Uncertain World—one of the few who tries really hard to import the results to real life—is also the author of House of Cards: Psychology and Psychotherapy Built on Myth.

From House of Cards, chapter 1:

The ability of these professionals has been subjected to empirical scrutiny—for example, their effectiveness as therapists (Chapter 2), their insight about people (Chapter 3), and the relationship between how well they function and the amount of experience they have had in their field (Chapter 4).  Virtually all the research—and this book will reference more than three hundred empirical investigations and summaries of investigations—has found that these professionals' claims to superior intuitive insight, understanding, and skill as therapists are simply invalid...

Remember Rorschach ink-blot tests?  It's such an appealing argument: the patient looks at the ink-blot and says what he sees, the psychotherapist interprets their psychological state based on this.  There've been hundreds of experiments looking for some evidence that it actually works.  Since you're reading this, you can guess the answer is simply "No."  Yet the Rorschach is still in use.  It's just such a good story that psychotherapists can't bring themselves to believe the vast mounds of experimental evidence saying it doesn't work—

—which tells you what sort of field we're dealing with here.

And the experimental results on the field as a whole are commensurate.  Yes, patients who see psychotherapists have been known to get better faster than patients who simply do nothing.  But there is no statistically discernible difference between the many schools of psychotherapy.  There is no discernible gain from years of expertise.

And there's also no discernible difference between seeing a psychotherapist and spending the same amount of time talking to a randomly selected college professor from another field.  It's just talking to anyone that helps you get better, apparently.

In the entire absence of the slightest experimental evidence for their effectiveness, psychotherapists became licensed by states, their testimony accepted in court, their teaching schools accredited, and their bills paid by health insurance.

And there was also a huge proliferation of "schools", of traditions of practice, in psychotherapy; despite—or perhaps because of—the lack of any experiments showing that one school was better than another...

continue reading »

Epistemic Viciousness

55 Eliezer_Yudkowsky 13 March 2009 11:33PM

Previously in series: A Sense That More Is Possible

Someone deserves a large hattip for this, but I'm having trouble remembering who; my records don't seem to show any email or OB comment which told me of this 12-page essay, "Epistemic Viciousness in the Martial Arts" by Gillian Russell.  Maybe Anna Salamon?

      We all lined up in our ties and sensible shoes (this was England) and copied him—left, right, left, right—and afterwards he told us that if we practised in the air with sufficient devotion for three years, then we would be able to use our punches to kill a bull with one blow.
      I worshipped Mr Howard (though I would sooner have died than told him that) and so, as a skinny, eleven-year-old girl, I came to believe that if I practised, I would be able to kill a bull with one blow by the time I was fourteen.
      This essay is about epistemic viciousness in the martial arts, and this story illustrates just that. Though the word ‘viciousness’ normally suggests deliberate cruelty and violence, I will be using it here with the more old-fashioned meaning, possessing of vices.

It all generalizes amazingly.  To summarize some of the key observations for how epistemic viciousness arises:

  • The art, the dojo, and the sensei are seen as sacred.  "Having red toe-nails in the dojo is like going to church in a mini-skirt and halter-top...  The students of other martial arts are talked about like they are practicing the wrong religion."
  • If your teacher takes you aside and teaches you a special move and you practice it for 20 years, you have a large emotional investment in it, and you'll want to discard any incoming evidence against the move.
  • Incoming students don't have much choice: a martial art can't be learned from a book, so they have to trust the teacher.
  • Deference to famous historical masters.  "Runners think that the contemporary staff of Runner's World know more about running than all the ancient Greeks put together.  And it's not just running, or other physical activities, where history is kept in its place; the same is true in any well-developed area of study.  It is not considered disrespectful for a physicist to say that Isaac Newton's theories are false..."  (Sound familiar?)
  • "We martial artists struggle with a kind of poverty—data-poverty—which makes our beliefs hard to test... Unless you're unfortunate enough to be fighting a hand-to-hand war you cannot check to see how much force and exactly which angle a neck-break requires..."
continue reading »

A Sense That More Is Possible

61 Eliezer_Yudkowsky 13 March 2009 01:15AM

Previously in series: Raising the Sanity Waterline
Followup to: Teaching the Unteachable

To teach people about a topic you've labeled "rationality", it helps for them to be interested in "rationality".  (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)

And when people explain why they're not interested in rationality, one of the most commonly proffered reasons is something like:  "Oh, I've known a couple of rational people and they didn't seem any happier."

Who are they thinking of?  Probably an Objectivist or some such.  Maybe someone they know who's an ordinary scientist.  Or an ordinary atheist.

That's really not a whole lot of rationality, as I have previously said.

Even if you limit yourself to people who can derive Bayes's Theorem—which is going to eliminate, what, 98% of the above personnel?—that's still not a whole lot of rationality.  I mean, it's a pretty basic theorem.
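For reference, the derivation being used as a filter here really is short (spelled out below only to make the "pretty basic theorem" remark concrete): apply the definition of conditional probability symmetrically and divide.

```latex
P(A \mid B)\,P(B) \;=\; P(A \cap B) \;=\; P(B \mid A)\,P(A)
\quad\Longrightarrow\quad
P(A \mid B) \;=\; \frac{P(B \mid A)\,P(A)}{P(B)},
\qquad
P(B) \;=\; P(B \mid A)\,P(A) + P(B \mid \neg A)\,P(\neg A).
```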

Since the beginning I've had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.

But when I look around me in the real world, I don't see that.  Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, Tooby & Cosmides.  A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it's not cool to care that much.  I can see that they've found a rhythm, a unity that begins to pervade their arguments—

Yet even that... isn't really a whole lot of rationality either.

continue reading »

Test Your Rationality

39 RobinHanson 01 March 2009 01:21PM

So you think you want to be rational, to believe what is true even when sirens tempt you?  Great, get to work; there's lots you can do.  Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate?  Hold on; this is hard.

Humans nearly universally find excuses to believe that they are more correct than others, at least on the important things.  They point to others' incredible beliefs, to biases afflicting others, and to estimation tasks where they are especially skilled.  But they forget that most everyone can point to such things.

But shouldn't you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like?  Well this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes. 

But what if your mind is far from rational?  What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies? 

It seems to me that if you are serious about actually being rational, rather than just believing in your rationality or joining a group that thinks itself rational, you should try hard and often to test your rationality.  But how can you do that? 

To test the rationality of your beliefs, you could sometimes declare beliefs, and later score those beliefs via tests where high scoring beliefs tend to be more rational.  Better tests are those where scores are more tightly and reliably correlated with rationality.  So, what are good rationality tests?
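Hanson leaves "score those beliefs" open; one standard candidate (my example, not named in the post) is a proper scoring rule such as the Brier score, which rewards both accuracy and honest calibration when applied to declared probabilities:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes.

    Lower is better.  It is a proper scoring rule: your expected score is
    optimized by reporting your true probabilities, not by posturing.
    """
    assert len(forecasts) == len(outcomes)
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Declared beliefs (probabilities) versus how things actually turned out.
print(brier_score([0.9, 0.8, 0.3], [1, 1, 0]))  # confident and right: low score
print(brier_score([0.5, 0.5, 0.5], [1, 1, 0]))  # hedging everything scores exactly 0.25
print(brier_score([0.9, 0.8, 0.3], [0, 0, 1]))  # confident and wrong: heavily penalized
```

A better rationality test in Hanson's sense is one where this kind of score correlates tightly with rationality, rather than with domain luck or question selection.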

The Martial Art of Rationality

41 Eliezer_Yudkowsky 22 November 2006 08:00PM

I often use the metaphor that rationality is the martial art of mind.  You don't need huge, bulging muscles to learn martial arts - there's a tendency toward more athletic people being more likely to learn martial arts, but that may be a matter of enjoyment as much as anything else.  Some human beings are faster or stronger than others; but a martial art does not train the variance between humans.  A martial art just trains your muscles - if you have the human-universal complex machinery of a hand, with tendons and muscles in the appropriate places, then you can learn to make a fist.

How can you train a variance?  What does it mean to train +2 standard deviations of muscle?  It's equally unclear what it would mean to train an IQ of 132.

But if you have a brain, with cortical and subcortical areas in the appropriate places, you might be able to learn to use it properly.  If you're a fast learner, you might learn faster - but the art of rationality isn't about that; it's about training brain machinery we all have in common.

continue reading »
