The wiki entry for Epistemic Hygiene defines the term as:

Epistemic hygiene consists of practices meant to allow accurate beliefs to spread within a community and keep less accurate or biased beliefs contained. The practices are meant to serve an analogous purpose to normal hygiene and sanitation in containing disease.

The term was coined in Steve Rayhawk's and Anna Salamon's post "The ethic of hand-washing and community epistemic practice", and there have been several mentions of it around the site. But what, exactly, might good epistemic hygiene norms be?

I'm especially interested in this question in the context of meetup groups. In Less Wrong NYC: Case Study of a Successful Rationalist Chapter, Cosmos writes:

Epistemic privilege and meme-sharing: The most powerful aspect of a group of rationalists is that you have an entire class of people whose reasoning you trust. Division of labor arises naturally as each member has different interests, they all pursue a variety of skills and areas of expertise, which they can then bring back to the group. Even the lowest-level rationalists in the group can rapidly upgrade themselves by adopting winning heuristics from other group members. I cannot overstate the power of epistemic privilege. We have rapidly spread knowledge about metabolism, exercise, neuroscience, meditation, hypnosis, several systems of therapy... and don't forget the Dark Arts.

This would imply that one way that a meetup group (or for that matter, any social group) could get really successful would be by adopting great epistemic hygiene norms. But unless everyone present has read most of the Sequences - which gets increasingly unlikely the more the group grows - even most LW meetups probably won't just spontaneously enforce such norms. Wouldn't it be great if there were an existing list of such norms, that each group could look through and then decide which ones they'd like to adopt?

Here are some possible epistemic hygiene norms that I could find / come up with:

  • Be honest about your evidence and about the actual causes of your beliefs. Valuable for distinguishing accurate from mistaken beliefs. (The ethic of hand-washing)
  • Focus attention on the evidence and on the actual causes of both your own beliefs and the beliefs of the people you're talking with. (The ethic of hand-washing)
  • Only pass on ideas that you've verified yourself. (Problematic, since any given individual can only verify a tiny fraction of all of their beliefs.) (The ethic of hand-washing)
  • Explicitly separate “individual impressions” (impressions based only on evidence you've verified yourself) from “beliefs” (which include evidence from others’ impressions). (Naming beliefs)
  • Give status for reasoned opinion-change in the face of good evidence, rather than considering the "loser" of a debate low-status. (The ethic of hand-washing)
  • Leave the other person a line of retreat in all directions, avoiding pressures that might wedge them towards either your ideas or their own. (The ethic of hand-washing)
  • Encourage people to present the strongest cases they can against their own ideas. (comment, Carl Shulman)
  • Encourage "Why do I think that" monologues. You elaborate on a thing you currently believe to be true by specifying the reasons you believe it, the reasons you believe the reasons, etc and trying to dig out the whole epistemological structure. (comment, JulianMorrison)
  • Find ways to track the sincerity and accuracy of what people have said in the past, and make such information widely available. (comment, mark_spottswood)
  • Don't trust evidence you don't remember the source of, even if you remember reading the primary source yourself. (Hygienic Anecdotes)
  • Be upfront about when you don't remember the source of your claim. (comment, PhilGoetz)
  • When asked a question, state the facts that led you to your conclusion, not the conclusion itself. (Just the facts, ma'am!)
  • If you basically agree with someone's argument, but want to point out a minor problem, start off your response with the words "I agree with your conclusion". (Support That Sounds Like Dissent)
  • When agreeing with someone's claim, distinguish between "I have independent evidence that should add to our confidence in the speaker's conclusion" and "based on the evidence others have presented, I now agree, but don't take my agreement as further reason to update". (comment, AnnaSalamon)
  • Don't judge people for having bad ideas, only judge the ideas. (Your Rationality is My Business)
  • Be careful when recommending books on subjects where you are not an expert, particularly when they're on highly controversial topics and happen to support your own conclusions. (comment, Vladimir_M)
  • Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing. (Is That Your True Rejection? at CATO Unbound)
  • In discussions, presume the kinds of conditions that are the least convenient for your argument. (The Least Convenient Possible World)
  • If people are trying to figure out the truth, don't mistake their opinions about facts for statements of values. (Levels of communication)

And of course, there's a long list of norms that basically amount to "don't be guilty of bias X", e.g. "avoid unnecessarily detailed stories about the future", "avoid fake explanations", "don't treat arguments as soldiers", etc.

Which of these norms do you consider the most valuable? Which seem questionable? Do you have any norms of your own to propose?


Salamon and Rayhawk's "Share likelihood ratios, not posterior beliefs" suggests a few rules for updating on others' beliefs that may be relevant; most importantly (paraphrased):

  • Distinguish evidence from priors — in conversation, when saying things like "I think Jack is smart, but not extremely smart", distinguish "I don't have evidence that Jack is extremely smart" from "I do have evidence that Jack is not extremely smart". This makes it possible for the recipient of the information to combine evidence without double-counting agreed-upon priors, e.g. the base rate of extremely smart people in the world.

This is something which I think overlaps a number of your suggestions, but doesn't exactly match any of them.
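To make the double-counting failure concrete, here is a minimal sketch with made-up numbers (the 1% base rate and the 5:1 likelihood ratios are purely illustrative, not from the post): if two people each report a posterior that already folds in a shared prior, naively multiplying those reports together counts the prior twice, whereas combining likelihood ratios against the prior once does not.

```python
def to_prob(odds):
    """Convert odds (p / (1 - p)) back to a probability."""
    return odds / (1 + odds)

# Assumed shared prior: 1% base rate of "extremely smart" people.
prior_odds = 0.01 / 0.99

# Two observers each gather independent evidence about Jack,
# each worth a 5:1 likelihood ratio in favour of "extremely smart".
lr_alice = 5.0
lr_bob = 5.0

# Correct: apply the shared prior once, then multiply in both likelihood ratios.
combined_odds = prior_odds * lr_alice * lr_bob

# Incorrect: each observer reports posterior odds (prior already included),
# and the listener multiplies the posteriors together as if they were raw
# evidence, thereby counting the 1% prior twice.
double_counted_odds = (prior_odds * lr_alice) * (prior_odds * lr_bob)

print(f"Sharing likelihood ratios: P = {to_prob(combined_odds):.3f}")      # ~0.20
print(f"Sharing posteriors:        P = {to_prob(double_counted_odds):.4f}")  # ~0.0025
```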

If you're making a generalization, check it for scope. How much knowledge of what you're generalizing about do you actually have? Could conditions have changed? How representative are the examples you're drawing your conclusions from?

I agree. I've noticed an especially strong tendency to premature generalization (including in myself) in response to people asking for advice. Tell people what your experiences were, not (just) the general conclusions you drew from them.

Another angle on scope: If there's a difference, how large is it?

This relates to a frequent annoyance-- articles about differences between men and women which don't mention the size of the difference or how much overlap there might be.

Thanks for the compilation. This point was especially useful:

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing.

I think beginning rationalists should first look to make sure they're willing to change their mind on a subject, period.

(It's worth noting that the "moral wrongdoing is like infectious disease" metaphor is a surprisingly deep aspect of human social psychology, even to the extent that hand-washing affects levels of risk aversion. The pervasiveness of the metaphor is especially clear in the Christians' emphasis on baptism, holy water, purgatorial fires, et cetera. I take about five baths a day, and I suspect it has at least a little to do with constantly feeling guilty. Understanding the basis of this connection might help us manipulate it: "quietly going along with something even if you don't actually agree with it is dirty".)

  • Don't be afraid to ask stupid questions.

I hate, hate social situations where I have to take a big status hit to ask a simple clarifying question; it would greatly promote understanding to have a social norm that dumb-sounding questions are considered acceptable.

Train your social environment to expect trivial clarifying questions from you (and so to stop making inferences of any significance from individual instances of this behavior), by asking them frequently.

I try to keep in mind something I remember reading about the eminent mathematician David Hilbert. Supposedly he would constantly and shamelessly ask questions during math presentations, and once even asked what a Hilbert space was.

I usually use “Sorry for the stupid question, but X is [answer I think most likely], isn't it?”. I will learn the same thing as if I asked “What is X?”, but I won't sound like an idiot.

Don't punish yourself or others for taking time to think before responding.

If you were a kid in school who was rewarded for being quick with the answers, taking time and letting others take time is a habit that takes some training.

To express it at more length:

Don't press yourself or others to have immediate perfect articulations of what they know.

If people are only respected if they speak immediately, the only thoughts that will be spoken are cached thoughts.

Be comfortable saying things like "I think I shouldn't agree or disagree right away," or "I'll need to take some time to think through what my real opinion is on that point," or "I didn't quite say what I really meant earlier." Support others who say those things.

Believe that the value of a person's thoughts does not always track their speed of response.

Create a favorable environment for both you and others to figure out what they really think, notice, and remember, as opposed to what they're able to articulate in the heat of a single conversation.

This is a special case of "leave a line of retreat."

No, it is not. Leaving a line of retreat is about imagining possible worlds where one of your beliefs is false.

Oops. Yah. It's related, in the sense of "lower the pressure to insist that you're right", but it's not the same. Fixed.

Just a data point, but I felt a need to write it:

I found here a link to an interesting article, "Multitaskers bad at multitasking". According to it, people who 'routinely consumed multiple media such as internet, television and mobile phones' were worse at three attention-related experiments described in the article. However, these people had self-reported as being better at multitasking.

Now here are some selected comments from below the article: (three comments from three different people)

Women make better multitaskers then most men, only because their lives depend on multitasking at home and at work...it seems the above study was just computer based, which is probably why the multitaskers didn't do so well...they were thinking of other things that they should've or could've been doing.

This is surprising. I'm fantastic at it. Perhaps the pool didn't actually include those who are actually gifted enough to multitask.

I am a multitasker. I have a conference call on mute as I type this. The advantage? Ignoring the detail allows me to focus on the big picture, ensuring my team are always realigning with the strategic objectives. I have a very competent group of focussed people. They're great with detailed tasks that require focus. They wouldn't let me near those activities as we all know I'd make mistakes. As a team we understand each other's strengths and weaknesses and it works well.

There is something valuable in these comments. First, some aspects of the work (such as focusing on the big picture) in a given context may be more valuable than what was tested in the experiment. Second, despite multitaskers getting less utility per task, they may get more total utility, which would make multitasking a good strategy. Third, people are different, so even if the results of this experiment are relevant for the majority of people, there may be some exceptions for whom multitasking is a way to get high quality work done.

However, all three comments have somehow missed the essence of the article. If the article says "it was experimentally shown that people who believe themselves to be good at multitasking are statistically worse at it", you can't respond just with "that's nonsense, because I believe I am good at multitasking my work", unless you include some convincing evidence for why we should believe that your belief is more reliable than the similar beliefs of the other people who were experimentally proven wrong. (How about considering the possibility that you might be wrong, both about "being fantastic" and about "focusing on the big picture"?) And that evidence should be something better than a cached thought (women being better multitaskers) or an ad-hoc excuse (a computer-based study can't measure real multitasking).

I think we are already doing well at this on LW. I did not notice the gradual change in my epistemic hygiene expectations until I read a text containing basic mistakes, which I probably would not have recognized as mistakes a few months ago. As an analogy, washing one's hands may feel like a boring ritual, until one sees other people eating with dirty hands and suddenly feels disgusted.

Yeah. I have just been metaphorically bloodying my forehead trying to explain to someone that if A says "X", B says "oh, you cherry-picked that example, so it's a bad argument", A says "no I didn't, it's from Y" and B replies "oh, you just picked Y at random, so it's a bad argument", then B has done something stupid no matter what the argument is about - and this has been impossible to get across. It felt very like arguing with a creationist. (It was similar in unlikelihood of minds being changed - one person is British, one is American and the argument was about gun control - so it was pretty much epistemic sewer diving.)

tl;dr LessWrong has made me significantly less tolerant of run-of-the-mill weapons-grade stupidity. (That should be an oxymoron, but somehow doesn't seem to be one in practice.)

Another one that I feel is important:

Identify the testable predictions that go along with your belief.

This not only wards off "just so stories" and confirmation bias; it also shows people how to mine their own knowledge to add weight for or against the belief you're offering. If their expertise overlaps yours at all, they probably have some knowledge to offer; if you point out the testable predictions your belief implies, they can deploy that knowledge to make both you and themselves smarter.


TimS:

In terms of debate, the following have always been the most helpful to me in ensuring that I'm not believing things only because I want them to be true. I've ordered them from easiest to hardest to implement.

In discussions, presume the kinds of conditions that are the least convenient for your argument.

Even if you've been fairly mind-killed, it's pretty easy to notice when someone else raises a difficulty you haven't thought of.

Encourage "Why do I think that" monologues. You elaborate on a thing you currently believe to be true by specifying the reasons you believe it, the reasons you believe the reasons, etc and trying to dig out the whole epistemological structure

This method also has the benefit of forcing you to focus on improving the truth-quality of beliefs rather than winning the argument.

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing.

This is incredibly hard to implement, because if you didn't have the core belief, the supporting beliefs would never seem important.

The term was coined in Steve Rayhawk's and Anna Salamon's post

No, it's a lot older.

ETA: at least, I'm pretty sure I've seen it floating around before then, possibly as "epistemological hygiene", though it doesn't seem to be used in the SL4 archives.

Crux:

Interesting. I've used the term "epistemic hygiene" plenty of times, but it looks like I've been using it incorrectly. I thought an "epistemic hygiene technique" was simply a strategy for avoiding getting epistemically messed up in some way, but it seems like what it really deals with is sound belief propagation throughout a community.

As for the question of what epistemic hygiene norms there should be, though, I wonder whether there's anything more to those norms beyond what I used to think the term meant. In other words, does being epistemically sanitary with what you write involve anything beyond simply remaining epistemically rational in what you're communicating?

It seems like an interesting question--how to keep the community memeplex sanitary, and avoid spilling harmful memes or letting destructive beliefs propagate--but I'm not sure there's anything more to this than what we already seem to spend most of our time talking about: how to maintain your own epistemic rationality, and how to communicate efficiently and effectively by properly managing the inferential distance, etc.

It certainly seems like it may be a fruitful line of inquiry, and I'll think more about it later, but for now I'm just trying to bring up the possibility that this question is of no special importance, and there's nothing to community epistemic practice besides basic individual epistemic practice coupled with the ability to communicate effectively.

Only pass on ideas that you've verified yourself. (Problematic, since any given individual can only verify a tiny fraction of all of their beliefs.)

That really depends on how literally you take this, doesn't it? Taken totally literally, you don't pass on an unverified idea - but you can pass on a verifiable token referring to the idea ("I heard from so-and-so that...").


I've been thinking about how we might improve epistemic hygiene in the EA community (particularly on the forum); this post has been useful, and I'm keen to find more content in this space.


Thanks Kaj, this is my favourite article on LessWrong at the moment. I have plenty to learn.

Most valuable to me at the moment:

Before you stake your argument on a point, ask yourself in advance what you would say if that point were decisively refuted. If you wouldn't actually change your mind, search for a point that you find more convincing. (Is That Your True Rejection? at CATO Unbound)

In discussions, presume the kinds of conditions that are the least convenient for your argument. (The Least Convenient Possible World)

If people are trying to figure out the truth, don't mistake their opinions about facts for statements of values. (Levels of communication)

Only pass on ideas that you've verified yourself. (Problematic, since any given individual can only verify a tiny fraction of all of their beliefs.) (The ethic of hand-washing)

Explicitly separate “individual impressions” (impressions based only on evidence you've verified yourself) from “beliefs” (which include evidence from others’ impressions). (Naming beliefs)

Leave the other person a line of retreat in all directions, avoiding pressures that might wedge them towards either your ideas or their own. (The ethic of hand-washing)

Encourage people to present the strongest cases they can against their own ideas. (comment, Carl Shulman)

The link to “Your Rationality is My Business” is broken (by an extra lesswrong.com/ in the URL).