As I've been reading through various articles and their comments on Less Wrong, I've noticed a theme that has appeared repeatedly: a frustration that we are not seeing more practical benefits from studying rationality. For example, Eliezer writes in A Sense that More Is Possible,

Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people...

Yvain writes in Extreme Rationality: It's Not That Great,

...I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines, I can't think of any.

patrissimo wrote in a comment on another article,

Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization.

These writers have also offered some suggestions for improving the situation. Eliezer writes,

Of this [question] there are several answers; but one of them, surely, is that they have received less systematic training of rationality in a less systematic context than a first-dan black belt gets in hitting people.

patrissimo describes what he thinks an effective rationality practice would look like.

  1. It is a group of people who gather in person to train specific skills.
  2. While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
  3. Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure). As a result, it is driven by the needs of the learners [emphasis added].
  4. You have to sweat, but the result is you get stronger.
  5. You improve by learning from those better than you, competing with those at your level, and teaching those below you.
  6. It is run by a professional, or at least someone getting paid [emphasis added] for their hobby. The practicants receive personal benefit from their practice, in particular from the value-added of the coach, enough to pay for talented coaches.

Dan Nuffer and I have decided that it's time to stop talking and start doing. We are in the very early stages of creating a business to help people improve their lives by training them in instrumental rationality. We've done some preliminary market research to get an idea of where the opportunities might lie. In fact, this venture got started when, on a whim, I ran a poll on ask500people.com asking,

Would you pay $75 for an interactive online course teaching effective decision-making skills?

I got 299 responses in total. These are the numbers that responded with "likely" or "very likely":

  • 23.4% (70 of 299) overall.
  • 49% (49 of 100) of the respondents from India.
  • 10.6% (21 of 199) of the respondents not from India.
  • 9.0% (8 of 89) of the respondents from the U.S.
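
For readers who want to check these figures, here is a minimal sketch in Python that recomputes the breakdown from the per-country counts quoted above (nothing beyond the numbers in the list is assumed):

```python
# Recompute the poll breakdown from the counts quoted above.
segments = {
    "India":     (49, 100),   # ("likely"/"very likely" responses, total respondents)
    "not India": (21, 199),
    "U.S.":      (8, 89),     # a subset of "not India"
}

overall_yes = segments["India"][0] + segments["not India"][0]
overall_n = segments["India"][1] + segments["not India"][1]
print(f"overall: {overall_yes}/{overall_n} = {overall_yes / overall_n:.1%}")  # 70/299 = 23.4%

for name, (yes, n) in segments.items():
    print(f"{name}: {yes}/{n} = {yes / n:.1%}")  # 49.0%, 10.6%, 9.0%
```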

These numbers were much higher than I expected, especially the numbers from India, which still puzzle me. Googling around a bit, though, I found an instructor-led online decision-making course for $130, and a one-day decision-making workshop offered in the UK for £200 (over $350)... and the Google keyword tool returns a large number of search terms (800) related to "decision-making", many of them with a high number of monthly searches.

So it appears that there may be a market for training in effective decision-making -- something that could be the first step towards a more comprehensive training program in instrumental rationality. Some obvious market segments to consider are business decision makers, small business owners, and intelligent people of an analytical bent (e.g., the kind of people who find Less Wrong interesting). An important subset of this last group is INTJ personality types; I don't know if there is an effective way to find and market to specific Myers-Briggs personality types, but I'm looking into it.

"Life coaching" is a proven business, and its growing popularity suggests the potential for a "decision coaching" service; in fact, helping people with big decisions is one of the things a life coach does. One life coach of 12 years described a typical client as age 35 to 55, who is "at a crossroads, must make a decision and is sick of choosing out of safety and fear." Life coaches working with individuals typically charge around $100 to $300 per hour. As far as I can tell, training in decision analysis / instrumental rationality is not commonly found among life coaches. Surely we can do better.

Can we do effective training online? patrissimo thinks that gathering in person is necessary, but I'm not so sure. His evidence is that "all the people who have replied to me so far saying they get useful rationality practice out of the LW community said the growth came through attending local meetups." To me this is weak evidence -- it seems to say more about the effectiveness of local meetups vs. just reading about rationality. In any event, it's worth testing whether online training can work, since

  • not everyone can go to meetups,
  • it should be easier to scale up, and
  • not to put too fine a point on it, but online training is probably more profitable.

To conclude, one of the things an entrepreneur needs to do is "get out of the building" and talk to members of the target market. We're interested in hearing what you think. What ideas do you think would be most effective in training for instrumental rationality, and why? What would you personally want from a rationality training program? What kinds of products / services related to rationality training would you be interested in buying?

39 comments

"Life coaching" is a proven business, and its growing popularity suggests the potential for a "decision coaching" service; in fact, helping people with big decisions is one of the things a life coach does.

I was going to call bullshit on this statement, but then I looked it up. Apparently the life/business coaching industry makes $2.4 billion/yr. For comparison, the porn industry makes $2.6-3.9 billion/yr.

Yesterday, I started playing around with the idea of doing rationality business consulting.

My basic idea is as follows. Use my contacts to get in touch with a number of suitable companies and offer to give them a free presentation on rationality ideas. If they agree, give a lecture maybe an hour long, packed full of interesting and obviously valuable concepts thrown at the audience at a rapid pace. At the end, thank them for their time, take any questions they might have, and mention that I also have an extended lecture series prepared that covers these topics in more detail, and many others besides. Offer to give those lectures in exchange for a sizable sum, or, if they're not interested, ask whether they could at least recommend me to somebody who might be.

If I just have a good enough presentation, act confident enough and price myself appropriately, it seems to me like this should result in sales pretty quickly. And presentations are one of the few areas where I don't have issues with confidence - if I've practiced a lecture enough and know the content, I get more excited than nervous.

I've scribbled down some random ideas on content to be included in my opening lecture: this is still a very rough draft, with ideas chosen more or less at random as they came to mind, but feedback is more than welcome.

How to act and decide better

How to communicate better

How to improve company atmosphere: status games and offense

  • Humans constantly play various status games with each other, often more or less unconsciously. Becoming more aware of the status games helps take them less seriously. Impro has a lot of status game exercises that can be used to make people more aware of them.
  • A major reason for people taking offense is that they think their status is being lowered. If everyone understands this, it can help defuse conflicts as it becomes easier to make explicit the reasons for being offended. "I experienced X as offensive because I felt that it implied I wasn't good at Y, and that I therefore wasn't of much value." "Oh, I didn't mean that."
  • The ABC of Cognitive Behavioral Therapy: Activating Event, Belief, Consequence. A boss asks, "Have you done the task I gave you? The deadline is approaching"; the subordinate takes this to mean that the boss doesn't trust his ability to get things done without micromanagement, and gets depressed. CBT attacks the "Belief" stage and encourages one to ask whether the beliefs are reasonable.
  • Verbal self-defense: ways to deflect hostile statements so that they don't escalate into conflict.
  • Various cultural differences that may lead to conflict, e.g. askers versus guessers.

Other

Other, completely unorganized

(This section was written when I just started coming up with ideas one after another and stopped trying to sort them or make them very comprehensible to outsiders, so don't worry if you don't understand all of the points.)

  • 59 seconds
  • Robbins: coloring memories, other things
  • Radical Honesty? Perhaps something a bit less radical...
  • commitment effects
    • beware identity
    • Goertzel's quote on identity
    • self-signaling
    • success spirals
    • Mindset
  • PJ Eby on Akrasia, lukeprog on procrastination, procrastination book
  • sick relationships
  • life hacker?
  • “think of the opposite”; Bayes' theorem in everyday life
  • what intelligence tests miss
  • what data generated that thought
  • that one paper on an ideal working environment
  • flow
  • psychology of happiness and well-being; the 3-1 rule
  • pjeby: instant motivation
  • prediction markets?
  • humans are not automatically strategic
  • relating to the curse of identity: analyzing the concrete consequences of decisions, the effect of social norms on decision-making (school example), charity examples, scope neglect, political ignorance
    • expected utility
  • value of information? decision analysis sequence?
  • bureaucracies and avoiding responsibility
  • status quo bias; groupthink; review meetings; devil's advocates
  • fundamental attribution error & benefit of doubt
  • taboo your words
  • the content highlighted in the comments for the various "what's been the most useful LW insight to you" posts

You'll notice that I also have a lot of content that's not strictly rationality-related. That's intentional: I figure that having useful insights on a wide variety of topics will seem more impressive and increase the likelihood that at least something in there does catch one's interest. (Also, I wrote in more detail about the non-rationality stuff because for the rationality stuff I could just link to the LW posts.)

My guess is that thematic unity is important; it's good to have "insights into a wide variety of [business problems]", but you want those insights to seem to come from a unified source, so that the person ends up with some clear, vivid association with you ("Kaj: rational decision-making") instead of associating you with a random hodge-podge of key words.

Do you have friends who are in your target audience, so that you can give them two-sentence descriptions of each of your potential themes and see which ones they're excited about?

Do you know anyone who does business consulting for money, whom you could ask questions of? I don't know how often people make money from lectures, vs. from the sort of "consulting" where they listen to folks' particular problems and try to help solve them.

It sounds worth pursuing, anyhow -- the idea of "rationality business consulting" has been batted around for a while, and it would be great if someone actually tries it.

Good points, thanks. I believe I'll start with the "traditional" rationality stuff at first, since it's easier to give the impression of everything coming from a unified source that way. Also, I know it better than the other stuff.

I spoke to some friends who are familiar with this kind of thing, and one offered to give me feedback (both on the content, and for my presentation style) as well as possibly put me in contact with people who might be interested. I think I'll re-read What Intelligence Tests Miss as well as skim through a couple of other books on rationality that I haven't yet read, and then see what kinds of presentation ideas strike me as good.

They did mention that what I'm talking about is really more lecturing than consulting, but they say there's good money to be had that way too - presuming that I have the patience to build up a favorable reputation and references.

You should take a look at what existing executive coaches, business coaches, business consultants, and management training programs do. There are some of these that actually apply the methods of decision analysis to their clients' problems, or teach their clients how to do so. Here are some that I have encountered:

  • Hubbard Decision Research. Training, consulting, and tools for difficult measurement, forecasting and investment decisions facing organizations, using Applied Information Economics (basically, decision analysis with an emphasis on expected value of information and applying statistical methods to indirectly measure things that are not obviously measurable; a toy value-of-information calculation is sketched after this list.)

  • HeadScratchers. Critical thinking workshops and coaching for problem solving, decision making and creativity.

  • Baker Street Publishing. They sell various publications for decision coaches, management consultants, and strategic planners.

  • Solysis. Consulting for organizations, decision coaching for executives.

  • Executive Decision Making (Professional Development course from Cornell)

  • Online organizational decision making course from Van Thinking

  • Strategic Decision Making, upcoming 5-day workshop from the London School of Economics and Political Science.
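
Since several of the offerings above revolve around decision analysis and the expected value of information, here is the toy value-of-information sketch mentioned above. The product-launch scenario and every number in it are invented purely for illustration:

```python
# Toy expected-value-of-perfect-information (EVPI) calculation, the kind of
# quantity decision-analysis practices like those above work with.
p_high, p_low = 0.6, 0.4      # prior beliefs about demand for a new product
payoff = {                    # profit of each action in each state of the world
    ("launch", "high"): 100,
    ("launch", "low"): -50,
    ("hold", "high"): 0,
    ("hold", "low"): 0,
}

def expected_value(action):
    return p_high * payoff[(action, "high")] + p_low * payoff[(action, "low")]

best_without_info = max(expected_value("launch"), expected_value("hold"))  # 40
best_with_info = (
    p_high * max(payoff[("launch", "high")], payoff[("hold", "high")])
    + p_low * max(payoff[("launch", "low")], payoff[("hold", "low")])
)  # 60
print("EVPI =", best_with_info - best_without_info)  # 20
```

In this toy example a perfect demand forecast is worth at most 20, so paying more than that for market research would be a mistake -- roughly the kind of reasoning such services are built around.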

Thanks, I'll look at those.

I think I may have a valuable point or two to contribute because of some of my life experience, e.g.:

  • I had a business as a Life Coach (in California in the 90's).

  • I used to be a fairly avid consumer of various flavors of coaching, motivational programs, self help etc. (I still am, I've just gotten MUCH more discriminating -- that's why I'm here.)

My primary reactions to your post are:

  • There is almost certainly a market for the service you describe. Your big problem, especially at first, is going to be sales and marketing. I'm sure this much is obvious, but you probably ought to ask yourself if you have an appetite for doing full-time sales and marketing, because that is your future for the next 5+ years if you start this business and want it to be successful. This leads me to the next point:

  • Be careful what you wish for. I often think about going back into some kind of coaching business, but when I do, I remember what it was like, and that gives me pause. I didn't like having to constantly market myself as a coach. There were a number of things that felt unsavory about it, including the fact that all my friends were now prospective clients. Surprisingly, I also really dreaded my coaching calls, even though there was frequently a nice feeling I had helped someone AFTER the call. My point is that it's hard to predict whether you are going to enjoy being a practicing coach or not, and I judge that it probably takes a very specific kind of personality type -- an aggressively extroverted sales-oriented type -- to really enjoy that business. This should be an important element of your consideration IMO.

  • Another judgment I have is that coaching is hard, and it's hard in subtle ways. People are not very amenable to change, even if they THINK they are amenable. The behaviors that would make an actual difference to our lives are not as accessible to conscious tinkering as we expect them to be. As you contemplate starting this business, you may also want to ponder if you will be frustrated when you observe people not changing as much as you'd like them to change, in response to your coaching. Letting go of the results is a subtle and important skill IMO.

All of this is not meant to discourage you -- I think you have an interesting idea for a business, and I encourage you to pursue it IF none of the above puts you off. Just pay close attention and try to determine how much fun, or not, this is actually going to be. I think you should only do it if it is actually fun. It was not fun for me.

Note that the comment syntax around here uses two carriage returns to indicate a paragraph break and two spaces to indicate a line break.

Glad you're trying this. You may even beat SI's spin-off "Rationality Org" out the door. Inevitably, we'll be trying different things, so I'm sure we'll have lessons to share with each other about what does and doesn't work.

it seems to say more about the effectiveness of local meetups vs. just reading about rationality.

Or the importance of interaction, especially live interaction, when it comes to changing your thought processes.

In particular, there's quite a bit of value to getting real-time feedback about the thought processes you're actually using at a given moment of time. By default, we tend to respond to descriptions of dysfunctional thought patterns by projecting them onto other people, and are not necessarily able to recognize the same pattern in ourselves... particularly if it's an intuitive, system 1 type of process.

Sometimes, I think that of the services I provide as a mind hacking instructor, the most critical one is providing real-time distinctions between a student's observations and confabulations. (Critical in the sense that without that basic distinction, the student will not be able to develop enough internal observation to be able to reliably apply any other change techniques on their own.)

This.

When I hang out with other rationalists in real time, they can actually respond to and point out particular thought processes. Sometimes it's pointing out or joking about a particular bias, other times it's just waggling their eyebrow and asking if I'm sure.

The real-time feedback helps learning so much it's ridiculous. The 1 week I spent in San Francisco probably saved me a few months of time, and there are specific useful insights that I don't think I would have ever come up with on my own.

The point about interaction is a good one. I've been thinking to some extent about that, but I hadn't considered the benefits of live interaction. That's definitely easier to get when you have people meeting face to face. We'll want to brainstorm some ways of getting an element of live interaction into an online setting.

It can still be done online: Google+ Hangouts, for example, allow live group interaction (up to 10 people) and seem to be fairly popular.

The experience still isn't as rich as in-person meeting, but it's a big step up from pre-recorded video.

Rationality is a lot like grammar: it's good to have for any job, everybody learns most of what they'll ever learn as kids, and you lose it when you drink. The main difference is that people don't think of it as something to be learned.

As money-making operations go, there are quite a few that teach rationality without calling it that. QA and troubleshooting are both huge IT sectors that are entirely about applied rationality, and if you can prove that your rationality program benefits those organizations, you will get work from IT managers.

That comparison is very impressive in its accuracy. It also clearly illustrates, for me, the massive critical failure of modern educational institutions to optimize for learning. Grammar is basically the very first thing a kid has to learn when entering school. Philosophy and rationality are barely even considered remotely approachable until far, far later, usually at a bachelor's level, if ever. The traditional view, even among philosophy and humanities teachers, seems to be that even basic rationality is "way too complex" for a child to learn.

Anecdotal evidence: My personal experiment, with my younger sister, seems to demonstrate that younger minds are even better at learning and internalizing rationality.

Anyway, to come back from that tangent, I'll reiterate that I like the comparison between rationality and grammar. I also really hope good formal training regimens for rationality can be developed in the very near future, whether by drawing on the best available grammar training, martial arts training, and QA/troubleshooting methods, or through other means.

There are two different things commonly called "grammar".

One of them is the structure of one's native language, which one needs in order to communicate. This is learned well before school. It is how you know to say "I want an apple, please" and not "Apple a please want I." Learning this sort of grammar is instinctive and unavoidable; you can't not learn it, if you're a little kid exposed to language users (spoken or signed) at all. And yes, it is complex — but we also have "designed-in" abilities to learn it.

The other thing called "grammar" is a collection of rules for a high-status register of one's native language, which one needs in order to sound "educated" rather than "ignorant". The meta-rule behind these rules is "Find ways to avoid speaking like a member of the underclass." It is how you know to say "I'd like to ask you a question" and not "I wanna aks you a question, yo." There is nothing about a high-status register of language that makes it any more capable of accurately representing the world than a low-status register. Truth can be represented in the speech of South Central L.A. just as well as it can be represented in a Harvard accent.

Yes. It illustrates, not proves.

Compare:

Grammar as a collection of rules is taught from a very young age, and is merely for signalling and more easily avoiding ambiguous statements in specific contexts. It is also very complex, and extremely hard to acquire and master - which is exactly why, I've been told, it's taught starting at a very young age and all throughout compulsory education, and often well into higher education institutions.

Rationality is taught once you've already learned, mostly the wrong way, all they think anyone needs to know, and it merely serves to improve almost every aspect of personal thought, beliefs and decisions, as well as improve all learning done afterwards. It is also very complex, and extremely hard to acquire and master - which is exactly why, I've been told, it's taught starting very late in any curriculum, and almost never before you've already made it past compulsory education, and often well into higher education institutions.

I think one of the worries I would have is - is this a cult, feel-good motivational-speaker sort of thing, a self-propagating meme predator attacking me?

Possibly an attempt to align your financial incentives with the students would help - for example, by offering some sort of money back if not satisfied after such-and-so period of time, or accepting student labor as payment (while still paying them at their current rate - that is, hiring them out at a higher, post-training rate).

I also think that because this is an information good (videos, software, text), you might not be able to charge what you initially might expect to be able to charge. Possibly generating content continuously, giving away a fraction of it (the oldest fraction) and charging for the cutting-edge updates could work.

If I knew precisely what I wanted from rationality training, I wouldn’t need to buy the knowledge, would I?

What would I want from it? Immediate results that aren't tied to an incredibly specific problem. I want procedures to define the problem clearly, to map the potential answer space, and then to organise that answer space with evidence. And I want to be able to go home and do at least one of those three things better than I could when I got there.

What do I think would be effective? Games. Why? It lets you test what you’re doing. You’re talking about making good decisions, and those decisions should make you win in a game that works with the rules that the heuristic concerns just as they would anywhere else. Even more helpfully, you can compare the results of your game to those of people who play differently.

Gambling? Probability and evidence. Role-playing? Planning. You’d probably want role-playing games based on the real world with a high rate of iteration that restart every time one side loses. Switch up the roles whenever one side starts to win consistently. Get people used to generating a load of options, losing them and chucking them away in favour of another set. Pick a general task - i.e. don’t invest too much in the background waffle or constrain the options too much....

Besides, games are fun and people learn better when they’re having fun than when they’re bored senseless.

Point out at the end how the same heuristics you're teaching to win in the games can be applied to make money, or to achieve some other concrete goal that people are likely to want. Which should be a fairly easy step, if you actually have something to sell.

What would I pay for it? At the moment, nothing. I've seen no evidence you're better at winning than me. Ideally you should have something you can show being used for a task, even if I can’t see precisely how it works, before you try and sell it to me. Otherwise I’ll be very suspicious that you’re just trying to sell a fashionable style of thought that gives people The Right Answer.

49% (99 of 100) of the respondents from India.

Typo?

Oops. Fixed now.

I wouldn't necessarily pay for an online course, partly because I'm "cheap" and more reluctant to pay for a lot of things than most people, and partly because my life is going rather smoothly right now and I'm not sure how much added benefit I could get out of a $75 course. That being said, I pay significantly more than $75 to learn martial arts, for the main reason that it's fun and thus makes my life better. So if I had good evidence that your rationality course was fun, that might sway me. And when I translate $75 into my current hourly wage, it does seem that the good (and fun) obtained from such a course is worth as much as a shortish day at work.

Anyway best of luck!

I'd be extremely skeptical about the results, so I'd probably not pay. I don't even see how exactly such a course could generate evidence of it working.

I think the most important thing about a rationality training service is operationalizing what is meant by rationality.

What exact services would the rationality training service provide? Would students have beliefs that match reality better? Be less prone to cognitive biases? Tend to make decisions that promote greater utility (for themselves or others)? How would you test this? Martial arts dojos tend to (putting it crudely) make their students better at hitting things than they were before; that's a lot easier to objectively measure than making students better at thinking than they were before.

I personally would not pay for a rationality training service unless it provided clear, non-anecdotal evidence that the average person received some benefit. I'd be particularly concerned about whether the service actually taught people to think more clearly, or simply inculcated them with the views of the people running the service.

Instrumental rationality is the focus we have in mind -- doing the things that most enhance your personal utility. Avoiding cognitive biases and having beliefs that match reality better are means to better instrumental rationality, but not the end. Some of the things that I think would fall under instrumental rationality would be better decisions (the ones important enough to merit some analyzing), determining what habits would be good to acquire or discard, and overcoming akrasia. I think we would have to start highly focused on one of these areas and a specific target market, and branch out over time.

As to how to test benefit of the training... I've put that on my list of questions to consider. I don't know the answer right now. But anything that has an observable effect of some sort will be measurable in some fashion.

BTW, just discussing things on LW makes me a more careful thinker. I originally wrote, "As to how to demonstrate benefit of the training...", and then I realized the bias in that word "demonstrate" -- it presupposes a particular conclusion in advance!

To a certain degree one could test instrumental rationality indirectly. Perhaps have them set a goal they haven't made much progress on (dieting? writing a novel? reducing existential risk?) and see if instrumental rationality training leads to more progress on the goal. Or give people happiness tests before and a year after completing the training (i.e. when enough time has passed that the hedonic treadmill has had time to work). Admittedly, these indirect methods are incredibly prone to confounding variables, but if averaged over a large enough sample size the trend should be clear.
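
To make that concrete, here is a minimal sketch of the before/after comparison. The data is simulated; the effect size, happiness scale, and sample size are all assumptions, and in practice you would also want a control group to separate the training effect from the hedonic treadmill and other confounds:

```python
# Sketch: compare self-reported happiness before training and one year after.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200                                    # hypothetical number of trainees
before = rng.normal(6.0, 1.5, n)           # 0-10 happiness scale (assumed)
after = before + rng.normal(0.3, 1.5, n)   # assumed average benefit of 0.3 points

result = stats.ttest_rel(after, before)    # paired t-test on the same people
print(f"mean change: {np.mean(after - before):+.2f} points, p = {result.pvalue:.3g}")
```

With these made-up numbers and 200 participants the change shows up clearly; with a smaller sample or a weaker effect it would not, which is one more reason to average over a large enough group, as suggested above.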

Something to think about if you have a goal of losing weight. How do you decide whether a goal makes sense?

Interesting article!

I presume that "I realized this goal was irrational and switched to a different goal that would better achieve my values" would also be a victory for instrumental rationality...

Please forgive me if this post doesn't belong here. I want to start a non-profit teaching rationality to a broad audience, starting off as one-day workshops on the weekends. Initially as a hobby, and then see where it goes from there. Can anyone give me practical advice on market research (nonprofits don't tend to post their annual reports), on advertising (how do I make this look good to the average guy who says, "There's nothing wrong with the way I think / I'm not stupid / My life is fine the way it is"?), and on how to present Bayes' Theorem to people who are allergic to anything resembling math? I really want to help do something about the state of intellect of humanity. Again, I apologize if this post doesn't belong here/is redundant.

Welcome! You might be interested in the welcome thread, first of all.

If you're interested in volunteering to speak about rationality topics, there may be easier ways to try it out than starting and advertising a new organization. (Plenty of smaller humanist/philosophy/discussion groups are happy when someone volunteers to present.)

Also, since you have a new account, can I ask whether you've picked up Less Wrong-style rationality fairly recently? It's worth noting that the earliest burst of enthusiasm isn't as reliable as it seems, and one should often scale down one's initial ambition accordingly. There's always opportunity to add more responsibilities later if it goes well. (This Kaj Sotala post touches on that phenomenon.)

Excellent advice, thank you very much. I had not considered that option for some reason.

Yes, I am relatively new to rationalism as a whole; I started seriously studying it about a year ago. Nonetheless, it's pretty easy to stay enthusiastic - I just have to read a page of failblog or try to talk to a friend...about anything...lol

Also, if you remain interested and specifically like LW's vision of rationality, you could try using the curriculum that's currently being developed here: http://lesswrong.com/lw/9hb/position_design_and_write_rationality_curriculum/

I think it's a great idea, and I would pay for a live course but probably not pay for online training.

Are you in touch with anyone involved in planning the SIAI spinoff "Rationality Org"? There might be some opportunities for cross-fertilization.

So, I presented a five-minute speech to my community college class on the base rate fallacy. Unfortunately, it went over their heads at about the same apparent distance as airplanes usually pass over my own. Here is the text of the speech, if anyone would be so kind as to offer criticism.

"I have come here to chew bubblegum and kick ass...and I'm all out of bubblegum". You might remember this line from the 80s sci fi movie They Live. In the movie, society is controlled by aliens who are using up Earth's natural resources. They're in politics, they're in law enforcement, and they look just like humans. Then the protagonist finds a box of sunglasses that let him see who's an alien and who isn't. For some reason they also make everything black and white. It's a crappy movie, but the point is, there are similar situations in real life, where we're trying to figure out who has a certain disease, or who the criminal is, or whatever the case may be. Let's consider a situation where we're trying to figure out who has breast cancer.

Let's pretend you have a friend who's going to the doctor to get a breast cancer screening. The doctor tells your friend that in her age group, 40-50, about 1% of women have breast cancer. He tells her that mammograms are 80% accurate, with a 10% false positive rate. After the screening, your friend is told that she tested positive. Now, how likely do you think it is that your friend has breast cancer? Congratulations, you just committed the base rate fallacy. Let's take another look at this question, which was originally published in the New England Journal of Medicine. Think about 1000 women just like your friend, visiting a clinic to get screened for breast cancer. We already know that on average 1% of those women will have cancer. So 10 women will actually have cancer. This means that 990 women will not have cancer. Since the test is 80% accurate, 8 of the women with cancer will test positive. But since there is a 10% false positive rate, 99 women without cancer will also test positive. So out of 107 women who test positive, only 8 actually have breast cancer. That's about a 7.5% probability that she has cancer. Would it change the way you view your test results, or the advice you might give to a loved one, if you knew how base rates affected the accuracy of the test?

When it comes to reducing uncertainty, one thing we can do is to get a second, and even a third test, independently conducted. Sometimes, however, that's simply not possible. Let's consider another situation. Imagine someone installed a camera in your city to catch terrorists. Just for the sake of the argument, let's say that we know that there are 100 terrorists among the 1 million inhabitants of your city. If the camera sees a terrorist, it will ring a bell 99% of the time. Its false positive rate is only 1%. Sounds great, right? Suppose the city's entire 1 million inhabitants pass before the camera. It will ring the bell 99 times for the 100 terrorists – and 9,999 times for the rest of the citizens. Now what? Do you stick upwards of 10,000 people in a holding cell and keep running them past the camera? Do you spend the time and money to criminally investigate 10,000 people? Is imprisoning 10,000 people to catch 100 acceptable in terms of human rights and security? What does this problem sound like?

If you said it sounds a lot like the TSA, you're right. Bruce Schneier, a security technologist, estimates we can expect to find about 1 terrorist among 8 million people passing through airports. This low base rate already makes it clear that the TSA will be relegated to screening false positives nearly all the time. It's hard to be certain. But, as one online journalist pointed out, the TSA's success rate – far from being 99% as in our example above – is close enough to 0 to be negligible. They've found lots of contraband, including weapons and even explosives. But I wasn't able to find a single instance that was linked to a terrorist or terrorist group. We do know that terrorists are still flying. They are occasionally caught – by other agencies – prior to flying. Setting aside the questionable competency or practicality of TSA procedures, and based purely on numbers alone, do you think the TSA is worth a reported 8 billion dollars and 90 million hours of waiting in line per year?

Keep in mind that base rates tend to have little to no effect if they are very high. Since about 50% of women who take pregnancy tests are already pregnant, pregnancy tests are accurate enough to be generally reliable. Of course, everyone knows that they are also not 100% accurate. A good way to think about probabilities and percentages is to use natural frequencies. Just as I did in the situations I talked about tonight, think about a group of people in terms of tens or hundreds out of thousands or millions. That will help you understand what all the numbers really mean.

I hope that you leave tonight with a good understanding of what base rates are and how they affect how we think about problems. Both problems that affect us personally, and societal issues that affect all of us.
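
For anyone who wants to check the arithmetic in the two scenarios above, here is a minimal sketch of the same base-rate calculation in Python (the function is just an illustration of Bayes' theorem for a binary test, nothing specific to the speech):

```python
def positive_predictive_value(base_rate, hit_rate, false_positive_rate):
    """P(condition | positive test), via Bayes' theorem for a binary test."""
    true_positives = base_rate * hit_rate
    false_positives = (1 - base_rate) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Mammogram scenario: 1% base rate, 80% sensitivity, 10% false positive rate.
print(f"{positive_predictive_value(0.01, 0.80, 0.10):.1%}")             # ~7.5%

# Camera scenario: 100 terrorists per 1,000,000 people, 99% detection, 1% false alarms.
print(f"{positive_predictive_value(100 / 1_000_000, 0.99, 0.01):.2%}")  # ~0.98%
```

Expressed in the natural frequencies the speech uses, these work out to 8 of 107 positives and 99 of 10,098 alarms respectively.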

What are you hoping that people will do with this information? Most of these folks will never run the TSA, so they can't do much except gripe about being made to take their shoes off in airports. Even in the breast cancer example, the most that your average person would take away from the speech is "you're supposed to multiply something by...something, and somehow the test might be wrong." What advice are they supposed to give their friend? Most women with a scary-looking mammogram who hear their friend say, "You're probably fine" are going to doubt whether the friend takes their health seriously.

The problem with supposedly practical applications of Bayes' theorem is that you usually don't have the data to do the math even if you know how, and there's usually not much practical action you can take based on it anyway. It's an interesting idea, and the people who like that sort of thing may want to learn more about it, but there's no information in this lecture that would let them trace the idea (other than talking to you afterwards). But I gather this was not the kind of audience who goes home and googles Bayes' theorem, so mentioning the name probably wouldn't have done much.

In most of the cases where knowing about base rates would help us, we don't actually know the base rate. If I know my child's preschool teacher is being investigated for child abuse, is that strong evidence that she really abuses children? I suspect that most preschool teachers do not abuse children but that many are accused of it at some point in their careers; however, I don't know the rates of either. So I can't really draw useful conclusions.

Thanks for your input. I'm not sure whether you are saying that it is a waste of time (both mine and theirs) to try to teach people about Bayesian inference, or whether there was a better way I could have explained it and made it relevant to them. If the latter, do you have any ideas as to how I could improve my treatment of the topic?

I'm not sure there's a way to make it relevant to a previously uninterested audience in 5 minutes. I think your speech was well done for the constraints you had, but I don't have ideas for how to make that topic work given the constraints.

I might have picked a simpler cognitive bias to talk about instead.