I'm trying to develop a large set of elevator pitches / elevator responses for the two major topics of LW: rationality and AI.

An elevator pitch lasts 20-60 seconds, and is not necessarily prompted by anything, or at most is prompted by something very vague like "So, I heard you talking about 'rationality'. What's that about?"

An elevator response is a 20-60 second, highly optimized response to a commonly heard sentence or idea, for example, "Science doesn't know everything."

 

Examples (but I hope you can improve upon them):

 

"So, I hear you care about rationality. What's that about?"

Well, we all have beliefs about the world, and we use those beliefs to make decisions that we think will bring us the most of what we want. What most people don't realize is that there is a mathematically optimal way to update your beliefs in response to evidence, and a mathematically optimal way to figure out which decision is most likely to bring you the most of what you want, and these methods are defined by probability theory and decision theory. Moreover, cognitive science has discovered a long list of predictable mistakes our brains make when forming beliefs and making decisions, and there are particular things we can do to improve our beliefs and decisions. [This is the abstract version; probably better to open with a concrete and vivid example.]
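One way to make the "mathematically optimal way to update your beliefs" claim concrete is a toy Bayesian update. The numbers below are made up purely for illustration, not part of the pitch itself:

```python
# Toy Bayesian update with made-up numbers (illustration only).
# Hypothesis H: "the patient has the disease"; evidence E: "the test is positive".

prior = 0.01            # P(H): base rate of the disease
p_e_given_h = 0.95      # P(E|H): test sensitivity
p_e_given_not_h = 0.05  # P(E|not H): false-positive rate

# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e

print(f"P(disease | positive test) = {posterior:.3f}")  # about 0.161
```

Even with a sensitive test, the low base rate keeps the posterior modest, which is exactly the kind of result untrained intuition tends to get wrong.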

"Science doesn't know everything."

As the comedian Dara O'Briain once said, science knows it doesn't know everything, or else it'd stop. But just because science doesn't know everything doesn't mean you can fill in the gaps with whatever theory most appeals to you. Anybody can do that, with whatever crazy theory they want.

"But you can't expect people to act rationally. We are emotional creatures."

But of course. Expecting people to be rational is irrational. If you expect people to usually be rational, you're ignoring an enormous amount of evidence about how humans work.

"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."

But of course. You have to weigh the cost of new information with the expected value of that new information. Sometimes it's best to just act on the best of what you know right now.
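To see what "weighing the cost of new information against its expected value" can look like, here is a rough value-of-information sketch; the numbers, and the simplifying assumption of a perfectly informative study, are hypothetical:

```python
# Rough value-of-information sketch with hypothetical numbers (illustration only).
# Decision: launch a product now, or first pay for a market study.

p_good = 0.6              # current belief that the market is favorable
payoff_good = 100_000     # profit if we launch and the market is good
payoff_bad = -50_000      # loss if we launch and the market is bad
study_cost = 10_000       # price of the extra information

# Expected value of launching now, on current beliefs:
ev_now = p_good * payoff_good + (1 - p_good) * payoff_bad   # 40,000

# With a (hypothetically) perfect study, we launch only if the market is good:
ev_with_info = p_good * payoff_good + (1 - p_good) * 0      # 60,000

value_of_information = ev_with_info - ev_now                # 20,000
print(value_of_information > study_cost)  # True: here, waiting for the study is worth it
```

If the study cost more than the information is worth, acting on what you already know would be the better move.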

"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"

But of course. We even have lots of data on which situations are conducive to intuitive judgment, and which ones are not. And sometimes, it's rational to use your intuition because it's the best you've got and you don't have time to write out a bunch of probability calculations.

"But I'm not sure an AI can ever be conscious."

That won't keep it from being "intelligent" in the sense of being very good at optimizing the world according to its preferences. A chess computer is great at optimizing the chess board according to its preferences, and it doesn't need to be conscious to do so.

 

Please post your own elevator pitches and responses in the comments, and vote for your favorites!

 


What most people don't realize is that there is a mathematically optimal way to update your beliefs in response to evidence, and a mathematically optimal way to figure out which decision is most likely to bring you the most of what you want

You've said something similar in a recent video interview posted on LW, and it made me cringe then, as it does now. We don't know of such optimal ways in the generality the context of your statement suggests, and any such optimal methods would be impractical even if known, which again is in conflict with the context. Similarly, turning to the interview, SingInst's standard positions on many issues don't follow from formal considerations such as logic and decision theory; there is no formal theory that represents them to any significant extent. If there is strength to the main arguments that support these positions, it doesn't currently take that form.

Fair enough. My statement makes it sound like we know more than we do. Do you like how I said it here, when I had more words to use?

It made me cringe as well, but more because it will make people hug the opposite wall of the proverbial elevator, not because such methods have been conclusively shown to be impractical - http://decision.stanford.edu/.

I think Ian Pollock more effectively got at what Luke is trying to communicate.

Shmi:

First, a general comment on your versions, sorry: you tend to use big words, scientific jargon, and too few examples.

Compare your pitch with the following (intentionally oversimplified) version:

"So, I hear you care about rationality. What's that about?"

It's about figuring out what you really want and getting it. If you are at a game and it's really boring, should you walk out, or stay so you don't "waste" what you paid for the tickets? If you apply for a position and don't get it, does it help to decide that you didn't really want it anyway? If you are looking to buy a new car, what information should you take seriously? There are many pitfalls on the road to making a good decision; rationality is a systematic study of the ways to make better choices in life. Including figuring out what "better" really means for you.

First, a general comment on your versions, sorry: you tend to use big words, scientific jargon, and too few examples.

And doing that is going to instantly turn people off.


First, thanks to lukeprog for posting this discussion post. The Ohio Less Wrong group has been discussing elevator pitches, and the comments here are sure to help us!

I often end up pitching LW stuff to people who are atheists, but not rationalists. I think this type of person is a great potential "recruit", because they WANT a community, but often find the atheistic community a little too "patting ourselves on the back"-ish (as do I). My general pitch is that Less Wrong is like the next step: "Yeah, we're all (mainly) atheists, but now what??"

Here's an example from a recent facebook comment thread:

Other person- What exactly do atheist groups do? I went to a couple of meetings of [Freethought Group] here at [Local big university], but it turned out to be exactly like Sunday school, except instead of reading Bible verses, everyone talked about why religion was terrible. It's not exactly what I'm all about.

Me- Yeah, I hate "Rah rah, Atheism!" stuff too. I know [Person A] and [Person B] from lesswrong.com. I like the site because it's like... "Yeah, we've all got the atheism stuff figured out. Let's move on and see where we can go from there."

Then I point them to Methods of Rationality, and hopefully now to our meetups.

Coming up with elevator pitches/responses strikes me as a great activity to do at LW meetups.


If there is interest in some discussion logs to analyse, I'm having a lengthy FB thread with a fairly intelligent theist I knew from rabbinical seminary. I don't think his arguments are particularly good, and I'm not great at arguing either, though I hope my content is a bit more convincing despite the lack of style. I do not expect to change his mind - he holds a rabbinical position and the chances of him changing his mind are near zero - but there are some observers I care about, and this is an exercise in rationality for me. I can anonymize and post if people find this kind of thing interesting; I would certainly appreciate some feedback.

Well, I would find it interesting, but as a point of order: maybe you should let him know you're doing this (even anonymized), so he can get help from a gang of his friends too?

I have no intention to have this turn into a public debate out of a Facebook thread. This is a chance to improve my rationality and argumentation skills.

Yes... I took "there are some observers I care about" plus "I would appreciate some feedback" to mean 'I'd like some debate advice (which I will be applying)'. If that's not getting help from a gang of your friends, I don't know what is.

You're correct, it's a side benefit, but having a thread evolve into some kind of public debate looks silly. If public debate on such issues is desired, there are orders of magnitude better ways of doing it than this.

I don't think pedanterrific is planning to have a bunch of LWers start commenting on the thread in support of atheism. I think he's expecting a bunch of LWers to give you advice in this thread, which you will then use in your own posts. And he thinks the rabbi should be given an opportunity to ask his own community for similar advice. To use a boxing metaphor, nobody else is going to start fighting, but you're going to have more coaches and your opponent should too.

I got that, but having to tell him "there are a bunch of people helping, bring your friends" seems awkward in the context. I'd rather not have the help and just let people view the log as a post-mortem, for improving my rationality. Another part of it is the fact that I'm actually doing OK in the argument (I think), and "calling for help" would look like, or could be spun as, a weakness.

Okay then! That makes sense. Also, I support posting the log when the argument is done; I'd enjoy reading it and would be happy to comment.

The third, compromise option would be, if I end up using a suggestion from LW, to say "(I got this argument from talking it over with a friend)", though I'm not sure if that goes far enough to satisfy the standards of a fair fight people want to see.


I too am a member of the Ohio Less Wrong group. I was quite surprised to see this topic come up in Discussion, but I approve wholeheartedly.

My thoughts on the subject are leaning heavily towards the current equivalent of an 'elevator pitch' we have already: the Welcome to Less Wrong piece on the front page.

I particularly like the portion right at the beginning, because it grabs onto the central reason for wanting to be rational in the first place. Start with the absolute basics for something like an elevator pitch, if you ask me.

Thinking and deciding are central to our daily lives. The Less Wrong community aims to gain expertise in how human brains think and decide, so that we can do so more successfully.

I might cut out the part about 'human brains' though. Talk like that tends to encourage folks to peg you as a nerd right away, and 'nerd' has baggage you don't want if you're introducing an average person.

Possible absolute shite ahead (I went the folksy route):

"So, I hear you care about rationality. What's that about?"

It's about being like Brad Pitt in Moneyball. (Oh, you didn't see it? Here's a brief spoiler-free synopsis.) It's the art of seeing how others, and even you yourself, are failing, and then doing better.

"Science doesn't know everything."

Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.

"But you can't expect people to act rationally. We are emotional creatures."

Yeah, no that's true. We've recently seen all kinds of bad decisions--housing crisis and so on. But that's all the more reason to try and get people to act more rationally.

"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."

Yeah, true... true. Still, we can prepare in advance for those situations. For example, you might have reason to believe that you're going to start a new project at your job. That's going to involve a lot of decisions, and any poor decision at such an early stage can magnify as time goes by. That's why you prepare the best you can for those quick decisions that you know you'll be making.

"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"

Yeah, intuitions are just decisions based on experience. I remember reading that chess masters, y'know like Bobby Fischer or Kasparov, don't even deliberate on their decisions, they just know; whereas chess experts, a level below master, do deliberate. But to get to that level of mastery, you need tens of thousands of hours of practice, man. Only a few of us are lucky enough to have that kind of experience in even a very narrow area. If you're something like an intermediate chess player in an area with a bunch of skilled chess players, your intuition is going to suck.

"But I'm not sure an AI can ever be conscious."

Maybe not, but that's not really important. Did you hear about Watson? That machine that beat those Jeopardy players? They're saying Watson could act as a medical diagnostician like House and do a better job at it. Not only that, but it'd be easier than playing Jeopardy... isn't that crazy?


Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.

I like the others, but I think the problem with this one is that it doesn't provide them with any reason why they shouldn't just fill the gaps in whatever science knows now with whatever the hell they want.

The elevator pitch that got me most excited about rationality is from Raising the Sanity Waterline. It only deals with epistemic rationality, which is an issue, and it is, admittedly, best suited to people who belong to a sanity-focused minority, like atheism or something political. It was phrased with regard to religion originally, so I'll keep it that way here, but it can easily be tailored.

"What is rationality?"

Imagine you're teaching a class to deluded religious people, and you want to get them to change their mind and become atheists, but you absolutely cannot talk about religion in any way. What would you do? You'd have to go deeper than talking about religion itself. You'd have to teach your students how to think clearly and actually reevaluate their beliefs. That's (epistemic) rationality.

"Why is rationality important? Shouldn't we focus on religion first?"

By focusing on rationality itself you not only can approach religion in a non-threatening way, but you can also align yourself with other sane people who may care about economics or politics or medicine. By working together you can get their support, even though they may not care about atheism per se.

"But you can't expect people to act rationally. We are emotional creatures."

Yes, we are emotional creatures. But being emotional is not incompatible with being rational! In fact, being emotional sometimes makes us more rational. For example, anger can inhibit some cognitive biases, and people who sustain damage to "emotional" areas of their brains do not become more rational, even when they retain memory, logical reasoning ability, and facility with language. What we want to do is make the best possible use of our available tools -- including our emotional tools -- in order to get the things that we really want.

Remember that your links don't work in speech. :D

Clearly right. I had thought about carrying around hard-copies of papers in a backpack so that I could hand them out as I mention them, but ... ;)

One of the most difficult arguments I've had to make is convincing people that they can be more rational. Sometimes people have said that they're simply incapable of assigning numbers and probabilities to beliefs, even though they acknowledge that it's superior for decision making.


This. I'm skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.

No joke intended, but how much more skeptical are you, percentage-wise, of numerical probability estimates than vague, natural language probability estimates? Please disguise your intuitive sense of your feelings as a form of math.

Ideally, deliver your answer in a C-3PO voice.

40 percent.

This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.

When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The "authority" figure would be an estimate of "if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances of me having to update my beliefs versus you?"

Techniques for probability estimates by Yvain is the best we have.

I agree that it can be difficult convincing people that they can be more rational. But I think starting new people off with the idea of assigning probabilities to their beliefs is the wrong tactic. It's like trying to get someone who doesn't know how to walk, to run a marathon.

What do you think about starting people off with the more accessible ideas on Less Wrong? I can think of things like the Sunk Cost Fallacy, not arguing things "by definition", and admitting to a certain level of uncertainty. I'm sure you can think of others.

I would bet that pointing people to a more specific idea, like those listed above, would make them more likely to feel like there are actual concepts on LW that they personally can learn and apply. It's sort of like the "Shock Level" theory, but instead it's "Rationality Level":

Rationality Level 0- I don't think being rational is at all a good thing. I believe 100% in my intuitions!
Rationality Level 1- I see how being rational could help me, but I doubt my personal ability to apply these techniques.
Rationality Level 2- I am trying to be rational, but rarely succeed (this is where I would place myself).
Rationality Level 3- I am pretty good at this whole "rationality" thing!
Rationality Level 4- I Win At Life!

I bet with some thought, someone else can come up with a better set of "Rationality Levels".

Kyre:

"So, I hear you care about rationality. What's that about?"

Rationality is about improving your thinking so that you make better decisions. You know, sometimes you make decisions that turn out bad because there is some piece of knowledge or information that you really needed to know but didn't. But sometimes it turns out even with the same information you can make a better choice if you think about things differently. In the narrow sense rationality is getting your brain to make the best use of the information you have to make the best choice. In the wider sense rationality is about filling your brain up with the best information in the first place.

That might just sound like common sense - that people should think carefully about things - but it turns out that there are a whole lot of really common mistakes that people don't realize that they are making, and it really is possible to learn better patterns of thinking that let you make better decisions.

I'm not sure if this deserves its own article, so I'm posting it here: What would be an interesting cognitive bias / debiasing technique to cover in a [Pecha kucha](http://www.pecha-kucha.org/what) style presentation for a college writing class?

Given the format, it should be fairly easy to explain (I have less time than advertised, only 15 slides instead of 20!). So far, I've thought about doing the planning fallacy, the representativeness heuristic, or the disjunction fallacy. All three are ones I can already speak casually about, and none of them leaps out at me as empowering motivated cognition (...a topic which would empower it, huh).

I would personally like to do Bayes' Theorem, but I can't 1) think of a way to compress it down to five minutes, or 2) think of ways for other people to help compress it down to five minutes without also omitting the math.

Downvote if this is off topic. If you downvote for some other reason, please tell me why, because otherwise I'll just assume it was an off-topic downvote!

It's about figuring out the mistakes that people tend to make, so you can avoid making them. ("Like what?") Like people aren't good at changing their minds. They only want to think about information that supports what they already believe. But really, I should look at all the information that comes my way and decide - is my old belief really true? Or should I change my mind based on the new information I got?

NexH:

"But you can't expect people to act rationally. We are emotional creatures."

This may be difficult to answer appropriately without knowing what the hypothetical speaker means by "emotions" (or "expect", for that matter). But the phrase seems to me like a potential cached one, so ve may not know it either.

A possible elevator response below:

Rationality is not Vulcan-like behavior; you don't have to renounce your emotions in order to act rationally. Indeed, for most people, many emotions (like affection, wonder, or love) are very valuable, and applied rationality is knowing how to obtain and protect what is truly precious to you.
What is important is to rationally understand how your emotions affect your judgment, so you can try to consciously avoid or dampen unwanted emotional reactions that would otherwise have undesirable consequences for you.

Does Sark's recent tweet, "Intuitions are machines, not interior decoration," work as an elevator pitch, or is it too opaque to a non-LWer? Or is it too short? Maybe it's a fireman's pole pitch.

I find the conscious AI response to be the most compelling. Now that I think about it, that's more evidence for the usefulness of concrete examples.

"Science doesn't know everything."

Yes, but science is all about using whatever methods work to produce more new knowledge all the time. All the new knowledge we can produce with mechanisms that we know are actually trustworthy will eventually become part of science, and the only stuff that's ultimately going to get left out is information we can only generate through means we know aren't reliable at producing truth.

Biases from Wikipedia's list of cognitive biases. Cue: example of the bias; Response: name of the bias, pattern of reasoning of the bias, normative model violated by the bias.

Edit: put this on the wrong page accidentally.

[This comment is no longer endorsed by its author]