I explain to the elevator that there is no need to be afraid of heights; rationally, the chances of malfunction are negligible. More generally, examining one's own biases and neuroses and correcting for them will help ver achieve ver goal of helping people move between floors in an energy-efficient way.
I haven't tested this, but maybe:
"People seem to be craziest about the questions that matter most for them, like their girlfriends or boyfriends, whether it was actually a mistake to enroll in their PhD program, etc. I'd like to learn how to not be that way."
I'd start by telling them about the commuting paradox (link grabbed from Yvain's post about rational house buying) - how people end up making themselves miserable by not properly estimating/valuing their own preferences.
That's a concrete example of something that negatively affects hundreds of millions of people worldwide; and its applicability to real life is much better understood than the applicability of the triplet game.
I only tried the triplet pitch once, and they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question. Then I generalized from one example and stopped using it.
I've stopped pitching 'Rationality' per se, but when people ask and seem plausibly interested, I say "Rationality is basically the study of making good decisions." If they inquire further, I think the new intro to Less Wrong is approximately right, although it doesn't quite translate into conversational speech.
they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question.
I haven't tried the triplet game on anyone yet, but this is the reaction I generally get in response to similar problems. In my (entirely anecdotal) experience, people are unable or unwilling to view rationality as a generally applicable principle. Instead, they treat it as a one-off tool that was designed to apply to a narrow set of specific problems.
"For example" -- people would say -- "you could use rationality to get a better price on your mortgage, or to demonstrate that Wiccans can't really affect reality through spells. But you couldn't use it to determine whether your homeopathic remedy really works, or whether your aunt Helga really does have prophetic dreams, or whether Christians can affect reality through prayer. These questions are altogether different from mortgage/Wicca/whatever questions, as everyone knows".
I don't think this kind of cognitive bias can be defeated by a 30-second pitch. In fact, I doubt it can be defeated at all.
I only tried the triplet pitch once, and they got the answer, but didn't feel like "oh, I had a bias," they just felt like it was a trick question. Then I generalized from one example and stopped using it.
I put a poll on my blog isomorphic to the Allais Paradox, and I ought to have seen this coming, but it's alarming the lengths to which some people will go to rationalise their decisions. With one respondent I whittled the scenario down to the point where he obstinately claimed his choice between A and B would change given identical odds but a different method of randomisation.
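For concreteness, here is the classic version of the paradox as a quick expected-value check (standard textbook numbers, assumed here for illustration - not the exact framing my poll used):

```python
# Classic Allais gambles as (probability, payoff) pairs.
# These are the usual textbook numbers, not my poll's exact framing.
gamble_1a = [(1.00, 1000000)]
gamble_1b = [(0.10, 5000000), (0.89, 1000000), (0.01, 0)]
gamble_2a = [(0.11, 1000000), (0.89, 0)]
gamble_2b = [(0.10, 5000000), (0.90, 0)]

def expected_value(gamble):
    return sum(p * x for p, x in gamble)

for name, g in [("1A", gamble_1a), ("1B", gamble_1b),
                ("2A", gamble_2a), ("2B", gamble_2b)]:
    print(name, expected_value(g))

# Pair 2 is just pair 1 with an identical 0.89 chance of $1M removed from
# both options, so anyone who consistently prefers 1A over 1B should also
# prefer 2A over 2B -- yet most respondents choose 1A and 2B.
```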
This was one of a few efforts that basically put me off trying to recruit for rationality.
This might be a question with no good answer. I am reminded of Chomsky's complaint in the video version of Manufacturing Consent that ideas like his can never get a hearing on television because they cannot be condensed into meaty, meaningful sound bites.
The two data points worthy of consideration, as I see it:
Elevator pitches work best for popular culture like Lady Gaga or Harry Potter. I do not think it is a coincidence that a big fraction of the new people in Less Wrong introduction threads state they came to the site through Harry Potter and the Methods of Rationality. Maybe pitch only the story and not the transcendental critique?
The only successful advertising I have seen for any similar product is the Teaching Company. So if I wanted to write an elevator pitch for rationality or for Less Wrong I would first study the Teaching Company advertisements very closely.
Define success? ;)
I like to show people things like the Spinning Dancer and the "this is an attention test" video. I think of them as an invitation to reflect on exactly what kind of beings we are.
Not to troll, but if we're assuming someone needs LessWrong's services, shouldn't we create the least rational pitch we can, deploying the deepest of Darkest Arts, and when they say "save me, brother!" reply "it's good that you've seen the light; now we can work on not being blinded by it"?
Poor rationality skills do not necessarily translate to an incapacity for rationality.
In my case I'd just say "I prefer to be less wrong. I use the best tools to get the best outcomes I can. Wouldn't you want to?" (Usually I time this for right after I've given extensive advice to someone based on just that.)
Not to troll, but if we're assuming someone needs LessWrong's services, shouldn't we create the least rational pitch we can
...um, no? Same way that one shouldn't try to cure people's headaches by banging them on the head with a hammer?
I didn't downvote, because Hyena said "shouldn't we" but didn't perfectly hold off on proposing solutions or raise his or her ideas a bit more abstractly first, and harsh responses aren't terrible against that.
But Hyena was the first to raise an excellent point, so your response is far too strong, I think.
Phrasing it as a question was certainly enough for Hyena to get an upvote from me; it's a middle ground between "There are advantages and disadvantages of using the Dark Arts that we should discuss," and "Let's deploy the deepest Darkest Art we can!"
Sorry, I'm new on the site, so I'm missing some of the jargon. What are these "Dark Arts" of which you speak? The reason I ask (besides my everlasting hunger for power, mwa ha harghble) is that you seem (to my newbie eyes) to be claiming to possess some set of conversational techniques that will make almost anyone believe almost anything. I have heard such claims in the past, and they have all failed spectacularly, so now I'm more than a little wary of them.
Then again, I could be completely mistaken about what you mean by "Dark Arts"; if so, I apologize.
Searching for the term will lead you to a good description on the wiki: http://wiki.lesswrong.com/wiki/Dark_arts
Basically any technique that seeks to persuade by exploiting (or even amplifying), not correcting, the cognitive biases of others.
Holding off on proposing solutions.
Dark Arts. And here and here.
One issue is the matter of "persuasion" and "manipulation". Some people see them as words describing things that are different in kind, others see them as words describing different areas of a continuum.
See my comments here. These are some of the more common things meant by the term.
claiming to possess some set of conversational techniques that will make almost anyone believe almost anything.
I think similar-sounding claims come from people claiming to be far better at manipulation than others as a means of selling you the knowledge. For this to be plausible, the skill has to come from a few simple key insights that apply universally.
The claim here is different: it's that for each person, there are ways to manipulate them beyond persuading them, or more generally influencing them as they would wish to be influenced. As we are not trying to sell a simple technique that always does this, the claim is far less ambitious - it isn't that manipulation is something so simple it's easy to buy and learn, and so universal that you don't need anything else. The claim is similar in that it is about people being manipulable, but the discussion is about the morality and efficacy of pushing those levers consciously at all. Sellers of manipulation have to claim it works every time, or nearly so; the discussion here is relevant even if a tactic works once in a hundred tries - and the consensus here is that yes, people are somewhat manipulable, and there are many tactics.
Thanks lessdazed and others, that was very informative. In retrospect, I totally should've searched the wiki, but I kind of forgot this site had a wiki -- sorry about that.
I can see at least one problem with using the Dark Arts for the purpose of persuading people to learn about rationality: breach of trust. If your target person ever finds out that you manipulated him -- as he is in fact likely to do, assuming that he actually does learn more about rationality due to your successful manipulation attempt -- you will lose his trust, possibly forever. As a result, he may come to view rationality as a sort of seedy mind-game that evil people (such as, in his newly acquired opinion, yourself) play on each other for sport, and not as a set of generally useful mental techniques.
A simple way of describing the Dark Arts is as the mirror image of rationality: knowledge of (ir)rationality, used for evil.
For example, Eliezer writes about how our brains literally believe everything they're told, and are unable to filter out falsehoods while distracted.
The Less Wrong thing to do is to say, "Oh, better pay attention when untrue things are being said, so my brain can classify them properly."
The Dark Arts thing to do is to say, "I'd better distract people when I lie to them, so that even if they know I'm lying, they will still believe me subconsciously."
Very good question!
I'm not sure it's even worth trying for a 30-second pitch. My pitches for topics on this site generally take around three minutes. I use anchoring as my example of a cognitive bias - specifically the "anchoring Gandhi's birth date on ludicrously far-off dates" example - and say things like "it's not that we fail to hit the target - it's that all our darts fall on the same side of it" to explain systematic bias, refer to "the mathematics of consistent decision making", and then say "it's crazy that we rely on our brains so much but don't take the time to learn about the ways in which they systematically fail".
My thinking-about-it-for-5-minutes pitch:
"I'm thrown in a game where nobody's told me the rules. There's no victory condition, but there's some stuff that I want to do. My most important piece is myself, so I'd like to figure out how to use it.
I currently think that my brain is pretty bad at doing a lot of important things (changing my mind, actually deciding to do things), so I'd like to get better at that. I want to learn more about how the world works, as well as how I work and what I want so that I can do things that will actually get me what I actually want, rather than just kind of doing things that occasionally kind of work for no particularly good reason."
I have not spent much time on this site, so I may have an incorrect understanding of rationality. However, I see rationality more as a vehicle for pursuing and understanding truth. The first step is to convince people to value truth; the next is to present rationality as a different method of thinking, one better able to pursue truth. Convincing someone to value truth is its own battle, especially with folks who hold the postmodern belief that their own perception is valuable simply because they perceive it. Simply put, if someone does value truth, introducing rationality should follow easily. If someone does not value truth, then they will not accept rationality.
I'm confused by your response. You've used a lot of pronouns, so in this context I'm interpreting your sentence as saying that rationality is a means to the end of truth. However, because of the pronouns, your sentence brings to mind the question: can rationality be used as a means to ANY end?
If a person values personal happiness, can a rationalist present rationality as a way to be happy? If a person values a successful, blissful marriage, can a rationalist present rationality as a means to love your wife? And (just for the sake of testing the extremes) can rationality be a means to knowing God more deeply?
You've used a lot of pronouns
I failed to communicate - sorry, I will try again:
One can value rationality/(systematically believing true things and trying to shed false beliefs) as a means or an end.
If a person values personal happiness, can a rationalist present rationality as a way to be happy?
I mean that to care about truth you have to have something to protect. You have to care about what's true because you desperately want to actually achieve a goal, rather than fitting in with the people who talk about achieving the goal.
If rationality requires truth, and truth requires a motivation, can rationality exist as a motivation on its own? To me, it seems not.
I think my wording of the second sentence you quoted actually sabotaged the question I was really asking. Can rationality give a person happiness given that's their goal?
If rationality requires truth, and truth requires a motivation, can rationality exist as a motivation on its own?
It logically can exist as a motivation of its own, but a great many people think they have such a motivation - far more than actually do. Even if one feels that one seeks truth for its own sake, it's probably not true.
I think I remember that Nietzsche did not believe it was possible.
Can rationality give a person happiness given that's their goal?
Rationality gives people different things depending on the person and their environment. The best way to predict what would happen in a hypothetical scenario is to be rational, and being able to predict things accurately probably causes more happiness than it prevents, for most people. But this is a mild side effect of rationality; things designed specifically around happiness would have a better chance of affecting it (I suspect most basically fail, and there are a few gems among them).
My view, which others such as Eliezer do not share, is that rationality is much more related to losing than to winning. Rationality prevents people from making mistakes; this is only equivalent to winning and positively creating success if one goes on a significant not-losing streak.
So I'd say that if you are happy naturally, and unhappy when bad things happen to you, it will probably help a lot. If you are naturally unhappy, and need good things to happen to be happy, it won't make you happy at all; it will only lessen the frequency and severity of failures and problems. It helps one's net happiness but doesn't make one happy.
You may want to read http://lesswrong.com/lw/go/why_truth_and/ for an understanding of what this site thinks about that.
If I'm understanding empiricism correctly, rationalists value truth because it allows them to properly function in their world. I'm confused. Is a rationalist's success more important than the truth which gives them success?
Rifle scopes do not help snipers shoot guns. They help snipers know where to aim to hit a target. If the military cut all funding for scopes, it would still be physically possible to perform all the actions that would have been chosen had they had the equipment. It's even physically possible to shoot more accurately by firing unaimed shots than by firing aimed shots.
However, that would be a stupid idea. It's stupid because the odds are not better for a random shot than for an aimed shot.
Likewise, rationalists want to win, to hit the target. Sometimes, for an individual shot, it feels like we would do better by not aiming. We check our reasoning over and over, but the output is "It is slightly better not to aim than to aim here; this is an exception to the usual rule." In such cases, we aim anyway.
One problem with trying to believe false things is that those beliefs can corrupt other beliefs and areas of study where we need truth and can't afford to be wrong. We can do better by relentlessly seeking truth, even when it seems like it would be somewhat better not to know.
Opinions may differ for cases where it seems extremely important to avoid the truth.
In short, we seek truth not for its own sake, but to win, and still seek it when it seems falsehood would probably better help us win, because that seeming is unreliable and usually wrong.
Likewise for killing people to accomplish a goal; the two cases are analogous.
I say "we" but in truth only speak for myself.
Re: triplets, it also occurs to me that folks have a prior over triplet-rules that disfavors those that contain most or even half of all triples (under any bound on the integers), and favors those that use "interesting" (typically IQ-test-style) rules. This is still no excuse, however :)
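To make the coverage point concrete, here is a quick sketch (my own illustration, with an arbitrary bound): the true "ascending" rule admits roughly a seventh of all triples, vastly more than a narrow "interesting" rule like doubling does.

```python
from itertools import product

N = 20  # arbitrary bound; the ratios are similar for any bound
triples = list(product(range(1, N + 1), repeat=3))

ascending = [t for t in triples if t[0] < t[1] < t[2]]
doubling = [t for t in triples if t[1] == 2 * t[0] and t[2] == 2 * t[1]]

print(len(ascending) / len(triples))  # ~0.14 of all triples
print(len(doubling) / len(triples))   # ~0.0006 of all triples
```

A prior that strongly favors low-coverage rules makes confirming guesses feel informative even when they aren't.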
For an acquaintance that is generally interested in similar themes but unfamiliar with LW:
Thinking and deciding affect how likely we are to succeed at anything that we care about, but we've learned that people generally suck at this and fail in predictable ways. There's lots of science on this, and I'm sure you've even noticed it yourself.
Example/anecdote (personal and tailored to their interests if possible)
There's lots of info on these kinds of errors, but I find them really slippery to identify and correct in myself. LessWrong is a community designed to figure out how to do better, and design strategies to help people apply these lessons to daily life. I've found it really useful, you should check it out.
This reminds me of trying to find an elevator pitch for utilitarianism. More specifically, donating large amounts to a good charity. All I can think of is "Sell your house or hundreds of innocent people will die", but I have a feeling that won't work.
Maybe instead of playing the triplets "trick" on people, you could relate a story from your own life about how you made a mistake that rationality could have let you avoid.
I just realised that I fail massively at not immediately proposing solutions. Could everyone downvote the parent please?
You're talking with someone you like, and they ask you what you mean by rationality, or why you keep going to LessWrong meetups. Or you meet someone who might be interested in the site.
What do you say to them? If you had to explain to someone what LW-style rationality is in 30 seconds, how would you do it? What's your elevator pitch? Has anyone had any success with a particular pitch?
My Current Pitch:
My current best one, made up on the spot without any forethought, basically consists of:
"Basically, our brains are pretty bad at forming accurate beliefs, and bad in fairly systematic ways. I could show you one, if you want."
Playing the triplet game with them (see the code sketch after these steps), then revealing that the numbers just need to be ascending
Upon failure, "Basically, your brain just doesn't look for examples that disprove your hypothesis, so you didn't notice that it could have been a more general rule. There are a bunch of others, and I'm interested in learning about them so that I can correct for them."
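For anyone who hasn't seen the triplet game, here is a minimal sketch of its mechanics (hypothetical code, purely illustrative - the real pitch is spoken):

```python
# The hidden rule of the triplet game: the numbers just need to be ascending.
def fits_rule(triplet):
    a, b, c = triplet
    return a < b < c

# Starting from the seed "2, 4, 6", people usually hypothesize something
# narrow like "each number doubles" and then test only triplets that would
# CONFIRM that hypothesis:
confirming_guesses = [(2, 4, 6), (3, 6, 12), (10, 20, 40)]
print(all(fits_rule(t) for t in confirming_guesses))  # True - all pass

# A disconfirming test, which most people never try, is what actually
# reveals how general the rule is:
print(fits_rule((1, 2, 3)))  # True: doubling was never required
print(fits_rule((6, 4, 2)))  # False: ascending order is all that matters
```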
My Thoughts on That:
It's massively effective at convincing people that cognitive biases exist (when they're in the 80% that fails, which has always been the case for me so far), but pretty much entirely useless as a rationality pitch. It doesn't explain at all why people should care about having accurate beliefs, and takes it as a given that that would be important.
It's also far too dry and unfun (compared to, say, Methods), and has the unfortunate side effect of making people feel like they've been tricked. It makes it look non-cultish, though.
I suspect that other people can do better, and I'll comment later with one that I actually put thought into. There's a pretty good chance that I'll use a few of the more upvoted ones and see how they go over.