
Summary of "The Straw Vulcan"

Post author: alexvermeer, 26 December 2011 04:29PM

Followup to: Communicating rationality to the public: Julia Galef's "The Straw Vulcan"

I wrote a summary of Julia Galef's "The Straw Vulcan" presentation from Skepticon 4. Note that it is written in my own words, but all of the ideas should be credited to Julia and her presentation (unless I unintentionally misrepresent any of them!).

---

The classic Hollywood example of rationality is the Vulcans from Star Trek. They are depicted as an ultra-rational race that has eschewed all emotion from their lives.

But is this truly rational? What is rationality?

A “Straw Vulcan”—an idea originally defined on TV Tropes—is a straw man used to show that emotion is better than logic. Traditionally, you have your ‘rational’ character who thinks perfectly ‘logically’, but then ends up running into trouble, having problems, or failing to achieve what they were trying to achieve.

These characters have a sort of fake rationality. They don’t fail because rationality failed, but because they aren’t actually being rational. Straw Vulcan rationality is not the same thing as actual rationality.

What is real rationality?

There are two different concepts that we refer to when we use the word ‘rationality’:

1. The method of obtaining an accurate view of reality. (Epistemic Rationality) — Learning new things, updating your beliefs based on the evidence, being as accurate as possible, being as close to what is true as possible, etc.

2. The method of achieving your goals. (Instrumental Rationality) — Whatever your goals are, be they selfish or altruistic, there are better and worse ways to achieve them, and instrumental rationality helps you figure this out.

These two concepts are obviously related. You want a clear model of the world to be able to achieve your goals. You also may have goals related to obtaining an accurate model of the world.

How do these concepts of rationality relate to Straw Vulcan rationality? What is the Straw Vulcan conception of rationality?

“Straw Vulcan” Rationality Principles

Straw Vulcan Principle #1: Being rational means expecting other people to be rational too.

Galef uses an example from Star Trek where Spock, in an attempt to protect the crew of the crashed ship, decides to show aggression against the local aliens so that they will be scared and run away. Instead, they are angered by the display of aggression and attack even more fiercely, much to Spock’s dismay and confusion.

But this isn’t being rational! Spock’s model of the world is badly distorted by his expectation that everyone else will reason the way he does. Real rationality would require you to try to understand all aspects of the situation and act accordingly.

Straw Vulcan Principle #2: Being rational means never making a decision until you have all the information.

This seems to assume that the only important criterion for making decisions is that you make the best one given all the information. But what about things like time and risk? Surely those should factor into your decisions too.

We know intuitively that this is true. If you want a really awesome sandwich you may be willing to pay an extra $1.00 for some cheese, but you wouldn’t pay $300 for a small increase in the quality of a sandwich. You want the best possible outcome, but this requires simultaneously weighing various things like time, cost, value, and risk.

What is the most rational way to find a partner? Take this example from Gerd Gigerenzer, a well-respected psychologist, describing how a rationalist would find a partner:

“He would have to look at the probabilities of various consequences of marrying each of them—whether the woman would still talk to him after they’re married, whether she’d take care of their children, whatever is important to him—and the utilities of each of these…After many years of research he’d probably find out that his final choice had already married another person who didn’t do these computations, and actually just fell in love with her.”

But clearly this isn’t optimal decision making. The rational thing to do isn’t to merely wait until you have as much information as you can possibly have. You need to factor in things like how long the research is taking, the decreasing number of available partners as time passes, etc.
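This tradeoff between gathering more information and the cost of delay is the subject of the classic "optimal stopping" (or "secretary") problem, which isn't covered in the talk but illustrates the point nicely: if candidates arrive one at a time and you can't go back to a rejected one, observing roughly the first 37% and then committing to the next candidate who beats all of them maximizes your chance of ending up with the single best. A minimal simulation sketch (function and parameter names are my own, purely illustrative):

```python
import random

def simulated_success_rate(n=100, stop_fraction=0.37, trials=10_000):
    """Estimate how often the 'observe ~37%, then take the next
    record-setter' rule picks the single best of n candidates."""
    cutoff = int(n * stop_fraction)
    wins = 0
    for _ in range(trials):
        # A random ordering of candidate ranks; rank n-1 is the best.
        candidates = random.sample(range(n), n)
        best_seen = max(candidates[:cutoff], default=-1)
        chosen = None
        for c in candidates[cutoff:]:
            if c > best_seen:  # first candidate better than all observed
                chosen = c
                break
        if chosen == n - 1:
            wins += 1
    return wins / trials
```

Run over many trials, the success rate hovers around 37% (1/e), far better than waiting for complete information, which is impossible here, or choosing at random.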

Straw Vulcan Principle #3: Being rational means never relying on intuition.

Straw Vulcan rationality says that anything intuition-based is illogical. But what is intuition?

We have two systems in our brains, which have been unexcitingly called System 1 and System 2.

System 1—the intuitive system—is the older of the two and allows us to make quick, automatic judgments using shortcuts (i.e. heuristics) that are usually good most of the time, all while requiring very little of your time and attention.

System 2—the deliberative system—is the newer of the two and allows us to do things like abstract hypothetical thinking and make models that explain unexpected events. System 2 tends to do better when you have more resources and more time and worse when there are many factors to consider and you have limited time.

Take a sample puzzle: A bat and ball together cost $1.10. If the bat costs $1 more than the ball, how much does the ball cost?

When a group of Princeton students were given this question, about 50% of them got it wrong. The correct answer is $0.05, since then the bat would cost $1.05 for a total of $1.10. The wrong answer of $0.10 is easily generated (incorrectly) by our System 1, and our System 2 accepts it without question.
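The moment of System 2 algebra that resolves the puzzle: let $b$ be the ball's price in dollars.

```latex
b + (b + 1.00) = 1.10 \;\Longrightarrow\; 2b = 0.10 \;\Longrightarrow\; b = 0.05
```

So the ball costs $0.05 and the bat $1.05, totaling $1.10.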

Your System 1 is prone to biases, and it is also incredibly powerful. Our intuition tends to do well with purchasing decisions or other choices about our personal lives. System 1 is also very powerful for an expert. Chess grandmasters can glance at a chessboard and say, “white checkmates in three moves,” because of the vast amount of time and mental effort spent playing chess and building up a mental knowledge base about it.

Intuition can be less reliable when it is based on something not relevant to the task at hand, or when you lack expert knowledge of the topic. Your opinions of AI may be heavily influenced by sci-fi movies that have little basis in reality.

The main thing to take away from this System 1 and 2 split is that both systems have strengths and weaknesses, and rationality is about finding the best path—using both systems at the right times—to epistemic and instrumental rationality.

Being “too rational” usually means you are using your System 2 brain intentionally but poorly. For example, teenagers were criticized in an article for being “too rational” because they could reason themselves into things like drugs and speeding. But this isn’t a problem with being too rational; it’s a problem with being very bad at System 2 reasoning!

Straw Vulcan Principle #4: Being rational means not having emotions.

Straw Vulcan rationalists portray rationality and emotions in a particular way, such as when Spock is excited to see that Captain Kirk isn’t dead, and then quickly covers up his emotions. The simplistic Hollywood portrayal of emotions and rationality is as follows:

Note that emotions can get in the way of acting on our goals. For example, anxiety causes us to overestimate risks; depression causes us to underestimate how much we will enjoy an activity; and feeling threatened or vulnerable causes us to exhibit more superstitious behavior and makes us more likely to see patterns that don’t exist.

But emotions are also important for making the decisions themselves. Without having any emotional desires we would have no reason to have goals in the first place. You would have no motivations to choose between a calm beach and a nuclear waste site for your vacation. Emotions are necessary for forming goals; rationality is lame without them!

[Galef noted in a comment that the intended meaning is in line with “Emotions are necessary for forming goals among humans, rationality has no normative value to humans without goals.”]

This leaves us with a more accurate portrayal of the relationship between emotions and rationality:

How do emotions make us irrational? Emotions can be epistemically irrational if they are based on a false model of the world. You can be angry at your husband for not asking how your presentation at work went, but then upon reflection realize that you never told him about it, so how would he have known? Your anger was based on a false model of reality.

Emotions can be instrumentally irrational if they get in the way of you achieving your goals. If you feel things are hopeless and there are no ways to change the situation, you may be wrong about that. Your emotions may prevent you from taking necessary actions.

Our emotions also influence each other. If you have a desire to be liked by others and a desire to sit on a couch all day, you may run into problems. These desires may influence and conflict with each other.

We can also change our emotions. For example, cognitive behavioral therapy has many exercises and techniques (e.g. Thought Records) for changing your emotions by changing your beliefs.

Straw Vulcan Principle #5: Being rational means valuing only quantifiable things, like money, efficiency, or productivity.

If it isn’t concrete and measurable then there is no reason to value it, right? Things like beauty, love, or joy are just irrational emotions, right?

What are the problems with this? For starters, money can’t be valuable in and of itself, because it is only a means to obtain other valued things. Also, there is no reason to assume that money and productivity are the only things of value.

The Main Takeaway

Galef finishes off with this final message:

“If you think you’re acting rationally but you consistently keep getting the wrong answer, and you consistently keep ending worse off than you could be, then the conclusion you should draw from that is not that rationality is bad, it’s that you’re bad at rationality.”

In other words, you’re doing it wrong!

You're Doing It Wrong!

First three images are from measureofdoubt.com > The Straw Vulcan: Hollywood’s illogical approach to logical decisionmaking.
You're Doing It Wrong image from evilbomb.com.

Comments (22)

Comment author: lukeprog 27 December 2011 01:24:34AM 4 points [-]

An even shorter version is Why Spock is Not Rational.

Comment author: DanielLC 26 December 2011 11:53:02PM *  4 points [-]

A “Straw Vulcan”—an idea that originally comes from TV Tropes

It doesn't come from TV Tropes. TV Tropes catalogs ideas that already exist. I'd suggest saying that it's a term that was originally defined by TV Tropes.

Comment author: alexvermeer 27 December 2011 12:30:36AM 1 point [-]

That's a good point. I like your wording. Fixed, and thanks.

Comment author: Swimmy 27 December 2011 02:27:52AM 7 points [-]

I think you should change "principle" to "myth." You don't want to ruin the flow of the article; people who aren't reading carefully (which is a whole lot of people) are going to scroll through, read the bold, and think you are advising such things.

Comment author: alexvermeer 27 December 2011 02:47:00PM 3 points [-]

That crossed my mind while writing, but I didn't want to stray too far from the wording in the presentation. I just changed it to "Straw Vulcan Principle #x". Is that a good compromise?

Comment author: Swimmy 27 December 2011 06:25:17PM 0 points [-]

I think that works.

Comment author: Turgurth 18 July 2013 01:38:51AM *  3 points [-]

To add to Principle #5, in a conversational style: "if something exists, that something can be quantified. Beauty, love, and joy are concrete and measurable; you just fail at it. To be fair, you lack the scientific and technological means of doing so, but - failure is failure. You failing at quantification does not devalue something of value."

Comment author: Normal_Anomaly 26 December 2011 05:22:01PM 2 points [-]

Good presentation of good ideas. Thanks for summarizing the talk for those of us who couldn't go to Skepticon. One typo: in the second paragraph after the quote in principle 3, the bat costs $1.05, not %.05.

Comment author: alexvermeer 26 December 2011 05:35:41PM 1 point [-]

Fixed. Thanks :)

Comment author: FiftyTwo 27 December 2011 12:24:57PM 0 points [-]

Emotions can be instrumentally irrational if they get in the way of you achieving your goals. If you feel things are hopeless and there are no ways to change the situation, you may be wrong about that. Your emotions may prevent your from taking necessary actions.

Another typo, 'your' should be 'you' I think.

Comment author: alexvermeer 27 December 2011 02:48:28PM 0 points [-]

Fixed, thanks.

Comment author: dlthomas 26 December 2011 06:44:33PM 3 points [-]

Was Spock meant to actually "be rational"? Re-watching the show recently, "Spock really, really wants to think of himself as rational" seems a much better description.

Comment author: Normal_Anomaly 26 December 2011 07:49:55PM *  5 points [-]

I haven't watched the show, but I've sometimes seen essays from people saying that Kirk, Spock, and Bones represent "body, mind, and spirit." And whatever the creators' intentions, there does seem to be a popular misconception that rationalists or rational people or both act like Spock.

Comment author: dlthomas 26 December 2011 08:23:13PM 2 points [-]

I agree that there is a popular conception as you say, but I think Spock works more effectively as a warning against rational attire as opposed to rationality. I don't actually know the creators' intentions. I just think that when Spock admonishes Kirk for his illogical play in making the winning move in a chess game early on, it's plain enough what's up - although maybe it's my trouble imagining a rational theory of chess wherein the correct move is one other than the one that puts your opponent in checkmate.

Comment author: Normal_Anomaly 27 December 2011 03:21:01AM *  1 point [-]

I can't find any authoritative discussion of Spock's intended purpose. I asked someone who's seen the show in as non-loaded a way as I could, and ey said that Spock was generally intended to be perceived as rational, and that the chess games in particular are often a metaphor for the action of the episode. McCoy and Spock often function as Kirk's System 1 and System 2, giving him advice that he combines into an instrumentally rational decision. I agree that Spock is often a good example of what not to do.

Comment author: Eugine_Nier 27 December 2011 02:48:04AM 2 points [-]

there does seem to be a popular misconception that rationalists or rational people or both act like Spock.

I suspect there is a reasonable amount of truth to this belief. At least I suspect Spock was a reasonable caricature of the type of self-proclaimed "rational people" prevalent during the 50s and 60s.

Comment author: KPier 26 December 2011 07:52:40PM 3 points [-]

You need to factor in things like how long the research is taking, the decreasing number of available females as time passes, etc.

Not to be picky, but could we say "available partners"? Please?

Otherwise very nice job, and upvoted.

Comment author: alexvermeer 26 December 2011 09:47:23PM 4 points [-]

Absolutely. The original example was explicit "male looks for female", but no reason for the summary to keep that. Fixed, and thanks.

Comment author: KPier 26 December 2011 11:35:45PM 2 points [-]

Thanks!

The example didn't bother me, but when it switched to second person ("you need to factor in...") the continued gendering seemed unnecessary.

Comment author: duckduckMOO 31 December 2011 02:05:20AM 0 points [-]

“If you think you’re acting rationally but you consistently keep getting the wrong answer, and you consistently keep ending worse off than you could be, then the conclusion you should draw from that is not that rationality is bad, it’s that you’re bad at rationality.

This is waaaaayyyyy too blanket. There are potentially limitless reasons you could consistently end worse off than you could be. Any situation where you need to come up with an answer more quickly than you are capable of will get you the wrong answer pretty consistently if you are rational (because your best shot is to guess). Those are good warning signals but not specifically of lacking rationality.

I'm specifically thinking epistemic rationality of the not-wearing-paradigm-glasses/suspending judgement variety can be very bad for you in the short term but good for you in the long term.

Comment author: [deleted] 09 June 2012 03:47:03PM 0 points [-]

Any situation where you need to come up with an answer more quickly than you are capable of will get you the wrong answer pretty consistently if you are rational (because your best shot is to guess).

In situations where spending too much time to choose is worse than choosing sub-optimally in a short time, then guessing is rational. It's addressed by SVP#2 in the post. Being “rational” in your sense of the word in such a situation is failing the twelfth virtue.

Comment author: duckduckMOO 10 June 2012 04:38:35PM *  1 point [-]

That is what I was saying. Sometimes the rational course of action, which is to guess in situations like that, will get you the wrong answer pretty consistently, not that that course of action is irrational.

I assume you read "the wrong answer" as referring to the choice to guess rather than the outcome of the guess.