There's a core meme of rationalism that I think is fundamentally off-base. It's been bothering me for a long time — over a year now. It hasn't been easy for me, living this double life, pretending to be OK with propagating an instrumentally expedient idea that I know has no epistemic grounding. So I need to get this off my chest now: Our established terminology is not consistent with an evidence-based view of the Star Trek canon.

According to TV Tropes, a straw Vulcan is a character used to show that emotion is better than logic. I think a lot of people take "straw Vulcan rationality" to mean something like, "Being rational does not mean being like Vulcans from Star Trek."

This is not fair to Vulcans from Star Trek.

Central to the character of Spock — and something that it's easy to miss if you haven't seen every single episode and/or read a fair amount of fan fiction — is that he's being a Vulcan all wrong. He's half human, you see, and he's really insecure about that, because all the other kids made fun of him for it when he was growing up on Vulcan. He's spent most of his life resenting his human half, trying to prove to everyone (especially his father) that he's Vulcaner Than Thou. When the Vulcan Science Academy worried that his human mother might be an obstacle, it was the last straw for Spock. He jumped ship and joined Starfleet. Against his father's wishes.

Spock is a mess of poorly handled emotional turmoil. It makes him cold and volatile.

Real Vulcans aren't like that. They have stronger and more violent emotions than humans, so they've learned to master them out of necessity. Before the Vulcan Reformation, they were a collection of warring tribes who nearly tore their planet apart. Now, Vulcans understand emotions and are no longer at their mercy. Not when they apply their craft successfully, anyway. In the words of the prophet Surak, who created these cognitive disciplines with the purpose of saving Vulcan from certain doom, "To gain mastery over the emotions, one must first embrace the many Guises of the Mind."

Successful application of Vulcan philosophy looks positively CFARian.

There is a ritual called "kolinahr" whose purpose is to completely rid oneself of emotion, but it was not developed by Surak, nor, to my knowledge, was it endorsed by him. It's an extreme religious practice, and I think the wisest Vulcans would consider it misguided [1]. Spock attempted kolinahr when he believed Kirk had died, which I take to be a great departure from cthia (the Vulcan Way) — not because he ultimately failed to complete the ritual [2], but because he tried to smash his problems with a hammer rather than applying his training to sort things out skillfully. If there ever were such a thing as a right time for kolinahr, that would not have been it.

So Spock is both a straw Vulcan and a straw man of Vulcans. Steel Vulcans are extremely powerful rationalists. Basically, Surak is what happens when science fiction authors try to invent Eliezer Yudkowsky without having met him.

 


[1] I admit that I notice I'm a little confused about this. Sarek, Spock's father and a highly influential diplomat, studied for a time with the Acolytes of Gol, who are the masters of kolinahr. If I've ever known what came of that, I've forgotten. I'm not sure whether that's canon, though.

[2] "Sorry to meditate and run, but I've gotta go mind-meld with this giant space crystal thing. ...It's complicated."

Comments (28)

It isn't fair to the Vulcans from Star Trek. But that's only because the popular conception of Vulcans isn't fair to them, and the term Straw Vulcan is a reference to this popular conception.

So, this is an article about Star Trek characters, not about, well, rationality.

... Umm, okay? /me shrugs.

Basically, Surak is what happens when science fiction authors try to invent Eliezer Yudkowsky without having met him.

I think you're missing something about Eliezer. He tried to write about rationality the straightforward way. He found that he couldn't really get himself to do it, so he started writing HPMOR instead: heavily narrated and full of parables.

Surak wouldn't have written HPMOR.

When I asked recently about the basics of rationality, Leonhart wrote: "I am not in a story."

Eliezer writes things like: "And remember: To be a PC, you’ve got to involve yourself in the Plot of the Story. Which from the standpoint of a hundred million years from now, is much more likely to involve the creation of Artificial Intelligence or the next great advance in human rationality (e.g. Science) than… than all that other stuff. Sometimes I don’t really understand why so few people try to get involved in the Plot."

Eliezer is somebody who reacts heavily to the narrative of a situation. I think that's one of the reasons why he did what he did in the basilisk situation.

He did write a koan mocking koans, but it's awfully deep for someone who doesn't think koans do anything. Deep enough that enough people missed the point, prompting Eliezer to write: "...okay, maybe this is a cult."


Narrativium isn't the same thing as emotion, but as far as emotions go, there's Julia's CFAR post that says CFAR cares more about emotions than LW does.


I think this whole post might be motivated by my characterisation of your Tell Culture post as advocating Vulcan communication habits. I still think that's the case, and that emotion-based nonviolent communication is a much better framework for communicating social wants and desires. If you are a rationalist and want a framework that is highly white-hat and socially compatible, nonviolent communication is a rational choice. Vulcan communication isn't, even if it's more Surak communication than Spock communication.

Thanks for pointing that out! I agree that folks who want to become more like Eliezer might benefit from developing their sense of narrative. Any ideas how they could train that skill?

Take the koan that Eliezer wrote. To be explicit, reading the koan forces the reader into accepting the frame that a koan is a valid teaching tool.

The koan is about how a master gives his student a funny hat. Master Eliezer gives his reader a koan. The novice rationalist thinks Eliezer is wise so he accepts being taught through a koan.

I personally didn't get it on the first read, but it took only a short time to process. To be fair, I generally don't expect people on LW to make points on that level, so I'm not focused on reading on that level.

Moving in environments where people do take care to communicate on that level is useful for training yourself to sense the narrative in a discussion.

Let's go back to one of Brienne's examples of Tell Culture communication that I labeled Vulcan: "I just realized this interaction will be far more productive if my brain has food. I think we should head toward the kitchen."

I'm heavily underweight. If I heard that sentence spoken by someone of my own weight, my brain would go: "I understand why you're underweight, if you have so little connection to what your body is doing that you don't recognize hunger and need some stimulus, such as your brain getting foggy, to be motivated to get food." There's a good chance I would confront the person about it.

If I heard a fat person saying that, I might think: "Okay, this person obviously feels very bad about giving his body the food it desires, so he gives himself a silly excuse about how his brain needs food instead of being honest about his hunger." I might laugh, but depending on my relationship with the person and how confrontational I wanted to be, I might or might not call the person on it.

The thing about the food is a fairly simple story. Humans generally eat because they are hungry. If you are in a social discussion with other people, you might want to start listening on that level.

The interesting thing about stories is that you don't need to be consciously aware of a story to process it and show reactions to it. At an NLP seminar, it can be hard to follow what's said because three or four story strands can be active at the same time. One quite remarkable experience was a dozen people in the audience bursting into tears without really knowing why, because a part of the story they weren't paying conscious attention to had just resolved.

A more practical exercise would be to look at your day and ask yourself what story it tells. "The hero came home from work and then spent two hours watching TV and three hours browsing reddit" is not a good story. As a heuristic, if your day makes for a good story, it was usually a good day, even from a more utility-maximizing standpoint.

If you always do the thing the story asks for, then you aren't procrastinating. That doesn't mean you always have to do the thing the story asks for, but being aware of it helps.

If you can see the role you play in a story, you can manage some fairly challenging social situations, because you have perspective. Maybe the role you play leads you to be angry at the role I play, but that doesn't hurt me on a fundamental level when I can see the roles and how the narrative of the situation calls for them. If my role has to serve as the target at which another person directs their anger in order to grow in their own development, so what, as long as the story progresses in the right direction?

To close the koan loop: maybe the story would normally call for you to get angry back at the other person for being angry at you. If you are completely aware of the story, that becomes silly. Buddhist enlightenment is about moving beyond the story and losing your entanglement in it. Last week I had a situation where someone was angry with me because of their prejudices, and I just laughed and had a lot of fun with the situation; at the end of the interaction the person recognized the silliness of their own behavior and thanked me.

He tried to write about rationality the straightforward way. He found that he couldn't really get himself to do it

What are you referring to by this?

What are you referring to by this?

Eliezer was trying to write a formal book on rationality. He found that he could get out of writer's block by writing HPMOR.

Afterwards, SI split into MIRI and CFAR, with Eliezer mainly working on the MIRI side instead of the CFAR side, so I don't know whether there will be a formal book on rationality by Eliezer at any point.


But I could also have been referring to writing in a way that's publishable in journals. I was pointing to the narrative that gets my attention more than to particular instances.

This may be a case where getting it wrong twice gets it right.

Intended abstract nugget of wisdom: "If you want to think or act rationally, don't act like the caricatures on television that do things less effectively than the people they are contrasted with. Just do the effective things."

Simple description of what the meme means: "Being rational does not mean being like Vulcans from Star Trek."
Mistake 1: Vulcans don't usually act in the way the meme would condemn, either. The description doesn't match the intended message.

Name of meme: "Straw Vulcan [Rationality]"
Mistake 2: If you use "Straw X" in approximately the same way as in "Straw Man" then the wording of the meme doesn't suggest that the Vulcans are irrational---quite the reverse. This remains true no matter which way you split it: "(Straw Vulcan) Rationality" or "Straw (Vulcan Rationality)".

Both of those suggest that the desired thing (either the Vulcan themselves or their Rationality) has been corrupted by the accused so that it is readily criticised. It means "Don't be like fake depictions of Vulcans, be like actual rational Vulcans".

If we assume Brienne's positive depiction, then "Straw Vulcan Rationality" is close to perfect terminology to be using, and so is the fundamental lesson. It's that intermediate description that needs to go. Because relying on making two mistakes in order to get it right is a bad habit to get into!

I see what you are saying, but I think you have misjudged full-blooded Vulcans. They quite often fall into the stereotypes of Straw Vulcans, though they may do it less than Spock.

I think a good example is T'Pol. Isn't it a recurring plot element that she can't wrap her head around her emotional shipmates? She is always uptight, overly precise, and insistent upon presenting a controlled demeanor, just like a Straw Vulcan. Sarek is another (though perhaps less apt), with his bottled-up frustration and contempt betraying his failure to deal with his emotions effectively.

Given that T'Pol is from the Enterprise series, I'd discount her as a source of evidence about the Vulcans as depicted in the original series: Enterprise is too far removed from TOS, and everything we see in it might be brain bugs.

Do you remember the telephone game from grade-school? All of the kids would sit in a circle. The first kid would whisper something in the second kid's ear, who would turn around and whisper it to the third kid, and so on. By the time it came back to the first kid, the message had completely changed. Similarly, when you have many writers working on one show, each new writer tries to interpret the work of the writers that came before, and then extrapolate from it.

It's easy to see how this works like that old grade-school telephone game. Writer #1 creates a fictional universe. Writer #2 creates a story set in his interpretation of Writer #1's fictional universe. Writer #3 creates a story which is set in his interpretation of Writer #1's fictional universe, and which attempts to continue his interpretation of the story written by Writer #2. Writer #4 tries to write a story which is consistent with the story written by Writer #3, and which is set in his interpretation of Writer #1's fictional universe, but he's not too familiar with the story made by Writer #2 because he never saw it. It's pretty obvious that by the time you get to writer #50, you've got a real mess on your hands.

But this mess is not entirely random, unlike the telephone game. People have a tendency to simplify concepts in their minds because, well, it's easier that way. We see this most prominently in the case of racial stereotyping, where racists simplify an entire human race into one or two key characteristics. It seems to be an innate tendency that can only be solved through education, which may help explain why racism tends to be inversely correlated to education level. The same mentality which drives racism seems to drive many of these brain bugs. Rather than think critically or thoroughly, it's easier to seize upon the most visible or interesting characteristic and then simplify the situation so that nothing remains but that lone characteristic. And in the Berman-Braga age, simple-minded thinking is the order of the day.

Perhaps, but they're still Vulcans. The trope isn't Straw TOS Vulcan, or Straw Non-Brain-Bug Vulcan.

Thank you for causing me to read that =)

It's important to remember that this is cultural. Remember that Vulcans are the same subspecies as the highly emotional Romulans, only separating a couple of thousand years before the setting. There are real-life humans who are worse than T'Pol for that sort of thing.

I see what you are saying, but I think you have misjudged full-blooded Vulcans. They quite often fall into the stereotypes of Straw Vulcans, though they may do it less than Spock.

To be fair to both Spock and the Vulcans, the difference seems to be how much screen time they have in which they happen to be the most suitable target for one of the more preachy writers. For example, a Vulcan fortunate enough to be in an episode with Data is mostly protected by the even more convenient android target.

That makes sense, but as to your proffered example, I cannot recall an episode that includes both Data and a Vulcan. Has this happened? What am I missing?

Edit: Oops, I forgot about Unification. Thanks, /u/wedrifid.

That makes sense, but as to your proffered example, I cannot recall an episode that includes both Data and a Vulcan. Has this happened? What am I missing?

I'm no expert myself---I keep happening to leave the room while my girlfriend watches Star Trek, because the alternative is too strong an urge to heckle the decisions advocated. Nevertheless, I can recall a few episodes involving Spock and Sarek.

Oh, right. Silly me, forgetting about Unification. Thanks!

Indeed, there are many full-blooded Vulcans that are also written as huge jerks. According to the Star Trek writers, Vulcan logic and mental discipline don't seem to have much to do with whether one is an asshole or not.

You're definitely right about Sarek, but I've not seen Enterprise, so I'm not familiar with T'Pol.

At the risk of going off topic, you're not missing much.

As someone recently said on Facebook: "I am trying really hard to read any of the comments as similarly tongue in cheek, but... it's not working."

What I was actually expecting was a playful battle-of-the-Trekkies where we argue largely in jest over whether, how, and why an ideal Bayesian community should emulate or differ from Steel Vulcans. I am updating pretty strongly toward "LessWrong doesn't do humor." If I shouldn't be, feel free to explain what I'm missing. I'm sort of new here.

I'd like to see more humor on LessWrong, but I don't consider this post particularly funny.

Well, your post [at least at first glance] misses the point of what 'straw Vulcan rationality' represents (a model of faulty rationality, not a statement about the actual fictional race that you care about), and it is a nitpick at best (not to mention that, in fact, most of the time a viewer sees a Vulcan in the shows, said Vulcan's 'rationality' is faulty).

I mean, I get how nitpicks can be funny, but posting "Wow, apparently LessWrong really hates humor." and similar statements when your 'humorous' post isn't well received seems excessive. Not to mention that even if this post looked like a joke (which it doesn't to me) and not like trying to prove a point that misses the real point, its place wouldn't exactly be in the Discussion section of the site by my standards if there is no content in the post beyond the humor.

(And yes, I know that you are likely to label me as one of those humourless lesswrongers that missed your joke.)

There's plenty of humor (tumor? It's not a tumor!) to be had around here, so your conclusion seems off-base. This comment just now had me laughing out loud, and this rather infantile quip of mine still got plenty of upvotes.

Is it because of wedrifid's comment? Took me years to understand he's all about refined subtle humor, yes quite.

Apart from the mostly confusing qualia aspect*, emotions can be thought of as heuristics which predispose an agent towards a certain set of actions, a processing filter putting an agent in a certain mode of reasoning. Like an Instagram filter. (Long-term) decisions made under the influence of strong emotions aren't necessarily worse, although they typically are.

The explanation for evolution selecting for such filters is that when doing "fast reasoning" under relevant resource constraints (there are always boundaries on available resources, just not always relevant ones), your cognition takes some cues (e.g. hormonal) to 'gently' (or not) steer your decision-making process (or your consciously accessible illusion thereof) in the 'right' direction (Fight! Flight! Cry! Submit!).

A rationalist who was hypothetically able to tweak his/her emotions such that ze always experiences the correct filter, given the time constraints, would experience a full range of emotions and still make optimal decisions. This is, of course, out of reach.

Vulcans "master" (scare quotes) these heuristics by suppressing them, since in canon those heuristics are stronger and as such less useful (which always made the genesis of the whole scenario somewhat contrived). "Not being at the mercy of their emotions" amounts to "the actions of a Vulcan who experiences emotions are indistinguishable from those of a Vulcan who does not". Which reduces their emotions to their qualia dimension, which, though probably inferable via StarTrek!MRIs, equates "mastery" with "ignoring":

So Surak prefers to experience emotions; he just refuses to let those emotions have any impact on the world whatsoever. The relevant functional difference from going through with kolinahr? I don't see it.

* "What (some) algorithms feel like from the inside" is just kicking the can down the definitional road; now "what does 'feel like from the inside' mean?" and "which algorithms?" need to do the exact same work (any physical process instantiates an algorithm, which is kind of a corollary of the Church-Turing thesis).


I'm curious: does anyone find the idea of a real kolinahru-style monastic, rationalist order appealing? Would you like to wear robes and study logic, mental mastery and emotional control on a mountain somewhere? I have posted a few thoughts about a real kolinahr practice here if anyone is interested.

According to TV Tropes, a straw Vulcan is a character used to show that emotion is better than logic. I think a lot of people take "straw Vulcan rationality" to mean something like, "Being rational does not mean being like Vulcans from Star Trek."

In TV Tropes jargon, "Straw" means a caricature, a fake, superficial stereotype.


There's a core meme of rationalism that I think is fundamentally off-base.

Fundamentally? You wrote an essay explaining how the trope refers to Spock and not Vulcans. Regardless of whether or not that is true (the sanity of each of them varies drastically depending on the episode and the writers), the flaw would be that the name is "Straw Vulcan" and not "Straw Spock". That's a superficial difference, not a fundamental one. (I really was surprised when nothing fundamental to the concept was challenged.)

The above being the case, and in the name of consistency, I must declare this essay superficially off base.

[This comment is no longer endorsed by its author]