I noticed that the people who didn't like the book were essentially put off by the rationality. They thought Harry was arrogant and condescending.
If they said that Harry was being arrogant and condescending, perhaps you shouldn't immediately translate this into your mind as "essentially put off by the rationality"?
In a previous version of the story (I believe Eliezer has since revised it, probably because he realized it was going too far), Harry in one case called McGonagall "Minerva" and considered calling her "Minnie", when McGonagall had been calling him "Mr Potter" throughout.
Harry has indeed been an arrogant and condescending little twit.
I would go even further and point out how Harry's arrogance is good for the story. Here's my approach to this critique:
"You're absolutely right that Harry!HPMOR is arrogant and condescending. It is a clear character flaw, and repeatedly gets in the way of his success. As part of a work of fiction, this is exactly how things should be. All people have flaws, and a story with a character with not flaws wouldn't be interesting to read!
Harry suffers significantly due to this trait, which is precisely what a good author does with their characters.
Later on there is an entire section dedicated to Harry learning "how to lose," and growing to not be quite as blind in this way. If his character didn't have anywhere to develop, it wouldn't be a very good story!"
While this is true, there can be a distinction between a character with flaws and a character who is extremely irritating to read about. And this is one of those judgement calls where The Audience is Always Right; it seems very reasonable to stop reading a story if the protagonist noticeably irritates you.
In general, commentary to the effect of "you should like this thing" is not very useful, especially if you are trying to figure out why someone reacted negatively.
(These discussions in which one group has an overwhelmingly strong "squick" or "ew" reaction and another group does not are fascinating to me, not least of all because they seem to pop up quite frequently here, e.g. about Eliezer's OKCupid profile and NYC cuddle piles. Both sides spill huge amounts of ink explaining their emotional reactions, and yet there never seems to be any actual sharing of understanding. In the interests of trying harder... I was also very aggravated by the first few chapters of HPMOR, and would be happy to discuss it calmly here.)
I suppose if you really can't stand the main character, there's not much point in reading the thing.
I was somewhat aggravated by the first few chapters, in particular the conversation between Harry and McGonagall about the medical kit. Was that one where you had your aggravated reaction?
I found myself sympathizing with both sides, and wishing Harry would just shut up--and then catching myself and thinking "but he's completely right. And how can he back down on this when lives are potentially at stake, just to make her feel better?"
Yes, I did find that section grating. I'm describing my emotions post-hoc here (which is not generally reliable), but what I found irritating about the first few chapters was the fact that Harry acts in an extremely arrogant way, nearly to the point of coercing the adult characters, and the story-universe appears to back him up at every turn. This is the "Atlas Shrugged" effect described downthread by CellBioGuy. Harry is probably right in most of those arguments, but he is only so effortlessly correct and competent because the story-universe is designed so that he can teach these rationality lessons to other characters. It feels like the world is unfair, and unfair in favor of an unlikeable and arrogant character. There is a real-world analogue of this, of course--very arrogant people who always get what they want--and I suspect my emotional reactions to these real and fictional situations are very similar.
(I have since caught up with the rest of the fic, and enjoyed most of it.)
That's true, but I also think that the culture of LW and the surrounding "rationalist" community is a fairly arrogant culture, and not all rational-thinking cultures are like this (compare the number of scientific papers that say "more research is needed" vs "our paper pretty much settles this issue"). Personally I suspect that a culture less arrogant than the LW "rationalist" community is optimal. For example, Paul Graham nails it IMO... he says controversial stuff and questions establishment views, but does so in a thoughtful, evenhanded, polite way.
I agree that in some instances Harry has acted a bit immature and disrespectful. However...
I don't think he does it too often. Most of the time I think he's just challenging people's beliefs (justifiably).
He's demonstrated that he's an incredibly caring and selfless person, both admirable qualities. He's dedicated to making the world a better place, and is willing to sacrifice himself to do so. Shouldn't that outweigh a little bit of immaturity?
They thought Harry was arrogant and condescending.
He certainly comes across this way, causing what Eliezer calls the status slapdown emotion, where the reader feels that Harry (and, by extension, the author) is a show-off and should show some respect.
Then I was thinking, a lot of people are "put off by rationality" for similar reasons.
Regardless of the topic, if the speaker comes across as "arrogant and condescending", her odds of convincing people instead of alienating them are not good.
What a shame. There's a lot of value in spreading rationality, and this seems to be a big obstacle in doing so.
Indeed. Pontificating in a condescending manner is a known hazard, and it is unreasonable to blame the audience for not liking it.
Any thoughts on how to make people less "put off by rationality"?
A better question is "how to make people less put off when teaching/explaining in general, regardless of the topic". One hint is to not behave like you are higher-status than they believe you to be.
One hint is to not behave like you are higher-status than they believe you to be.
This is a very good idea I wish I were able to follow more often. And since I am not good at learning social skills by copying, explicit training could be useful.
To be precise, "not behaving higher-status" is relatively easy. The difficult part is to explain an idea contradicting some high-status belief while still behaving that way. I know it can be done, because I have seen people doing it, but it is far above my current skill level.
To be precise, "not behaving higher-status" is relatively easy.
The point isn't to never behave in a higher-status manner. If a non-tech person asks you what's wrong with their computer and they know you are a programmer, they believe that you have status as far as computer-related issues go.
When people behave as if they are lower status than they are believed to be, that's also bad. Nobody likes to sit in a lecture by a teacher who thinks he's lower status than his students.
The relevant social skill is to understand what other people expect from you. If they expect you to help them fix one of their problems, they usually allocate you the kind of status they think you would need to solve that issue.
Personally, while attempting to contradict a high-status belief, I readjust my goals. Rather than trying to completely convince my audience, I try to create some doubt that the high-status belief might be wrong. I have found that effective ways to do this include sentences like "I am not sure either, but what about..." or "Hmm, but what would then happen if..?". This way, not only does the audience think that they are the ones coming up with the conflicting ideas (instantly boosting their confidence in them), but it also doesn't look like you are actively challenging anything's status; you appear instead as a confused ally, putting the audience on "your side".
Of course, the drawbacks are that this can be done only so often before people start to think you don't know anything at all, and sometimes your audience cannot come up with your answers to the questions ('derailing' the conversation). But as a general rule I try not to tear down high-status ideas from a low-status position - why pick a hard fight when you can split it into two easy fights?
One hint is to not behave like you are higher-status than they believe you to be.
I am not sure about that. Status is a function of perception and, to some degree, negotiation (usually not explicit). Just because someone wants to assign low status to you (e.g. because it would raise their own) does not mean you have to accept it and bend to their expectations.
It's a complex issue but I'd like to make two points.
First, showing high(er) status and being an asshole are very different things. People dislike the latter much more than the former.
Second, keep in mind the overall goals. High status is useful and could perfectly well be part of what you strive for. Normally you do want to convey information AND increase your status without pissing the audience off. By the way, if you're perceived as having sufficiently low status, your information would be perceived as low value. People listen to those they respect.
In some cases, people think it's rude to suggest to someone that they're wrong.
Suggesting that someone is wrong is a status move -- the person correcting is expressing that they have higher status than the person being corrected. At least this is how neurotypical people would evaluate the interaction. Their model of Harry tells them that Harry believes himself to have higher status than everyone else, which means that he should be punished to learn his proper place in the tribe.
While directly highlighting a deficiency and offering a replacement is the straightforward way, there are other ways of addressing errors. They often require more work but can have such benefits as not triggering adverse social mechanisms. One such route is to get really enthusiastic about the wrong idea and carry it to its more extreme side, where it becomes apparent how absurd the idea was in the first place. The original proposer gets to do the error correction and withdrawal. Thus one can be an active force for truth without being brutal about it.
Pretending to be enthusiastic about an idea in which one doesn't believe, for tactical reasons, can come off as condescending. It's not always a safe move.
If I am a doctor and complain that my patients don't allow themselves to be injected with a needle, it would be pretty premature to call them "anti-medicine" if they take pills without objections. True, you can't give pills to an unconscious person, and some substances don't really survive the stomach, so you can't fully replace needles with pills. However, as a doctor my job is to heal, and it would run counter to my objectives to insist on using only needles (especially if there are people who would otherwise not get any treatment). Even if I thought that needles are on average better and that conditions should be improved so that needle use is enabled.
Similarly with sanity: insisting on particular methods can make you tackle a narrower problem than the one you are facing.
Being right while being a dick about it is being a dick. If people then object saying you're acting like a dick, reacting "but I'm RIGHT!" doesn't address that.
It looks to me like people are put off by demonstrations of superiority, not by demonstrations of rationality.
Looking at this topic more broadly than solely in terms of HPMOR and its reviews, I would argue that for many people their exposure to the concept of rationality is predominantly made up of half-rationalists.
Rationality is hard. It gives us tools that allow us to update old preconceptions of the world. However, in practice we will often fail in our rationality due to insufficient information or other cognitive limits, while still identifying our actions as superior due to rational principles. It is very off-putting to see others claiming superiority yet still being full of flaws in reasoning due to bounded rationality. From your perspective you might clearly see a flaw in their reasoning, perhaps one you can't communicate well, even if from their perspective they have applied rationality.
This creates cognitive dissonance around accepting the idea that rationalism leads to better reasoning.
EY wrote a bit about the dangers of being half a rationalist within the body of this post, if you want to continue this train of thought.
I would argue that for many people their exposure to the concept of rationality is predominantly made up of half-rationalists.
Additionally, many people who call themselves 'rationalists' are, in my experience, just using the term to justify their own ideas whatever they are and dismiss those of others as automatically 'irrational'.
This is actually related to my pet theory that, at least in status-signalling terms, it is better to call oneself an "aspiring rationalist" rather than a "rationalist" full stop.
The problem with that is that the first is longer, less concise, and more awkward to use :-/
See also EY's comment on FB:
People who have this [status regulating] emotion leave angry reviews on Chapter 6 of HPMOR where they complain about how much Harry is disrespecting Professor McGonagall and they can’t read this story.
I wasn't very satisfied with Eliezer's comment there. He seemed to be missing the point of those complaints. They didn't mind necessarily that Harry acted as he did, but they found it out-of-character that McGonagall would respond to Harry's actions as she did.
Comes off as if it's an excerpt from Atlas Shrugged or something, with the wonderful hero so obviously superior to everyone...
Thanks for the link. I think the idea of status slapdown is (part of?) what I was getting at in 1).
Yes, it's that "rationality" is arrogant and condescending.
Most of the high-status people I have met go out of their way to be pleasant even to much lower-status people. The selfish ones do this because it's easier to manipulate friends than enemies; the others have other reasons such as being "nice" or "friendly", but the end result is the same.
Rationality, on the other hand, believes "that which can be destroyed by the truth should be," and tries to claim the highest possible status (others are illogical, have goals that don't make sense, and aren't trying to save all of history from, for example, acausal AI blackmail, and therefore are inferior). Understandably, this doesn't persuade most people. As the people of Melos found during the Peloponnesian War, abstract arguments about what is "right" don't carry weight unless both sides have enough strength/status to threaten the other.
Rationality is like mathematics: useful but not adequate on its own for guiding real-world decision-making. Other required ingredients for high effectiveness are ample domain-specific knowledge and mostly-accurate gut feelings. There are numerous examples of forebrain lesions that disrupt emotional processing also causing poor decision-making, and examples of irrational and rarely-self-aware people who are very successful by conventional measures.
In negotiations, an effective strategy is to argue within the other side's moral framework (set of norms). The strategy of calling the other side stupid tends not to work very well, unless it's part of an appeal within a third party's moral framework (e.g. a powerful observer's) to get them to intervene. Thus, two effective strategies would be 1. actually do tests to show that assimilating the Rationality memeplex improves conventional success, or 2. "be nicer to everyone" (as Hermione tells Harry in HPMOR).
Additionally, Rationality discussion dives right into some massively unpleasant topics: apocalypse scenarios, amorality on a civilization-wide scale, the insanity of baseline humans, and the epistemological uncertainty of the Universe. While true and often useful, there is a high cognitive cost for carrying around this information. For example: nearly all great mathematicians have been men, because men can in general survive while being very naive about the real world, while women in general in existing societies must spend additional effort to be liked in order to stay safe, leaving less mental effort available for mathematics. Thus, many people recognize that losing their false beliefs will make them less happy and less effective (e.g. unable to be part of a religion, or to relate to people as people instead of bags of evolved motivations) and choose not to participate in Rationality.
In short, if you tell someone they're stupid for not agreeing with you, don't expect to change their mind.
There's a trope / common pattern / cautionary tale, of people claiming rationality as their motivation for taking actions that either ended badly in general, or ended badly for the particular people who got steamrollered into agreeing with the 'rational' option.
People don't like being fooled, and learn safeguards against situations they remember as 'risky' even when they can't prove that this time there is a tiger in the bush. These safeguards protect them against insurance salesmen who 'prove' using numbers that the person needs to buy a particular policy.
Aren't we all forgetting something big and obvious[1] that's staring us in the face? :-/ There are people out there for whom "rationality" is counter to their values! Imagine someone who reads the horoscope every morning, who always trusts their gut feelings and emotions, who's a sincere believer in homeopathy, etc etc (whatever you think an irrational person believes). Such a person would probably strongly rationality, rationalists, and the complex of ideas surrounding rationality, for probably understandable reasons (i.e. if a group consistently belittles your treasured beliefs, you're liable to hate and dislike the group). Such people might dislike R!Harry because they'd see rationality as a magic feather, and seeing it work in the story (to an uncanny degree, I might add) would read as an author tract to them. Imagine a black person reading a fanfic where, through the power of !RACISM! (exaggeration mine), Harry gets everything handed to him on a silver platter.
[1]disclaimer: just because it's big and obvious doesn't mean it's actually more right or important, but only that it's easier to see and think about
Imagine someone who reads the horoscope every morning, who always trusts their gut feelings and emotions, who's a sincere believer in homeopathy, etc etc (whatever you think an irrational person believes). Such a person would probably strongly rationality, rationalists, and the complex of ideas surrounding rationality, for probably understandable reasons
A bit offtopic to the discussion itself, but trusting your "gut feelings" is rational in certain circumstances (or, in the more precise lingo, in certain conditions System 1 will be faster /and/ more correct than System 2). I actually don't remember whether HPMOR teaches it somewhere (if anybody knows, could you share a link?).
I often use this as a bridge when explaining applied rationality techniques to friends, because it makes the techniques more relatable and gives me a good opening ("I'm totally with you on trusting your gut feelings in situation X, and here're some interesting explanations from psychology research on why it works", using familiar topics to show your System 1 that research can explain life and that research can be interesting!, so then I can continue into "on the other hand it's probably not good to do the same in situation Y, here's some interesting research on why it's so"). It also helps dispel the Straw Vulcan view of rationality.
The point being that advocating for (applied) rationality can sometimes come across as saying "you, being human, have no idea how to make good decisions in an environment populated by humans, wipe everything clean and begin anew", instead of "making decisions is hard, but here're some things you already seem to do right, and here're some things you could get better at".
Such a person would probably strongly [missing verb?] rationality, rationalists, and the complex of ideas surrounding rationality, for probably understandable reasons
Since I kind of like your comment, I'd like to know how that sentence should have sounded. Strongly dislike, hate, mistrust?
All three options fit the bill, actually, but I was going for strongly dislike. Man, I must have been more tired than I realized to miss a whole word like that.
I don't think it's rationality per se, but rather flamboyant displays of exceptional intelligence. For most people, this is very difficult to like.
Given the social dynamics involved, I suspect this isn't easily fixable.
There's a difference between displays of intelligence and excessive displays of intelligence. A working definition could be that it's excessive when its purpose is to show off rather than to communicate the point at hand. (I think we agree here, just sort of thinking out loud and making sure.)
So obviously it's a bad idea to show off. To me the question really is what do you do when you need to communicate something, but doing so will be seen as excessive. What are the best strategies to mitigate the perceived excessiveness?
So obviously it's a bad idea to show off.
That really depends on the situation. Sometimes it's an excellent idea in order to establish dominance or to impress someone.
High status is useful.
What are the best strategies to mitigate the perceived excessiveness?
An obvious one would be to flavor your communication with some low-status signals.
I guess something along the lines of trying to impress people.
But I think Lumifer is right in saying that it depends on the situation. I meant to say that it's usually a bad idea to show off because it'll usually just make people dislike you.
If you go see a circus performance, do you think the performers are wrong to try to impress the audience?
Questioning normal rules, scripts, roles, etc. marks you as a defection risk. Defectors explicitly reason about cooperative behaviors in order to subvert them.
I think it is, they just phrase it differently - "I don't trust someone like that". For explicit reasoning about cooperative behaviors in order to subvert them, it's "too smart for his own good".
I think it is, they just phrase it differently
No, I still don't think so. The expression "defection risk" implies the one-shot prisoner's dilemma context, and such a situation is neither common in real life, nor do normal people think in such categories (correctly, too).
"I don't trust someone like that" should just be interpreted directly according to its plain meaning. Not trusting someone does not imply a PD-like context and/or an expectation of defection.
"Too smart for his own good" I understand as meaning "He's smart enough to figure out how to bend the rules or go around them, but he is not smart enough to figure all the consequences of that and weight them properly". Again, nothing related to PD.
The expression "defection risk" implies the one-shot prisoner's dilemma context, and such a situation is neither common in real life, nor do normal people think in such categories (correctly, too).
You're reading too much into what RomeoStevens wrote - at no point did he explicitly mention the one-shot Prisoner's Dilemma.
A pretty common usage here is to use the Prisoner's Dilemma as a simplified model (think spherical cow on a frictionless plane, or perfect gas) of many morally-relevant situations.
This model is not what people explicitly think about (just like people don't explicitly think about social status when they are outraged or dismissive, or don't explicitly think about expected utility when deciding), but it may still be a good (simplified) model of what people think.
And RomeoStevens is referring to what people think, he's just using "defection risk" as a shorthand. If you ask normal people, they'll usually talk in terms of trust.
You may object that the model is not good enough, but you'll need a better argument than "it's not what people think" (nobody is claiming it is); do you similarly object to discussing people's choices in terms of expected utility and opportunity costs?
You're reading too much into what RomeoStevens wrote - at no point did he explicitly mention the one-shot Prisoner's Dilemma.
The only two contexts I know where the expression "defection risk" has meaning are the prisoner's dilemma and Cold War spy/counterintelligence games.
he's just using "defection risk" as a shorthand
I think it's a wrong expression here both connotationally and denotationally.
You may object that the model is not good enough
I'm not saying it's not good enough, I'm saying it's wrong. To repeat myself, the one-shot PD is quite rare in real life. Trying to shoehorn "many morally-relevant situations" into this mold is not the right approach.
Yes, Prisoner's Dilemma, but not only one-shot. You added that detail in with no attempt at justification, and it is not justifiable.
(also, games similar to Prisoner's Dilemma with explicitly cooperative and defecting strategies, but details details)
Yes, Prisoner's Dilemma, but not only one-shot. You added that detail in with no attempt at justification, and it is not justifiable.
In the iterated Prisoner's Dilemma with communication possible, defecting is a pretty stupid move. I'm sure it happens in real life on occasion, but it's the rarity and not the norm.
With utterly vanilla, standard rules, yes.
You can have systems that are clearly instances of PD (or at least related enough that one would use the same term) where the payoff structure makes this not so clear.
Like, oh, defecting on a cooperator yields 100 points instead of the usual 5, and the other player gets -95. Then defection becomes profitable within 33 rounds of the end, not just 2.
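For concreteness, here's a minimal sketch (mine, not from the thread) of that break-even arithmetic in Python. The temptation (100) and sucker (-95) payoffs are the ones given above; the mutual-cooperation (4) and mutual-defection (1) payoffs are assumptions I've filled in to make the numbers land where stated:

```python
# Minimal sketch: when does a one-time defection against a tit-for-tat
# opponent start to pay in an iterated game with a skewed payoff matrix?
# T and S are from the comment above; R and P are ASSUMED values.
T, R, P, S = 100, 4, 1, -95  # temptation, reward, punishment, sucker

def cooperate_to_end(n):
    """Total payoff for cooperating through the last n rounds."""
    return R * n

def defect_now(n):
    """Defect with n rounds left against tit-for-tat: collect T once,
    then settle into mutual defection for the remaining n - 1 rounds."""
    return T + P * (n - 1)

# Walk backward from the end of the game until defection stops paying.
for n in range(1, 100):
    if defect_now(n) <= cooperate_to_end(n):
        print(f"Defection stops being profitable {n} rounds from the end")
        break
# With these payoffs, defecting beats cooperating whenever fewer than
# 33 rounds remain, rather than only in the last round or two.
```

Under ordinary payoffs (say T, R, P, S = 5, 4, 1, 0), the same loop shows defection paying only at the very end of the game, which is the contrast being drawn here.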
Also, add in uncertainty or accidental defections, and it ceases to be so crazy to defect. What if you play with two other players, and can see only the total number of defections and cooperations you faced? What if you aren't sure how many more moves are left? What if there is a continuum from cooperation to defection?
Well, let's get back to reality. We were talking about the way normal people think, remember?
So consider a normal person. When, in the course of his typical life, does he have to make choices in a PD situation? At work? When he's drinking beer and watching the game with his buddies or when she's watching a show and gossiping with her girlfriends? When looking for a mate? In relationships with parents or kids?
PD is a neat construct and certainly occurs in real life -- rarely. But for your regular bloke or gal it's a non-issue and they don't spend time thinking in terms of a theoretical situation they don't care about.
At this point, Emile's post seems appropriate ( http://lesswrong.com/r/discussion/lw/kox/why_are_people_put_off_by_rationality/b71b )
Your response to it was that defection risk means this one very specific thing. She said that it's LW-shorthand for a much more general thing.
Considering that you weren't the one originally using the term, maybe you should use the definition that makes sense?
I was reading reviews of HPMOR on Goodreads and I noticed that the people who didn't like the book were essentially put off by the rationality. They thought Harry was arrogant and condescending.
I don't think it's the rationality that puts them off. I appreciate good grammar, but if the protagonist of Harry Potter and the Misplaced Modifier spent the book correcting people's use of the word "whom", I would want to wring the git's neck.
I don't think it's necessarily a problem to suggest to people that they're wrong, but it can be off-putting to be told that one is solidly and provably wrong on a point.
I was reading reviews of HPMOR on Goodreads and I noticed that the people who didn't like the book were essentially "put off by the rationality". They thought Harry was arrogant and condescending.
Then I was thinking, a lot of people are "put off by rationality" for similar reasons. What a shame. There's a lot of value in spreading rationality, and this seems to be a big obstacle in doing so.
Any thoughts on how to make people less "put off by rationality"? I think the core issues are: