Related: Politics is the Mind-Killer, Other-Optimizing
When someone says something stupid, I get an urge to correct them. Based on the stories I hear from others, I'm not the only one.
For example, some of my friends are into this rationality thing, and they've learned about all these biases and correct ways to get things done. Naturally, they get irritated with people who haven't learned this stuff. They complain about how their family members or coworkers aren't rational, and they ask what the best way to correct them is.
I could get into the details of the optimal set of arguments to turn someone into a rationalist, or I could go a bit meta and ask: "Why would you want to do that?"
Why should you spend your time correcting someone else's reasoning?
One reason that comes up is that changing their reasoning is valuable. OK, when is that actually possible? It requires that:
- You actually know better than them.
- You know how to patch their reasoning.
- They will be receptive to said patching.
- They will actually change their behavior if they accept the patch.
It seems like it should be rather rare for those conditions to all be true, or even to be likely enough for the expected gain to be worth the cost, and yet I feel the urge quite often. And I'm not thinking it through and deciding, I'm just feeling an urge; humans are adaptation executors, and this one seems like an adaptation. For some reason "correcting" people's reasoning was important enough in the ancestral environment to be special-cased in motivation hardware.
I could try to spin an ev-psych just-so story about tribal status, intellectual dominance hierarchies, ingroup-outgroup signaling, and whatnot, but I'm not an evolutionary psychologist, so I wouldn't actually know what I was doing, and the details don't matter anyway. What matters is that this urge seems to be hardware, and it probably has nothing to do with actual truth or your strategic concerns.
It seems to happen to everyone who has ideas. Social justice types get frustrated with people who seem unable to acknowledge their own privilege. The epistemological flamewar between atheists and theists rages continually across the internet. Tech-savvy folk get frustrated with others' total inability to explore and use Google. Some aspiring rationalists get annoyed with people who refuse to decompartmentalize or who claim that something is in a separate magisterium.
Some of those border on being just classic blue vs green thinking, but from the outside, the rationality example isn't all that different. They all seem to be motivated mostly by "This person fails to display the complex habits of thought that I think are fashionable; I should {make fun | correct them | call them out}."
I'm now quite skeptical that my urge to correct reflects an actual opportunity to win by improving someone's thinking, given that I'd feel it whether or not I could actually help, and that it seems to be caused by something else.
The value of attempting a rationality intervention has gone back down towards baseline, but it's not obvious that the baseline value of rationality interventions is all that low. Maybe it's a good idea, even if there is a possible bias supporting it. We can't win just by reversing our biases; reversed stupidity is not intelligence.
The best reason I can think of to correct flawed thinking is if your ability to accomplish your goals directly depends on their rationality. Maybe they are your business partner, or your spouse; someone specific and close with whom you cooperate a lot. If this is the case, it's nearly as urgent as correcting your own reasoning.
Another good reason (to discuss the subject at least) is that discussing your ideas with smart people is a good way to make your ideas better. I often get my dad to poke holes in my current craziness, because he is smarter and wiser than me. If this is your angle, keep in mind that if you expect someone else to correct you, it's probably not best to go in making bold claims and implicitly claiming intellectual dominance.
An OK reason is that creating more rationalists is valuable in general. This one is less good than it first appears. Do you really think your comparative advantage right now is in converting this person to your way of thinking? Is that really worth the risk of social friction and expenditure of time and mental energy? Is this the best method you can think of for creating more rationalists?
I think it is valuable to raise the sanity waterline when you can, but using methods of mass instruction like writing blog posts, running a meetup, or launching a whole rationality movement is a lot more effective than arguing with your mom. Those options aren't for everybody, of course, but if you're into waterline-manipulation, you should at least be considering strategies like them. At least consider picking a better time.
Another reason that gets brought up is that turning people around you into rationalists is instrumental in a selfish way, because it makes life easier for you. This one is suspect to me, even without the incentive to rationalize. Did you also seriously consider sabotaging people's rationality to take advantage of them? Surely that's nearly as plausible a priori. For what specific reason did your search process rank cooperation over predation?
I'm sure there are plenty of good reasons to prefer cooperation, but of course no search process was ever run. All of these reasons that come to mind when I think of why I might want to fix someone's reasoning are just post-hoc rationalizations of an automatic behavior. The true chain of cause and effect is observe → feel → act; no planning or thinking involved, except where it is necessary for the act. And that feeling isn't specific to rationality; it affects all mental habits, even stupid ones.
Rationality isn't just a new memetic orthodoxy for the cool kids, it's about actually winning. Every improvement requires a change. Rationalizing strategic reasons for instinctual behavior isn't change, it's spending your resources answering questions with zero value of information. Rationality isn't about what other people are doing wrong; it's about what you are doing wrong.
I used to call this practice of modeling other people's thoughts to enforce orthodoxy on them "incorrect use of empathy", but in terms of ev-psych, it may be exactly the correct use of empathy. We can call it Memetic Tribalism instead.
(I've ignored the other reason to correct people's reasoning, which is that it's fun and status-increasing. When I reflect on my reasons for writing posts like this, it turns out I do it largely for the fun and internet status points, but I try to at least be aware of that.)
The nearest reason I want people around me to become more rational is that irrationality (in some specific forms) repels me. I admit this is how being a member of a tribe can feel from the inside. (In a parallel branch of the multiverse I could be a theist, repelled by atheists. I mean, how could you not dislike people who throw away infinities of utilons, only because they are so overconfident about their human reasoning abilities, which the outside view suggests are pretty pathetic?)
But I also believe that having a higher sanity waterline is a good thing. With a specific person, sabotaging their rationality to exploit them may sometimes bring me more utilons than cooperating with them. But what about the population as a whole? I enjoy a higher standard of living, I enjoy having the internet, I enjoy being able to hear different opinions and not having to follow religious leaders. I would enjoy it even more if driverless cars became commonplace, if medicine could make us live longer and better, and if psychology could help significantly beyond the placebo effect. All these things require some general standard of rationality.
We often complain about how low that standard is, so for the sake of fairness I would like to note that it could be much lower. Imagine a society where every problem is solved by asking the local shaman, the typical answer is that the problem was caused by a witch, and you must kill the witch to fix it. And if you somehow step out of line, you become the best candidate for a witch. Some humans live like this, too.
If all the money and energy spent on horoscopes during the last century had been spent on medicine instead, maybe the average lifespan would now be 100 years, with 150 rather likely for those who take care to exercise and avoid sugar. Think about all the other improvements we could get if only people became more rational. (We would get some new harmful things, too.)
I agree that even if I feel that people should become more rational, trying to correct them is probably not the best way, and quite often it does more harm than good. (I mean harm to the person who wastes their time trying to correct others: wasted time and frustration.) I used to spend a lot of time correcting people online. Finding LessWrong helped me a lot; now that I know there is one website where people can discuss things rationally, the existence of the others feels less painful. It also helped to realize that inferential distances are too big to be overcome by a comment in a discussion. I feel certain in many situations that I know better than other people, but I have updated my estimated chance of fixing their reasoning to near epsilon. (Unless the other person specifically asks to be fixed, which almost never happens.) Writing a blog or starting a local rationalist group would be better. (I just need to overcome my akrasia.)
So, instead of doing stupid stuff that feels good, if we agree that having more rationalists on this planet is a good idea, what next? I know that CFAR is running workshops for a few dozen participants. The LessWrong blog is here, available to everyone. That is already pretty awesome, but it is unlikely to be the best thing that could be done. What else could have a higher impact?
My ideas after five minutes of thinking: write a book about rationality (books have higher status than blogs and can be read by people who don't procrastinate online); create a "LessWrong for dummies" website (obviously with a different name) explaining the uncontroversial LW/CFAR topics to the public in simplified form. Actually, we could start with the website and then publish it as a book, but that needs a lot of time and talent. Alternative idea: do something to impress the general population and make rationality more fashionable (moderate use of Dark Arts allowed); for example, organize a discussion about rationality at a university with rationalists who also happen to be millionaires (or otherwise high status), with minicamp-style exercises for participants as a follow-up. That requires rationalist celebrities and someone to run the exercises.
I don't think enough has been spent on horoscopes to do that much good. On the other hand, if people gave up on lotteries, that might have some impact.
I agree that figuring out how to teach rationality to people with average intelligence is an important goal, even if "Thinking Clearly for Dummies" is an amusing title.