Related: Politics is the Mind-Killer, Other-Optimizing
When someone says something stupid, I get an urge to correct them. Based on the stories I hear from others, I'm not the only one.
For example, some of my friends are into this rationality thing, and they've learned about all these biases and correct ways to get things done. Naturally, they get irritated with people who haven't learned this stuff. They complain about how their family members or coworkers aren't rational, and they ask how best to correct them.
I could get into the details of the optimal set of arguments to turn someone into a rationalist, or I could go a bit meta and ask: "Why would you want to do that?"
Why should you spend your time correcting someone else's reasoning?
One reason that comes up is that it's somehow valuable to change their reasoning. OK, when is that actually possible? At minimum, when:
- You actually know better than them.
- You know how to patch their reasoning.
- They will be receptive to said patching.
- They will actually change their behavior if they accept the patch.
It seems like it should be rather rare for those conditions to all be true, or even to be likely enough for the expected gain to be worth the cost, and yet I feel the urge quite often. And I'm not thinking it through and deciding, I'm just feeling an urge; humans are adaptation executors, and this one seems like an adaptation. For some reason "correcting" people's reasoning was important enough in the ancestral environment to be special-cased in motivation hardware.
I could try to spin an ev-psych just-so story about tribal status, intellectual dominance hierarchies, ingroup-outgroup signaling, and whatnot, but I'm not an evolutionary psychologist, so I wouldn't actually know what I was doing, and the details don't matter anyway. What matters is that this urge seems to be hardware, and it probably has nothing to do with actual truth or your strategic concerns.
It seems to happen to everyone who has ideas. Social justice types get frustrated with people who seem unable to acknowledge their own privilege. The epistemological flamewar between atheists and theists rages continually across the internet. Tech-savvy folk get frustrated with others' total inability to explore and use Google. Some aspiring rationalists get annoyed with people who refuse to decompartmentalize or who claim that something belongs to a separate magisterium.
Some of those border on being just classic blue-vs-green thinking, but from the outside, the rationality example isn't all that different. They all seem to be motivated mostly by "This person fails to display the complex habits of thought that I think are fashionable; I should {make fun of them | correct them | call them out}."
I'm now quite skeptical that my urge to correct reflects an actual opportunity to win by improving someone's thinking, given that I'd feel it whether or not I could actually help, and that it seems to be caused by something else.
The value of attempting a rationality intervention has gone back down towards baseline, but it's not obvious that the baseline value of rationality interventions is all that low. Maybe it's a good idea, even if there is a possible bias supporting it. We can't win just by reversing our biases; reversed stupidity is not intelligence.
The best reason I can think of to correct flawed thinking is if your ability to accomplish your goals directly depends on their rationality. Maybe they are your business partner, or your spouse. Someone specific and close who you can cooperate with a lot. If this is the case, it's near the same level of urgency as correcting your own.
Another good reason (to discuss the subject at least) is that discussing your ideas with smart people is a good way to make your ideas better. I often get my dad to poke holes in my current craziness, because he is smarter and wiser than me. If this is your angle, keep in mind that if you expect someone else to correct you, it's probably not best to go in making bold claims and implicitly claiming intellectual dominance.
An OK reason is that creating more rationalists is valuable in general. This one is less good than it first appears. Do you really think your comparative advantage right now is in converting this person to your way of thinking? Is that really worth the risk of social friction and expenditure of time and mental energy? Is this the best method you can think of for creating more rationalists?
I think it is valuable to raise the sanity waterline when you can, but using methods of mass instruction like writing blog posts, administering a meetup, or launching a whole rationality movement is a lot more effective than arguing with your mom. Those options aren't for everybody of course, but if you're into waterline-manipulation, you should at least be considering strategies like them. At least consider picking a better time.
Another reason that gets brought up is that turning people around you into rationalists is instrumental in a selfish way, because it makes life easier for you. This one is suspect to me, even without the incentive to rationalize. Did you also seriously consider sabotaging people's rationality to take advantage of them? Surely that's nearly as plausible a priori. For what specific reason did your search process rank cooperation over predation?
I'm sure there are plenty of good reasons to prefer cooperation, but of course no search process was ever run. All of these reasons that come to mind when I think of why I might want to fix someone's reasoning are just post-hoc rationalizations of an automatic behavior. The true chain of cause-and-effect is observe->feel->act; no planning or thinking involved, except where it is necessary for the act. And that feeling isn't specific to rationality, it affects all mental habits, even stupid ones.
Rationality isn't just a new memetic orthodoxy for the cool kids, it's about actually winning. Every improvement requires a change. Rationalizing strategic reasons for instinctual behavior isn't change, it's spending your resources answering questions with zero value of information. Rationality isn't about what other people are doing wrong; it's about what you are doing wrong.
I used to call this practice of modeling other people's thoughts to enforce orthodoxy on them "incorrect use of empathy", but in terms of ev-psych, it may be exactly the correct use of empathy. We can call it Memetic Tribalism instead.
(I've ignored the other reason to correct people's reasoning, which is that it's fun and status-increasing. When I reflect on my reasons for writing posts like this, it turns out I do it largely for the fun and internet status points, but I try to at least be aware of that.)
So we should focus on increasing our social skills, with the specific goal of befriending influential people and influencing politics. Without officially becoming politicians ourselves, because that messes with one's brain. Unless we consciously decide to sacrifice a few of us.
Can we agree on which political goals would be desirable? Funding for aging research seems like a good candidate. (Even a libertarian opposed to any kind of taxation and government spending could agree that, assuming the government already takes the money and spends it anyway, it is better for the money to be spent on aging research than on research into rare diseases of cute puppies.) Opposing obvious stupidities could be another. Unfortunately, no politician can become popular by doing nothing but opposing obvious stupidities, although I personally would love to see more such politicians.
Then we would need a proper protocol for sacrificing rationalists to politics. A rationalist who becomes a politician could fix a lot of things, but inevitably they would stop being a rationalist. I guess it is impossible to keep a functioning mind in that environment... and even if by some miracle one could, they could not discuss their opinions and goals openly on LW anyway.
Actually, the LW community could easily ruin a rational politician's career by asking them questions where an honest answer means political suicide, but a less-than-honest answer is easily disproved by the community. Imagine a Prisoner's Dilemma in which two politicians agree to support each other's ideas for mutual benefit. Each of them dislikes the other's idea, but considers the world with both ideas implemented better than the world with neither. But for the plan to work, both politicians must pretend to support both ideas wholeheartedly. And now the LW community would openly ask the formerly-rationalist politician about the other idea, and present their own speculations about the motives; saying "please, for greater utility, let's not discuss this" would probably have the opposite effect.
So there would need to be some firewall between the politician and the community. For example, the politician might interact with the community only in specific articles, where only explicitly allowed topics may be discussed. (You could send the politician a PM suggesting a new topic, but you would be forbidden to say publicly that you had done so.)
I cannot determine whether this is presented ironically.