Today's post, Rationality: Common Interest of Many Causes, was originally published on 29 March 2009. A summary (taken from the LW wiki):

 

Many causes benefit particularly from the spread of rationality - because it takes a little more rationality than usual to see their case, as a supporter, or even just a supportive bystander. Not just the obvious causes like atheism, but things like marijuana legalization. In the case of my own work this effect was strong enough that after years of bogging down I threw up my hands and explicitly recursed on creating rationalists. If such causes can come to terms with not individually capturing all the rationalists they create, then they can mutually benefit from mutual effort on creating rationalists. This cooperation may require learning to shut up about disagreements between such causes, and not fight over priorities, except in specialized venues clearly marked.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Church vs. Taskforce, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


There are many movements which would benefit from more rationality. But there are also many movements which would be harmed by more rationality. I wonder whether the latter have already formed a silent coalition...

My friends have a proverb: "The easiest way to recognize an intelligent person is that the moment they enter the room, all the idiots instinctively join forces against them." And it often seems to work (possible selection bias here).

I can imagine that when a person enjoys something irrational (horoscopes, some political movement, etc.) and is often met with rational opposition, it may condition them to hate anything that sounds too rational. Later, when they hear someone using rational arguments, even on something completely unrelated to their favourite cause, they get angry.

That's a sure sign that your friends aren't very bright. Charisma is itself a learned skill, and intelligent people very often possess it. While charisma is not a sure sign of intelligence, if nobody likes you, then you're in trouble.

"All the idiots" is a pretty arbitrary term, and it is also really questionable - how do you define idiots, then? As those who disagree with you?

Manipulating idiots is generally easier than manipulating average people, and idiots are not a single group - low-functioning people often fight amongst themselves.

I agree that one can learn skills to impress people, and those skills are very useful. Also that intelligent people have an advantage in this: the intelligence itself is impressive, and can help them learn those skills faster.

On the other hand, humans are not automatically strategic (even the intelligent ones), "charisma" is not a one-dimensional value (different things impress different people), and there is also some tension... let's say it this way: most good scientists would fail at a car salesman's job, and most car salesmen would fail to do good science.

The quoted proverb does not use LW terminology, so "idiots" in this context means highly irrational people, and "an intelligent person" means a person who is both intelligent and rational. The translation would be that irrational people are surprisingly quick at detecting rational people and forming coalitions against them.

The problem with "manipulating idiots" is when you are also trying to accomplish a different goal with a different audience at the same time. For example, your audience is a mix of 50% reasonable people and 50% "idiots", and you are trying to explain a reasonable plan to the reasonable people... but either you speak to the reasonable people and the "idiots" interrupt you, or you manipulate the "idiots" and the reasonable people see the problems with your arguments.

Do you really expect that curbing discussion of which projects are best is likely to do anything but help the projects that do not abide by such decisions? Indeed, do you think it will lead to better decisions or more people joining in? Because I don't.

I don't think it is a rational outcome, or a likely outcome at all. If I feel my project is the best project, then it is entirely rational for me to promote said project. If your criticism of other projects is indeed reasonable, then it would be unreasonable not to voice those concerns, especially if you feel that the dollars are unlikely to be well spent.

Really, this is exactly the sort of thing that someone with a bad project would tend to promote in the first place, isn't it?