Note: After writing this post, I realized there's a lot I need to learn about this subject. I've been thinking a lot about how I use the word "elitism" and what it means to me. I was unaware that a large number of people use the word to describe themselves and mean something totally different from the definition I had. This led to my perception that people who were using the word to describe themselves were being socially inept. I now realize that it's not a matter of social ineptness; it may be more a matter of political sides. I also realized that mind-kill reactions may be influencing us here (myself included).

So now my goal is to make sure I understand both sides thoroughly, to transcend these mind-kill reactions, and to explain to others how I accomplished this so that none of us has to have them. I think these sides can get along better. That is what I ultimately want - for the gifted population and the rest of the world to understand one another better, for the privileged and the disadvantaged to understand one another better, and for the tensions between those groups to be reduced so that we can work together effectively. I realize that this is not a simple undertaking, but it is a very important problem to me. I see this being an ongoing project in my life. If I don't seem to understand your point of view on this topic, please help me update. I want to understand it.
TLDR: OMG a bunch of people seem to want to use the word "elitist" to describe LessWrong, but I know that this can provoke hatred. I don't want to be smeared as an elitist. I can't fathom why it would be necessary for us to call ourselves "elitists".
I have noticed a current of elitism on LessWrong. I know that not every person here is an elitist, but there are enough people here who seem to believe elitism is a good thing (13 upvotes!?) that it's worth addressing this conflict. In my experience, the word "elitism" is a triggering word - it's not something you can use easily without offending people. Acknowledging intellectual differences is also a touchy subject, one very likely to invite accusations of elitism. From what I've seen, I'm convinced that using the word "elitism" casually is a mistake, and that referring to intellectual differences incautiously is also risky.
Here, I analyze the motives behind the use of the word "elitism", suggest what the main conflict is, mention a possible solution, and discuss whether the solution is elitist, what elitism really means, and what the consequences may be if we allow ourselves to be seen as elitists.
The themes I see echoed throughout the threads where elitist comments surfaced are "We want quality" and "We want a challenging learning environment". I agree that quality goals and a challenging environment are necessary for refining rationality, but I disagree that elitism is needed.
I think the problem comes in at the point where we think about how challenging the environment should be. There's a conflict between the website's main vision of spreading rationality (detailed in Rationality: Common Interest of Many Causes) and the goal of striving for the highest quality standards possible (detailed in Well-Kept Gardens Die By Pacifism).
If the discussions are geared toward beginners, advanced people will not learn. If the discussions are geared toward advanced people, beginners are frustrated. It's built into our brains. Psychologist Mihaly Csikszentmihalyi, author of "Flow: The Psychology of Optimal Experience", regards flow, the feeling of motivation and pleasure you get when you're appropriately challenged, as the secret to happiness. He explains that if you aren't appropriately challenged, you will feel either bored or frustrated, depending on whether the challenge is too small or too great for your ability level.
Because our brains never stop rewarding and punishing us with flow, boredom and frustration, we strive for that appropriate challenge constantly. Because we're not all at the same ability level, we're not all going to flow during the same discussions. We can't expect this to change, and it's nobody's fault.
This is a real conflict, but we don't have to choose between the elitist move of blocking everyone who's not at our level and the flow-killing move of letting the challenge level in discussions decrease until everyone is apathetic - we can solve this.
Why bother to solve it? If your hope is to raise the sanity waterline, you cannot neglect those who are interested in rational thought but haven't yet gotten very far. Doing so would limit your impact to a small group, failing to make a dent in overall sanity. If you neglect the small group of advanced rationalists, you've lost an important source of rational insights that people at every level might learn from, and you will have failed to attract the few precious teachers who could help beginners develop further, faster.
And there is a solution, summarized in one paragraph: make several areas divided by level of difficulty. Advanced learners can learn in the advanced area, beginners in the beginner area. That way everyone learns. Not every advanced person is a teacher, but if you put a beginner area and an advanced area on the same site, some people from the advanced area will help the beginners get further. One-on-one teaching isn't the only option - advanced people might write articles for beginners and reach thousands at once. They might write practice quizzes for beginners to do (not hard to implement from a web developer's perspective; see the sketch below). There are other things. (I won't get into them here.)
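To make the "separate areas" idea concrete, here is a minimal sketch in Python (all names are hypothetical; nothing here is assumed about LessWrong's actual code) of posts tagged with a difficulty level and filtered into level-appropriate areas:

```python
from dataclasses import dataclass
from enum import IntEnum

class Level(IntEnum):
    BEGINNER = 1
    INTERMEDIATE = 2
    ADVANCED = 3

@dataclass
class Post:
    title: str
    level: Level

def area(posts: list[Post], level: Level) -> list[Post]:
    """Return only the posts tagged for the given difficulty area."""
    return [p for p in posts if p.level == level]

posts = [
    Post("What is a map, what is a territory?", Level.BEGINNER),
    Post("Conjunction fallacy case studies", Level.ADVANCED),
]

# Each reader browses the area matching their level, so everyone
# gets an appropriately challenging feed.
for p in area(posts, Level.BEGINNER):
    print(p.title)
```

A practice-quiz feature would be a similar exercise: store questions with the same level tag and serve readers only the questions from their chosen area.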
This brings me to another question: if LessWrong separates the learning levels, would the separation qualify as elitism?
I think we can all agree that people don't learn well in classes that are too easy for them. If you want advanced people to improve, it's an absolute necessity to have an advanced area. I'm not questioning that. I'm questioning whether it qualifies under the definition of elitism:
e·lit·ism
noun
1. practice of or belief in rule by an elite.
2. consciousness of or pride in belonging to a select or favored group.
Start with the first definition, rule by an elite. Spreading rationality empowers people. If you wanted to take power over them, you'd hoard it. By posting our rational insights in public, we share them. We are not hoarding them and demanding to be made rulers because of our power. We are giving them away and hoping they improve the world.
Using rationality as a basis for rule makes no sense anyway. If you have a better map of the territory, people should update because you have a better map (assuming you overcome inferential distances). Forcing an update because you want to rule would only amount to an appeal to authority or coercion. That's not rational. If you show them a more complete map and they update, that isn't about you - you should be updating your map when the time comes, too. It's the territory that rules us all. You are only sharing your map.
For the second definition, there are two pieces. "Consciousness of or pride in" and "select or favored group". I can tell you one thing for certain: if you form a group of intellectual elitists, they will not be considered "select or favored" by the general population. They will be treated as the scum on the bottom of scum's shoe.
For that reason, any group of intellectual elitists will quickly become an oxymoron. First, they'll have to believe that they are "select or favored" when they are not, and perhaps justify this with "we are so deserving of being select and favored that no one can see it but us" (which may make them hopelessly unable to update). Second, the attitude of superiority is likely to provoke such anti-intellectual counter-prejudice that the resulting oppression could make them ineffectual. Powerless to get anywhere because they are so hated, they will find that their "superiority" has made them second-class citizens. You don't achieve elite status by being an intellectual elitist.
In the event that LessWrong was considered "select" or "favored" by the outside population, would "consciousness" of that qualify the members as elitists? If you use the literal definition of "consciousness", you can claim a literal "yes" - but that would mean that simply acknowledging a (hypothetical) fact (established by independent market research surveys, we'll say) should be taken as automatic proof that you're an arrogant scumbag. That would be committing Yvain's "worst argument in the world": guilt by association. We can't assume that everyone who acknowledges popularity or excellence is guilty of wrongdoing.
So let's ask this: Why does elitism have negative connotations? What does it REALLY mean when people call a group of intellectuals "elitists"?
I think the answer to this is in Jane Elliott's brown eyes, blue eyes experiment. If you're not familiar with it, a school teacher named Jane Elliott, horrified by the assassination of Martin Luther King, Jr., decided to teach her class a lesson about prejudice. She divided the class into two groups - brown eyes and blue eyes. She told them things like brown-eyed kids are smarter and harder-working than blue-eyed kids. The children reacted dramatically: the group labeled superior became arrogant and domineering toward their classmates, while the group labeled inferior turned timid and their performance on classwork dropped.
When people complain of elitism, what they seem to be reacting to is a concern that feeling "better than others" will be used as an excuse for abuse - either via coercion, or by sabotaging their sense of self-worth and intellectual performance.
The goal of LessWrong is to spread rationality in order to make a bigger difference in the world. This has nothing to do with abusing people. Just because some people with advanced abilities use them as an excuse to abuse others doesn't mean that anybody here has to do that. And just because some of us might have advanced abilities, and are aware of them, doesn't mean we need to commit Yvain's "worst argument in the world" by assuming the guilt that comes with elitism. We can reject this sort of thinking. If people tell you that you're an elitist because you want a challenging social environment to learn in, or because you want to make the project that is the LessWrong blog as high quality as it can be, you can refuse to be labeled guilty.
Refusing to be guilty by association takes more work than accepting the status quo, but what would happen if we allowed ourselves to be disrespected for challenging ourselves and striving for quality? If we agree with our accusers, we're treating positive character traits as part of the problem. That encourages people to shoot themselves in the foot - and they can point that same gun at all of humanity's potential, demanding that nobody seek the challenging social environment they need to grow, and that nobody set learning goals to strive for, because quality standards are "elitist". To allow a need for challenges and standards to be smeared as elitist will only hinder the spread of rationality.
How many may forgo refining rationality because they worry it will make them look like an elitist?
These are the reasons I choose to be non-abusive and to send a message to the world that non-abusive intellectuals exist.
What do you think of this?
I don't think people who feel comfortable posting average YouTube comments are going to be welcome or useful at LessWrong; there are a lot of people like that, and I don't think this is a problem.
Raising the sanity waterline on a grand scale should affect the comments on YouTube, but we're a long way from that.
This being said, I'd like to see more rationality materials for people of average intelligence, but that's another long-term possibility. Not only does there not seem to be huge interest in the project, but figuring out simple explanations for new ideas is work, and it seems to be a relatively rare talent.
I only recently ran into a good simple explanation for Bayes -- that the more detailed a prediction becomes, the less likely it is to be true. And I got it from a woman who doesn't post on LW because she thinks the barriers to entry are too high. (It's possible that this explanation was on LW and I didn't see it, or it didn't register -- has anyone seen it here?)
There's some degree of natural sorting on LW-- I'm not the only person who doesn't read the more mathematical or technical material here, and I'm not commenting on that material, either.
I don't think having separate ranked areas is going to solve the problem of people living down to expectations.
That looks like part of the definition of probability.
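Presumably the relevant part is the conjunction rule, which is exactly the "more detail means less probability" point; a quick sketch (standard probability axioms, nothing specific to these comments is assumed):

```latex
% A detailed prediction is a conjunction of claims, and conjoining
% claims can only shrink probability:
P(A \land B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A)
% since P(B \mid A) \le 1: each added detail multiplies in another
% factor of at most 1.
```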
Bayes would be more like 'If you've got two ideas about what's going on, and one of them says one thing's going to happen, and the other says a different thing, but in the event it's the first thing that happens, then you should believe the first idea more and the second idea less'.
Or to get a bit less abstract, say you're playing Dungeons and Dragons, and an orc hits you with a sword, and you're pretty sure that orcs do ei...
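The comment is cut off here, but the kind of update it describes can be sketched with made-up numbers (the hypotheses, prior, and likelihoods below are illustrative, not taken from the original comment):

```python
# A minimal Bayes update: two hypotheses about which kind of orc
# hit you, and one observation (how much damage you took).
prior = {"ordinary orc": 0.8, "elite orc": 0.2}

# Illustrative likelihoods: probability of taking that much damage
# under each hypothesis.
likelihood = {"ordinary orc": 0.1, "elite orc": 0.6}

# Bayes' theorem: posterior is proportional to prior * likelihood.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}

print(posterior)
# {'ordinary orc': 0.4, 'elite orc': 0.6} -- the hypothesis that
# predicted the observation better gains credence, and the other
# loses it, just as the comment above describes.
```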