Occasionally, concerns have been expressed from within Less Wrong that the community is too homogeneous. The observation is certainly true to the extent that the community shares views that are minority views in the general population.
Maintaining a High Signal to Noise Ratio
The Less Wrong community shares an ideology that it is calling ‘rationality’ (despite some attempts to rename it, this is what it is). A burgeoning ideology needs a lot of faithful support in order to develop true to itself. By this, I mean that the ideology needs a chance to define itself as it would define itself, without a lot of competing influences watering it down, adding impure elements, distorting it. In other words, you want to cultivate a high signal to noise ratio.
For the most part, Less Wrong is remarkably successful at cultivating this high signal to noise ratio. A common ideology attracts people to Less Wrong, and then karma is used to maintain fidelity. It protects Less Wrong from the influence of outsiders who just don't "get it". It is also used to guide and teach people who are reasonably near the ideology but need some training in rationality. Thus, karma is awarded for views that align especially well with the ideology, align reasonably well, or align with one of the directions in which the ideology is reasonably evolving.
Rationality is not a religion – Or is it?
Therefore, on Less Wrong, a person earns karma by expressing views from within the ideology. Wayward comments are discouraged with down-votes. Sometimes, even, an ideological toe is stepped on, and the disapproval is more explicit. I’ve been told, here and there, one way or another, that expressing extremely dissenting views is: stomping on flowers, showing disrespect, not playing along, being inconsiderate.
So it turns out: the conditions necessary for the faithful support of an ideology are not that different from the conditions sufficient for developing a cult.
But Less Wrong isn't a religion or a cult. It wants to identify and uproot illusion, not create a safe place to cultivate it. Somewhere, Less Wrong must be able to challenge its basic assumptions and see how they hold up to all the evidence, old and new. You have to allow brave dissent.
- Outsiders who insist on hanging around can help by pointing to assumptions that are thought to be self-evident by those who "get it", but that aren’t obviously true, and may even be wrong.
- It’s not necessarily the case that someone challenging a significant assumption doesn’t get it and doesn’t belong here. Occasionally, someone with a dissenting view may represent the ideology better than the status quo does.
Shouldn’t there be a place where people who think they are more rational (or better than rational) can say, “hey, this is wrong!”?
A Solution
I am creating this top-level post for people to express dissenting views that are simply too far from the main ideology to be expressed in other posts. If successful, it would serve two purposes. First, it would move extreme dissent away from the other posts, thus maintaining fidelity there. People who want to play at “rationality” ideology can play without other, irrelevant points of view spoiling the fun. Second, it would allow dissent for those in the community who are interested in not being a cult, challenging first assumptions and suggesting ideas for improving Less Wrong, without being traitorous. (By the way, karma must still work the same, or the discussion loses its value relative to the rest of Less Wrong. Be prepared to lose karma.)
Thus I encourage anyone (outsiders and insiders) to use this post “Dissenting Views” to answer the question: Where do you think Less Wrong is most wrong?
I've also had mixed feelings about the concept of being "less wrong." Anyone else?
Of course, it is harder to identify and articulate what is wrong than what is right: we know many ways of thinking that lead away from truth, but it is harder to know when ways of thinking lead toward the truth. So the phrase "less wrong" might merely be an acknowledgment of fallibilism. All our ideas are riddled with mistakes, but it's possible to make fewer mistakes, or less egregious ones.
Yet "less wrong" and "overcoming bias" sound kind of like "playing to not lose," rather than "playing to win." There is much more material on these projects about how to avoid cognitive and epistemological errors, rather than about how to achieve cognitive and epistemological successes. Eliezer's excellent post on underconfidence might help us protect an epistemological success once we somehow find one, and protect it even from our own great knowledge of biases, yet the debiasing program of LessWrong and Overcoming Bias is not optimal for showing us how to achieve such successes in the first place.
The idea might be that if we run as fast as we can away from falsehood, and look over our shoulder often enough, we will eventually run into the truth. Yet without any basis for moving towards the truth, we will probably just run into even more falsehood, because there are exponentially more possible crazy thoughts than sane thoughts. Process of elimination is really only good for solving certain types of problems, where the right answer is among our options and the number of false options to eliminate is finite and manageable.
If we are in search of a Holy Grail, we need a better plan than being able to identify all the things that are not the Holy Grail. Knowing that an African swallow is not a Holy Grail will certainly keep us from mistaking a bird for the true Holy Grail and calling off the search, but it tells us absolutely nothing about where to actually look for it.
The ultimate way to be "less wrong" is radical skepticism. As a fallibilist, I am fully aware that we may never know when or if we are finding the truth, but I do think we can use heuristics to move towards it, rather than merely trying to move away from falsehood and hoping we bump into the truth backwards. That's why I've been writing about heuristics here and here, and why I am glad to see Alicorn writing about heuristics to achieve procedural knowledge.
For certain real-world projects that shall-not-be-named to succeed, we will need to have some great cognitive and epistemological successes, not merely avoid failures.
And if you play the lottery long enough, you'll eventually win. When your goal is to find something, approach usually works better than avoidance. This is especially true for learning -- I remember reading a book in which a seminar presenter described an experiment he ran in his seminars: sending a volunteer out of the room while the group picked an object in the room.
After the volunteer returned, their job was to find the ob...