Occasionally, concerns are raised from within Less Wrong that the community is too homogeneous. The observation of homogeneity is certainly true insofar as the community shares common views that are minority views in the general population.
Maintaining a High Signal-to-Noise Ratio
The Less Wrong community shares an ideology that it is calling ‘rationality’ (despite some attempts to rename it, this is what it is). A burgeoning ideology needs a great deal of faithful support in order to develop on its own terms. By this, I mean that the ideology needs a chance to define itself as it would define itself, without competing influences watering it down, adding impure elements, or distorting it. In other words, you want to cultivate a high signal-to-noise ratio.
For the most part, Less Wrong is remarkably successful at cultivating this high signal-to-noise ratio. A common ideology attracts people to Less Wrong, and karma is then used to maintain fidelity. Karma protects Less Wrong from the influence of outsiders who just don't "get it", and it guides and teaches people who are reasonably near the ideology but need some training in rationality. Thus, karma is awarded for views that align especially well with the ideology, that align reasonably well, or that align with one of the directions in which the ideology is reasonably evolving.
Rationality Is Not a Religion – Or Is It?
Therefore, on Less Wrong, a person earns karma by expressing views from within the ideology. Wayward comments are discouraged with downvotes. Sometimes, even, an ideological toe is stepped on, and the disapproval is more explicit. I’ve been told, here and there, in one way or another, that expressing extremely dissenting views is: stomping on flowers, showing disrespect, not playing along, being inconsiderate.
So it turns out: the conditions necessary for the faithful support of an ideology are not that different from the conditions sufficient for developing a cult.
But Less Wrong isn't a religion or a cult. It wants to identify and uproot illusion, not create a safe place to cultivate it. Somewhere, Less Wrong must be able to challenge its basic assumptions and see how they hold up against all the evidence, new evidence included. You have to allow brave dissent.
- Outsiders who insist on hanging around can help by pointing to assumptions that those who "get it" take to be self-evident, but that aren’t obviously true, and that may be wrong.
- It’s not necessarily the case that someone challenging a significant assumption doesn’t get it and doesn’t belong here. Occasionally, someone with a dissenting view may represent the ideology better than the status quo does.
Shouldn’t there be a place where people who think they are more rational (or better than rational) can say, “hey, this is wrong!”?
A Solution
I am creating this top-level post for people to express dissenting views that are simply too far from the main ideology to be expressed in other posts. If successful, it would serve two purposes. First, it would move extreme dissent away from the other posts, thus maintaining fidelity there. People who want to play at the “rationality” ideology can play without other, irrelevant points of view spoiling the fun. Second, it would allow dissent for those in the community who are interested in not being a cult, challenging first assumptions, and suggesting ideas for improving Less Wrong without being traitorous. (By the way, karma must still work the same, or the discussion loses its value relative to the rest of Less Wrong. Be prepared to lose karma.)
Thus I encourage anyone (outsiders and insiders) to use this post, “Dissenting Views”, to answer the question: Where do you think Less Wrong is most wrong?
No: an example of a technique that is harmful, but whose harm would have been difficult for a reasonable person to predict in advance. The potential downside of the cookie trick is easy to notice and easy to reverse (well, you can't easily reverse gaining epsilon weight, but you can limit it to epsilon), so it is very weak as a reason not to try.
I take my point back. If you can only try one thing, it makes sense to just act if there is only one option, but to demand a good reason before wasting your chance if there are multiple options. (Formally, this is because the opportunity cost of failure is greater in the latter case.) Realistically, "willpower to engage in psychological modification" seems like it would often be a limiting factor producing this effect; still, I would expect irrational choice avoidance to be a factor in many cases of people demanding a reason to favor one option.
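To make the opportunity-cost point concrete, here is a toy model (my own illustration, with assumed probabilities and payoffs; it is not from the original comment). Suppose each candidate technique $i$ succeeds with probability $p_i$ and yields benefit $B_i$, a failed attempt costs essentially nothing, and you have the willpower for exactly one attempt:

$$\text{one option: try iff } p_1 B_1 > 0; \qquad \text{many options: try } i \text{ iff } p_i B_i \ge \max_{j \ne i} p_j B_j.$$

With a single option, any positive expected gain justifies acting. With several, attempting $i$ forgoes the best alternative, so it is rational to demand evidence that $i$ actually tops the list, which is exactly a "good reason".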