Or, what do you want to see more or less of from Less Wrong?
I'm thinking about community norms, content and topics discussed, karma voting patterns, et cetera. There are already posts and comment sections filled with long lists of proposed technical software changes/additions; let's not make this post another one.
My impression is that people sometimes make discussion posts about things that bother them; sometimes a bunch of people agree, sometimes a bunch disagree, but most people don't care that much (or they have a life or something) and thus don't want to dedicate a whole post to complaining. This post is meant to make it socially and cognitively easy to offer critique.
I humbly request that you list downsides of existing policies even when you think the upsides outweigh them, for all the obvious reasons. I also humbly request that you list a critique/gripe even if you don't want to bother explaining why you have that critique/gripe, and even in cases where you think your gripe is, ahem, "irrational". In general, I think it'd be really cool if we erred on the side of listing things which might be problems even if there's no obvious solution or no real cause for complaint except for personal distaste for the color green (for example).
I arrogantly request that we try to avoid impulsive downvoting and non-niceness for the duration of this post (and others like it). If someone wants to complain that Less Wrong is a little cultish without explaining why, then downvoting them to oblivion, while admittedly kind of funny, is probably a bad idea. :)
Has this topic been discussed in detail?
Personally, the reputation system has mainly taught me how to play, but not for what reasons, beyond maximizing my karma score. It works like a dog collar, administering electric shocks when the dog approaches a certain barrier. The dog learns where it can go, but only on grounds of pain.
Humans can often infer only a little from the change of a number, and the little they do learn is mostly misinterpreted. People complaining about downvotes are a clear indication that this is the case.
When people write, "I know I'll be downvoted for this, but...", what they mean is that they have learned that what they are about to write will be punished, but that they do not know why, and that they are more than superficially interested in learning how they are wrong.
Has it been shown that reputation systems cultivate discourse and teach novel insights, rather than turning communities into echo chambers and their members into karma-score maximizers?
If accumulating karma were my sole intention, I could probably accumulate a lot of it. It is only because I often ignore what I have learned about the reputation system, and write what interests me, that I manage to put forth some skepticism. But can a community that is interested in truth and the refinement of rationality rely on people ignoring the social pressure and strong incentives applied by a reputation system, in favor of honesty and diversity?
How much of what is written on Less Wrong, and how it is written, is an effect of the reputation system? How much is left unsaid?
I do not doubt that reputation systems can work in principle. If everyone involved were perfectly rational, with a clear goal in mind, a reputation system could provide valuable feedback. But once you introduce human nature, it might become practically infeasible, or have adverse side effects.
Perhaps we should have a social norm of asking anyone who says "I know I'll be downvoted for this" why they think so.