I'm worried that LW doesn't have enough good contrarians and skeptics, people who disagree with us or like to find fault in every idea they see, but do so in a way that is often right and can change our minds when they are. I fear that when contrarians/skeptics join us but aren't "good enough", we tend to drive them away instead of improving them.
For example, I know a couple of people who occasionally had interesting ideas that were contrary to the local LW consensus, but were (or appeared to be) too confident in their ideas, both good and bad. Both people ended up being repeatedly downvoted and left our community a few months after they arrived. This must have happened more often than I have noticed (partly evidenced by the large number of comments/posts now marked as written by [deleted], sometimes with whole threads written entirely by deleted accounts). I feel that this is a waste that we should try to prevent (or at least think about how we might). So here are some ideas:
- Try to "fix" them by telling them that they are overconfident and give them hints about how to get LW to take their ideas seriously. Unfortunately, from their perspective such advice must appear to come from someone who is themselves overconfident and wrong, so they're not likely to be very inclined to accept the advice.
- Create a separate section with different social norms, where people are not expected to maintain the "proper" level of confidence and niceness (on pain of being downvoted), and direct overconfident newcomers to it. Perhaps through no-holds-barred debate we can convince them that we're not as crazy and wrong as they thought, and then give them the above-mentioned advice and move them to the main sections.
- Give newcomers some sort of honeymoon period (marked by color-coding of their usernames or something like that), where we ignore their overconfidence and associated social transgressions (or just be extra nice and tolerant towards them), and take their ideas on their own merits. Maybe if they see us take their ideas seriously, that will cause them to reciprocate and take us more seriously when we point out that they may be wrong or overconfident.
OTOH, I don’t think groupthink is a big problem. Criticism by folks like Will Newsome, Vladimir Slepnev, and especially Wei Dai is often upvoted. (I upvote almost every comment by Dai or Newsome, if I don’t forget to. Dai always makes very good points, and Newsome is often wrong but also hilariously funny, or just brilliant and right.) Of course, folks like this Dmytry guy are often downvoted, but IMO with good reason.
Stream of consciousness. Judge me that ye may be judged. If you judge it by first-level Less Wrong standards, it should be downvoted (vague, unjustified assertions; thoughtlessly rude), but maybe the information is useful. I look first for the heavily downvoted posts and enjoy the responses to them best.
I found the discussion on dietary supplementation interesting, in your link and elsewhere. As I recall, the tendency was for the responses (not the entrants, but people's comments around town) to be both crazy and stupid (with many exceptions, e.g., Yvain, Xacharaiah). I recall another thread on the topic where the correct comment ("careful!") was downvoted and the obvious explanation for it ("evolution works!"), offered afterward, was upvoted. Since I detected no secondary reasons for this, it was interesting in implying that Less Wrongians did not see the obvious. Low certainties attached, since I know I know nothing about this place. I'm deliberately being vague.
In general, Less Wrongians strike me as a group of people of impaired instrumental rationality who are working to overcome it. Give or take, most of you seem to be smarter than average but also less trustworthy, less able to exhibit strong commitments, etc. Probably this has been written somewhere hereabouts, but a lot of irrationalities are hard-to-overcome local optima; have you really gone far enough onto the other side? Incidentally, that could be a definition for x-rationality (if it hasn't been done already): actually epistemically rational enough that it's instrumentally useful. Probably a brutally hard threshold to achieve, and one that seems unmet here, as I believe I've seen threads comment.
I was curious about the background of the people offering lessons at the rationality bootcamp, and saw some blog entry by one of them against, oh, being conservative in outlook (re: risk aversion). It was incredibly stupid; I mean, almost exclusively circular reasoning. You obviously deviate from the norm in your risk aversion. You're not obviously more successful than the norm (or are you? perhaps I'm mistaken). Maybe it's just a tough row to hoe, but that's the real task.
Personal comment: I realize Dmytry has been criticized a bit elsewhere, and the voting trend doesn't support generalization to the community at large, but my conversation with him illustrates what I generally believe about this place. I knew more than he did. I said enough that he should have realized this. He didn't realize it and shoehorned his response into a boring framework. I had specific advice to give, which I didn't get to, and which I realized I was reluctant to give (most Less Wrong stuff seems weak to me).
A whole lot of Less Wrong seems to be going for less detail, less knowledge, and more use of frameworks of universal applicability with little precision. The sequences seem similar to me: boring where I can judge meaning, meaningless where I can't. And always too long. I've read about four paragraphs of them in total. The quality of conversation here is high for a blog, of course, but low for a good academic setting. Some of the mild sneering at academics around here sounds ridiculous (an AI researcher believes in God). AI's a weak field. All around, papers don't quite capture any field and are often way, way behind what people roughly feel.
Real question: Do you want me here?
I like you guys. I agree with you philosophically. I have nothing much to offer unless I put some effort into it (e.g., actually reading what people write). No confusion: you should be downvoting posts like this in general. You might want to make an exception 'cause it's worth hearing a particular rambling mindset once. My effort is better spent elsewhere (I can't imagine you'd disagree). I can't see anything that can be offered to me. I feel like I was more rational at age 7 than you are now (I wrote a pro-and-con list for castrating myself for the longevity and potential continuity-of-personality gains; e.g., maintaining the me of 7). A million other things. I'm working on real problems in other areas now.
??? Seriously?