EDIT: Thanks to people not wanting certain words google-associated with LW: Phyg
Lesswrong has the best signal/noise ratio I know of. This is great. This is why I come here. It's nice to talk about interesting rationality-related topics without people going off the rails about politics/fail philosophy/fail ethics/definitions/etc. This seems to be possible because a good number of us have read the lesswrong material (sequences, etc), which inoculates us against that kind of noise.
Of course Lesswrong is not perfect; there is still noise. Interestingly, most of it comes from people who haven't read some relevant sequence and thereby make the default mistakes, or who don't address the community's best understanding of the topic. We are pretty good about downvoting and/or correcting posts that fail at the core sequences. However, there are other sequences too, many of them critically important to not failing at metaethics/thinking about AI/etc.
I'm sure you can think of some examples of what I mean. People saying things that you thought were utterly dissolved in some post or sequence, but they don't address that, and no one really calls them out. I could dig up a bunch of quotes but I don't want to single anyone out or make this about any particular point, so I'm leaving it up to your imagination/memory.
It's actually kind of frustrating seeing people make these mistakes. You could say that if I think someone needs to be told about the existence of some sequence they should have read before posting, I ought to tell them, but that's actually not what I want to do with my time here. I want to spend my time reading and participating in informed discussion. A lot of us do end up engaging with mistaken posts, but that lowers the quality of discussion here, because so much time and space gets spent battling ignorance instead of advancing knowledge and discussing real problems.
It's worse than just "oh, here's some more junk I have to ignore or downvote", because the path of least resistance ends up being "ignore any discussion that contains contradictions of the lesswrong scriptures", which is obviously bad. There are people who have read the sequences, know the state of the arguments, and still have some intelligent critique, but it's quite hard to tell the difference between that and yet another round of explaining the problem with "but won't the AI know what's right better than humans?". So I just ignore it all and miss a lot of good stuff.
Right now, the only stuff I can be reasonably confident is intelligent, informed, and interesting is the promoted posts. Everything else is a minefield. I'd like there to be something similar for discussion/comments: some way of knowing "these people I'm talking to know what they are talking about" without having to dig around in their user history or whatever. I'm not proposing a particular solution here, just saying I'd like there to be more high-quality discussion between more properly sequenced LWers.
There is a lot of worry on this site about whether we are too exclusive or too phygish or too harsh in our expectation that people be well-read, which I think is misplaced. It is important that modern rationality have a welcoming public face and somewhere that people can discuss without having read three years worth of daily blog posts, but at the same time I find myself looking at the moderation policy of the old sl4 mailing list and thinking "damn, I wish we were more like that". A hard-ass moderator righteously wielding the banhammer against cruft is a good thing and I enjoy it where I find it. Perhaps these things (the public face and the exclusive discussion) should be separated?
I've recently seen someone saying that no-one complains about the signal/noise ratio on LW, and therefore we should relax a bit. I've also seen a good deal of complaints about our phygish exclusivity, the politics ban, the "talk to me when you read the sequences" attitude, and so on. I'd just like to say that I like these things, and I am complaining about the signal/noise ratio on LW.
Lest anyone get the idea that no-one thinks LW should be more phygish or more exclusive, let me hereby register that I, for one, would like us all to enforce a little more strongly that people read the sequences, and even, in a horrifying manner, agree with them. You don't have to agree with me, but I'd just like to put it out there, as a matter of fact, that there are some of us who would like a more exclusive LW.
Upvoted.
I agree pretty much completely, and I think that if you're interested in Less Wrong-style rationality, you should either read and understand the sequences (yes, all of them) or go somewhere else. Edit, after many replies: This claim is too strong. I should have said instead that people should at least be making an effort to read and understand the sequences if they wish to comment here, not that everyone should read the whole volume before making a single comment.
There are those who think rationality needs to be learned through osmosis or whatever. That's fine, but I don't want it lowering the quality of discussion here.
I notice that, in topics that Eliezer did not explicitly cover in the sequences (and some that he did), LW has made zero progress in general. This is probably one of the reasons why.
An IRC conversation I had a while ago left me with a powerful message: people will pay lip service to keeping the gardens, but when it comes time to actually do it, nobody is willing to.
My recent post, which begins with the words "speaking of problems I don't know how to solve," explains how to get true beliefs in situations like the anthropic trilemma.
However, there is a bit of a remaining problem, since I don't know how to model the wrong way of doing things (naive application of Bayes' rule to questionable interpretations) well enough t...