Agreed.
One person at the Paris meetup made the really interesting and AFAICT accurate observation that the more prominent a Less Wrong post was, the less likely it was to be high quality - ie comments are better than Discussion posts, which are better than Main posts (with several obvious and honorable exceptions).
I think maybe it has to do with the knowledge that anything displayed prominently is going to have a bunch of really really smart people swarming all over it and critiquing it and making sure you get very embarrassed if any of it is wrong. People avoid posting things they're not sure about, and so the things that get main-ed tend to be restatements of things that create pleasant feelings in everyone reading them without rocking any conceivable boat, and the sort of overly meta- topics you're talking about lend themselves to those restatements - for example "We should all be more willing to try new things!" or "Let's try to be more alert for biases in our everyday life!"
Potential cures include greater willingness to upvote posts that are interesting but imperfect, greater willingness to express small disagreements in "IAWYC but" form, and greater willingness to downvote posts that are applause lights or don't present non-obvious new material. I'm starting to do this, but hitting that downvote button when there's nothing objectively false or stupid about a post is hard.
I agree that theoretical-sciency-mathy-insightful stuff is less common now than when Eliezer was writing posts regularly. I suspect this is largely because writing such posts is hard. Few people have that kind of knowledge, thinking ability, and writing skill, as well as the time to do the writing.
As someone who spends many hours writing posts only to have them nit-picked to death by almost everyone who bothers to comment, I appreciate your advice to "express small disagreements in 'IAWYC but' form."
As for your suggestion to downvote posts that "don't present non-obvious new material," I'm not sure what to think about that. My recent morality post probably contains only material that is obvious to someone as thoroughly familiar with LW material as yourself or Phil Goetz or Will Newsome or Vladimir Nesov or many others, but on the other hand a great many LWers are not quite that familiar, or else haven't taken the time to apply earlier lessons to a topic like morality (and were thus confused when Eliezer skipped past these basics and jumped right into 'Empathic Metaethics' in his own metaethics sequence).
I'm somewhat puzzled by your terminology since the topics you call "meta-rationality":
about how to be rational, how to avoid akrasia, and so on.
strike me as much more practical and applied than the ones you call "applied rationality":
philosophy, value, and possible futures
which strike me as much more meta.
Going by the list of topics you're complaining about, it appears that you are the one who "would rather talk about rationality than use it."
Phil's terminology is probably how I would have worded the same thing.
Posts that talk about things like "how do we use the anthropic principle", "what is morality", "what decision theory makes sense", "what is a mysterious answer to a mysterious question", etc. all seem object-level...
...whereas there's another class of posts that always uses the word "rationality" - ie "how can we be more rational in our lives", "how can we promote rationality", "am I a good enough rationalist if..." "who is/isn't a rationalist" et cetera, and these seem properly termed meta-level because they involve being rational about rationality.
I have a feeling the latter class of posts would benefit if they tried to taboo "rationality".
Does the discussion of rationality techniques have a larger market than debates over Sleeping Beauty? (I'm even beginning to miss those!)
Wow, I'd forgotten all about those. Those days were fun. We actually had to, well, think occasionally. Nothing remotely challenging has cropped up in a while!
Those days were fun. We actually had to, well, think occasionally. Nothing remotely challenging has cropped up in a while!
If you like thinking about challenging theoretical rationality problems, there are plenty of those left (logical uncertainty/bounded rationality, Pascal's wager/mugging/decision theory for running on error-prone hardware, moral/value uncertainty, nature of anticipation/surprise/disappointment/good and bad news, complexity/Occam's razor).
I've actually considered writing a post titled "Drowning in Rationality Problems" to complain about how little we still know about the theory of rationality and how few LWers seem to be working actively on the subject, but I don't know if that's a good way to motivate people. So I guess what I'd like to know is (and not in a rhetorical sense), what's stopping you (and others) from thinking about these problems?
By the way, have you seen how I've been using MathOverflow recently? It seems that if you can reduce some problem to a short math question in standard terms, the default next action (after giving it your own best shot) should be posting it on MO. So far I've posted two problems that interested me, and both got solved within an hour.
But decision theory ought to be a natural attractor for anyone with intellectual interests (any intellectual question -> how am I supposed to answer questions like that? -> epistemology -> Bayesianism -> nature of probability -> decision theory). What's stopping people from getting to the end of this path? Or am I just a freak in my tendency to "go meta"?
What's stopping people from getting to the end of this path?
The wealth of interesting stuff located well before the end.
I'm certainly trying to apply rationality to solve big important problems, but that is taking me a while. About half of my posts so far have been written (sneakily) for the purpose of later calling back to them when making progress in metaethics and CEV.
I share Phil's perception that LW is devoting more time to what you might call "practical rationality in everyday life" and less to the theory of rationality, and his feeling that that's less interesting (albeit perhaps more useful).
I share everyone else's opinion that Phil's terminology is bizarre.
My main concern about Less Wrong recently has been the proliferation of posts related to the Singularity and HP: MoR, which I frankly don't care about. For a site that encourages people to think outside the box, it's at times biased against unorthodox opinions, or at least, I get downvoted for arguing against the Singularity and immortality and for pointing out flaws in MoR. At these times the site seems cultish in a way that makes me feel uncomfortable.
I was drawn here both by Eliezer's meta-rationality posts and by discussions about quantum mechanics, ph...
If you want to talk about quantum mechanics, philosophy of mathematics, game theory, and such, why not start threads about those topics instead of arguing against the Singularity and immortality and pointing out flaws in MoR - things you don't even care about?
In my vision for the future of the rationalist community, most members are interested in the core of meta-rationality and anti-akrasia and each is interested in a set of peripheral topics (various ways of putting rationality into practice, problems like Sleeping Beauty, trading tutoring, practicing skills, helping the community in practical ways, study groups, social meetings with rationalists, etc.). Some fringe members will be involved in the peripherals and rationality applications but not theory, but they probably won't last long. LW is the core, and will...
I wish there were more posts that tried to integrate math with verbal intuitions, relative to posts that are either all the way on the math side or all the way on the words side.
It seems rather like Eliezer Yudkowsky's blog without (much) Eliezer Yudkowsky.
Which is unfortunate - if understandable.
I think that less Singularity discussion is the result of the related topics having been already discussed many times over. There hasn't been a new idea in AI and decision theory for a while. I'm not implying, though, that we've finished these topics once and for all. There is certainly a huge amount of stuff to be discovered; it's just that we don't seem to happen upon much of it these days.
Quality is a bigger concern than subject matter. But that is easily solved by just reading posts, mainly posts by Luke. :)
Is the old concern with values, artificial intelligence, and the Singularity something for LW to grow out of?
"A community blog devoted to refining the art of human rationality" suggests those aren't actually the focus, and that when LW grows up it won't be about AI and the Singularity.
I do agree that some more application would be good, but that tends to go in Discussion, if at all. Better there than nowhere.
One of the big things about improving rationality is 'Getting Crap Done', and I think the problem is that for an online community wherein most of us are anonymous, there's not a lot on here to help us with that.
Now this site has helped me conceptualize and visualize in a way that I didn't realize was possible. It helped me to see things as they are, and how things could be. The problem is that whilst I'm flying ahead in terms of vision, I still sleep in and get to work late, I still play World of Warcraft instead of going to the local Toastmasters meetup, I still...
Is not 'how to be rational, how to avoid akrasia' how one puts 'rationality into practice'? Without hard-working producers there is no Singularity.
+1 for suitable filtering, or a decent subclustering that keeps everyone happy.
I would bet that we'll see a resurgence of discussion on decision theory, anthropics, etc. in the next few months. If I'm as typical a user as I think I am, then there are a dozen or so people who were largely drawn to LessWrong by those topics, but who stayed silent as they worked on leveling up. lukeprog's recent posts will probably accelerate that process.
More and more, LessWrong's posts are meta-rationality posts, about how to be rational, how to avoid akrasia, and so on. This is probably the intended purpose of the site. But they're starting to bore me.
Agree. The part that makes them boring is that the 'how to' stuff is, basically, rubbish. There are other communities dedicated to in-the-moment productivity guides, by people who know far more about the subject. Albeit people who maybe don't two-box and are perhaps dedicating all their 'productivity' towards 'successful' but ultimately not very important goals.
More and more, LessWrong's posts are meta-rationality posts, about how to be rational, how to avoid akrasia, in general, without any specific application. This is probably the intended purpose of the site. But they're starting to bore me.
What drew me to LessWrong is that it's a place where I can put rationality into practice, discussing specific questions of philosophy, value, and possible futures, with the goal of finding a good path through the Singularity. Many of these topics have no other place where rational discussion of them is possible, online or off. Such applied topics have almost all moved to Discussion now, and may be declining in frequency.
This isn't entirely new. Applied discussions have always suffered bad karma on LW (statistically; please do not respond with anecdotal data). I thought this was because people downvote a post if they find anything in it that they disagree with. But perhaps a lot of people would rather talk about rationality than use it.
Does anyone else have this perception? Or am I just becoming a LW old geezer?
At the same time, LW is taking off in terms of meetups and number of posts. Is it finding its true self? Does the discussion of rationality techniques have a larger market than debates over Sleeping Beauty? (I'm even beginning to miss those!) Is the old concern with values, artificial intelligence, and the Singularity something for LW to grow out of?
(ADDED: Some rationality posts are good. I am also a lukeprog fan.)