Introduction
Less Wrong is explicitly intended to help people become more rational. Eliezer has posted that rationality means epistemic rationality (having & updating a correct model of the world) and instrumental rationality (the art of achieving your goals effectively). Both are fundamentally tied to the real world and our performance in it - they are about ability in practice, not theoretical knowledge (except inasmuch as that knowledge helps ability in practice). Unfortunately, I think Less Wrong is a failure at instilling abilities-in-practice, and is designed in a way that detracts from people's real-world performance.
It will take some time, and it may be unpleasant to hear, but I'm going to try to explain what LW is, why that's bad, and sketch what a tool to actually help people become more rational would look like.
(This post was motivated by Anna Salamon's Humans are not automatically strategic and the responses to it; more detailed background is in footnote [1].)
Update / Clarification in response to some comments: This post is based on the assumption that a) the creators of Less Wrong wish Less Wrong to result in people becoming better at achieving their goals (instrumental rationality, aka "efficient productivity"), and b) some (perhaps many) readers read it towards that goal. It is this that I think is self-deception. I do not dispute that LW can be used in a positive way (read during fun time instead of the NYT or funny pictures on Digg), or that it has positive effects (exposing people to important ideas they might not see elsewhere). I merely dispute that reading fun things on the internet can help people become more instrumentally rational. Additionally, I think instrumental rationality is really important and could be a huge benefit to people's lives (in fact, is by definition!), and so a community norm that "deliberate practice towards self-improvement" matters more than "reading entertaining ideas on the internet" would be of immense value to LW as a community - while greatly decreasing the importance of LW as a website.
Why Less Wrong is not an effective route to increasing rationality.
Definition:
Work: time spent acting in an instrumentally rational manner, i.e. forcing your attention towards the tasks you have consciously determined will be the most effective at achieving your consciously chosen goals, rather than allowing your mind to drift to what is shiny and fun.
By definition, Work is what (instrumental) rationalists wish to do more of. A corollary is that Work is also what is required in order to increase one's capacity to Work. This must be true by the definition of instrumental rationality - if it's the most efficient way to achieve one's goals, and if one's goal is to increase one's instrumental rationality, doing so is most efficiently done by being instrumentally rational about it. [2]
That was almost circular, so to add meat, you'll notice in the definition an embedded assumption that the "hard" part of Work is directing attention - forcing yourself to do what you know you ought to, instead of what is fun & easy (and, to a lesser degree, determining your goals and the most effective tasks to achieve them). This assumption may not hold true for everyone, but given the amount of discussion of akrasia on LW, the general drift of writing by smart people about productivity (Paul Graham: Addiction, Distraction; Merlin Mann: Time & Attention), and the common themes in the numerous productivity/self-help books I've read, I think it's fair to say that identifying the goals and tasks that matter, and getting yourself to do them, is what most humans fundamentally struggle with when it comes to instrumental rationality.
Figuring out goals is fairly personal, often subjective, and can be difficult. I definitely think the deep philosophical elements of Less Wrong and its contributions to epistemic rationality [3] are useful for this, but (like psychedelics) the benefit comes from small occasional doses of the good stuff. Goals should be re-examined regularly but infrequently (roughly yearly, and at major life forks). An annual retreat with a mix of close friends and distant-but-respected acquaintances (Burning Man, perhaps) will do the trick - reading a regularly updated blog is way overkill.
And figuring out tasks, once you turn your attention to it, is pretty easy. Once you have explicit goals, just consciously and continuously examining whether your actions have been effective at achieving those goals will get you way above the average smart human at correctly choosing the most effective tasks. The big deal here, for many (most?) of us, is the conscious direction of our attention.
What is the enemy of consciously directed attention? It is shiny distraction. And what is Less Wrong? It is a blog, a succession of short fun posts with comments, most likely read when people wish to distract or entertain themselves, and tuned for producing shiny ideas which successfully distract and entertain people. As Merlin Mann says: "Joining a Facebook group about creative productivity is like buying a chair about jogging". Well, reading a blog to overcome akrasia IS joining a Facebook group about creative productivity. It's the opposite of this classic piece of advice.
Now, I freely admit that this argument is relatively brief and minimally supported compared to what a really good, solid argument about exactly how to become more rational would be. This laziness is deliberate, conscious, and a direct expression of my beliefs about the problem with LW. I believe that most people, particularly smart ones, do way too much thinking & talking and way too little action (me included), because that is what's easy for them [4].
What I see as a better route is to gather those who will quickly agree, do things differently, (hopefully) win and (definitely) learn. Note that this general technique has a double advantage: the small group gets to enjoy immediate results, and when the time comes to change minds, they have the powerful evidence of their experience. It also reduces the problem that the stated goal of many participants ("get more rational") may not be their actual goal ("enjoy the company of rationalists in a way which is shiny fun, not Work"), since the call to action will tend to select for those who actually desire self-improvement. My hope is that this post and the description below of what actual personal growth looks like inspire one or more small groups to form.
Less Wrong: Negative Value, Positive Potential
Unfortunately, in this framework, Less Wrong is probably of negative value to those who really want to become more rational. I see it as a low-ROI activity whose shininess is tuned to attract the rationality community, and thus serves as the perfect distraction (rationality porn, rationality opium). Many (most?) participants are allowing LW to grab their attention because it is fun and easy, and thus simultaneously distracting themselves from Work (reducing their overall Work time) while convincing themselves that this distraction is helping them become more rational. This reduces the chance that they will consciously Work towards rationality, since they feel they are already working towards that goal with their LW reading time. (Footnote [4.5] added in response to comments.)
(Note that from this perspective, HP&TMoR is a positive - people know reading fanfic is entertainment, and being good enough entertainment to displace people's less educational alternative entertainments while teaching a little rationality increases the overall level of rationality. The key is that HP&TMoR is read in "fun time", while I believe most people see LW time as "work towards self-improvement" time. Ironic, but true for me and the friends I've polled, at least.)
That said, the property of shininess-to-rationalists has resulted in a large community of rationalists, which makes LW potentially a great resource for actual training of people's individual rationality. And while catalyzing Work is much harder than getting positive feedback, I do find it heart-warming and promising that I have consistently received positive feedback from the LW community when pointing out its errors. This is a community that wants to self-correct - which is unfortunately rare, and a necessary though not sufficient criterion for improvement.
This is taking too long to write [5], and we haven't even gotten to the constructive part, so I'm going to assume that if you are still with me you no longer need such detailed arguments, and I can go faster.
Some Observations On What Makes Something Useful For Self-Improvement
My version: Growth activities are Work, and hence feel like work, not fun - they involve overriding your instincts, not following them. Any growth you can get from following your instincts, you have probably had already. And consciously directing your attention is not something that can be trained by being distracted (willpower is a muscle; you exercise it by using it). Finding the best tasks to achieve your goals is not practiced by doing whatever tasks come to mind. And so forth. You may experience flow states once your attention is focused where it should be, but unless you have the incredible and rare fortune to have what is shiny match up with what is useful, the act of starting and maintaining focus, and improving your ability to do so, will be hard work.
The academic version: The literature on skill development ("acquisition of expertise") says that it involves "deliberate practice". The same is very likely true of acquiring expertise in rationality. The 6 tenets of deliberate practice are that it:
- Is not inherently enjoyable.
- Is not play or paid practice.
- Is relevant to the skill being developed.
- Is not simply watching the skill being performed.
- Requires effort and attention from the learner.
- Often involves activities selected by a coach or teacher to facilitate learning.
One must stretch quite a bit to fit these to "reading Less Wrong" - it's just too shiny and fun to be useful. One can (and must) enjoy the results of practice, but if the practice itself doesn't take effort, you are going to plateau fast. (I want to be clear, BTW, that I am not making a Puritan fallacy of equating effort and reward [6]). Meditation is a great example of an instrumental rationality practice: it is a boring, difficult isolation exercise for directing and noticing the direction of one's attention. It is Work.
What Would A Real Rationality Practice Look Like?
Eliezer has used the phrase "rationality dojo", which I think has many correct implications:
- It is a group of people who gather in person to train specific skills.
- While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
- Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure). As a result, it is driven by the needs of the learners.
- You have to sweat, but the result is you get stronger.
- You improve by learning from those better than you, competing with those at your level, and teaching those below you.
- It is run by a professional, or at least someone getting paid for their hobby. The practitioners receive personal benefit from their practice, in particular from the value added by the coach - enough to pay for talented coaches.
In general, a real rationality practice should feel a lot more like going to the gym, and a lot less like hanging out with friends at a bar.
To explain the ones that I worry will be non-obvious:
1) I don't know why an in-person group is important, but it seems to be - all the people who have replied to me so far saying they get useful rationality practice out of the LW community said the growth came through attending local meetups (example). We can easily invent some evolutionary psychology story for this, but it doesn't matter why; at this point it's enough to just know.
6) There are people who can do high-quality productive work in their spare time, but in my experience they are very rare. It is very pleasant to think that "amateurs can change the world", because then we can fantasize about ourselves doing it in our spare time, and it even happens occasionally, which feeds that fantasy, but I don't find it very credible. I know we are really smart, and there are memes in our community that rationalists are way better than everyone else at everything, but frankly I find the idea that people writing blog posts in their spare time will create a better system than trained professionals for improving one's ability to achieve one's goals to be ludicrous. I know some personal growth professionals, and they are smart too, and they have had years of practice and study to develop practical experience. Talk is cheap, as is time spent reading blogs: if people actually value becoming more rational, they will pay for it, and if there are good teachers, they will be worth paying. Money is a unit of learning [7].
There are some other important aspects which such a practice would have that LW does not:
- The accumulation of knowledge. Blogs are inherently rewarding: people read what is recent, so you get quick feedback on posts and comments. However, they are inherently ephemeral for the same reason - people read what is recent, and posts are never substantially revised. The voting system helps a little, but it can't come close to fixing the underlying structure. To be efficient, much less work should go into ephemeral posts, and much more into accumulating and revising a large, detailed, nuanced body of knowledge (this is exactly the sort of "work, not fun" activity that you can get by paying someone, but are unlikely to get when contributors are volunteers). In theory, this could happen on the Wiki, but in practice I have rarely seen wikis succeed at this (with the obvious exception of Le Wik).
- It would involve more literature review and pointers to existing work. The obvious highest-ROI way to start working on improving instrumental rationality is to research and summarize the best existing work on self-improvement in the directions that LW values, not to reinvent the wheel. Yes, over time LW should produce original work and perhaps eventually the best such work, but the existing work is not so bad that it should just be ignored. Far from it! In reference to (1), perhaps this should be done by creating a database of reviews and ratings by LWers of the top-rated self-improvement books, eventually with ratings taking into account the variety of skills people are seeking and the ways in which they optimally learn (a rough sketch of what such a database might record follows this list).
- It would be practical - most units of information (posts, pages, whatever) would be about exercises or ideas that one could immediately apply in one's own life. It would look less like most LW posts (abstract, theoretical, focused on chains of logic), and more like Structured Procrastination, the Pmarca Guide To Personal Productivity, books like Eat That Frog!, Getting Things Done, and Switch [8]. Most discussion would be about topics like those in Anna's post - how to act effectively, what things people have tried, what worked, what didn't, and why. More learning through empiricism, less through logic and analysis.
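To illustrate the review-database idea mentioned above, here is a minimal sketch of what such a database might record, written in Python purely for concreteness. The field names and structure are hypothetical assumptions of mine, not a spec anyone has agreed on, but they capture the core idea: ratings tied to the specific skills a reader was seeking and to how they learn best.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical data model for a community database of self-improvement book
# reviews. All names and fields here are illustrative assumptions, not a spec.

@dataclass
class Review:
    reviewer: str          # LW username
    rating: int            # overall usefulness, e.g. 1-10
    skills: List[str]      # skills the book helped with, e.g. ["attention", "planning"]
    learning_style: str    # how the reviewer learns best, e.g. "exercises" or "theory"
    notes: str = ""        # what was actually tried, and what worked

@dataclass
class Book:
    title: str
    author: str
    reviews: List[Review] = field(default_factory=list)

    def average_rating(self, skill: Optional[str] = None) -> float:
        """Mean rating, optionally restricted to reviews that mention a given skill."""
        relevant = [r for r in self.reviews if skill is None or skill in r.skills]
        return sum(r.rating for r in relevant) / len(relevant) if relevant else 0.0
```

Even a flat structure like this would let a reader ask "what is the best-rated book for people who learn by doing and want to improve their focus?" rather than relying on whatever post happens to be on the front page.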
In forming such a practice, we could learn from other communities which have developed a new body of knowledge about a set of skills and disseminated it with rapid scaling within the last 15 years. Two I know about and have tangentially participated in are:
- PUA (how to pick up women). In fact, a social skills community based on PUA was suggested on LW a few days ago (glad to see that others are interested in practice and not just talk!).
- CrossFit (synthesis of the best techniques for time-efficient broad-applicability fitness)
Note that both involve most of my suggested features (PUA has some "reading not doing" issues, but it's far ahead of LW in having an explicit cultural value to the contrary - for example, almost every workshop features time spent "in the field"). One feature of PUA in particular I'd like to point out is the concept of the "PUA lair" - a group of people living together with the explicit intention of increasing their PUA skills. As the lair link says: "It is highly touted that the most proficient and fastest way to improve your skills is to hang out with others who are ahead of you, and those whose goals for improvement mirror your own." [9]
Conclusion
If LW is to accomplish its goal of increasing participants' instrumental rationality, it must dramatically change form. One of the biggest elements of instrumental rationality, perhaps the biggest, is the ability to direct one's attention, and a rationality blog makes people worse at this by distracting their attention in a way that is accepted by their community and that they will feel is useful. From The War Of Art [10]:
Often couples or close friends, even entire families, will enter into tacit compacts whereby each individual pledges (unconsciously) to remain mired in the same slough in which she and all her cronies have become so comfortable. The highest treason a crab can commit is to make a leap for the rim of the bucket.
To aid growth at rationality, Less Wrong would have to become a skill practice community, more like martial arts, PUA, and physical fitness, with an explicit focus on helping people grow in their ability to set and achieve goals, combining local chapters with global coordination, infrastructure, and knowledge accumulation. Most discussion should be among people working on a specific skill at a similar level, about what is or isn't working for them as they attempt to progress, rather than about obscure theories of the inner workings of the human mind.
Such a practice and community would look very different, but I believe it would have a far better chance to actually make people more rational [11]. There would be danger of cultism and the religious fervor/"one true way" attitude that self-help movements sometimes have (Landmark), and I wonder if it's a profound distaste for anything remotely smelling of cult that has led Eliezer & SIAI away from this path. But the opposite of cult is not growth; it is continuing to be an opiate for rationalists, a pleasant way of making the time pass that feels like work towards growth and thus feeds people's desire for guiltless distraction.
For there to be growth, we must do Work, people must get paid, we must gather in person, focus on action not words, put forth great effort over time to increase our capacity, use peak experiences to knock people loose from ingrained patterns, and copy these and much more from the skill practice communities of the world. Developed by non-rationalists, sure, but the ones that last are the ones that work [12] - let's learn from their embedded knowledge.
Addendum
That was 5 hours of my semi-Work time, so I really hope it wasn't wasted, and that some of you not only listen but take action. I don't have much free time for new projects, but if people want to start a local rationality dojo in Mountain View/Sunnyvale, I'm in. And there is already talk, among some reviewers of this draft, of putting together an introductory workshop. Time will tell - and the next step is up to you.
Footnotes
[1] Anna Salamon posted Humans are not automatically strategic, a reply to the very practical A "Failure to Evaluate Return-on-Time" Fallacy. Anna's post laid out a nice rough map of what an instrumentally rational process for goal achievement would look like (consciously choosing goals, metrics, researching solutions, experimenting with implementing them, balancing exploration & exploitation - the basic recipe for success at anything), said she was keen to train this, and asked:
So, to second Lionhearted's questions: does this analysis seem right? Have some of you trained yourselves to be substantially more strategic, or goal-achieving, than you started out? How did you do it? Do you agree with (a)-(h) above? Do you have some good heuristics to add? Do you have some good ideas for how to train yourself in such heuristics?
After reading the comments, I made a comment which began:
I'm disappointed at how few of these comments, particularly the highly-voted ones, are about proposed solutions, or at least proposed areas for research. My general concern about the LW community is that it seems much more interested in the fun of debating and analyzing biases, rather than the boring repetitive trial-and-error of correcting them.
Anna's post was upvoted into the top 10 all-time on LW within a couple of days, and my comment quickly became the top comment on the post by a large margin, so both her agenda and my concern seem to be widely shared. While I rarely take the time to write LW posts (as you would expect from someone who believes LW is not very useful), this feedback gave me hope that there might be enough untapped desire for something more effective that a post might help catalyze enough change to be worthwhile.
[2] There are many other arguments as to why improving one's ability to do Work is unlikely to be fun and easy, of course. With a large space of possible activities, and only a loose connection between "fun" and "helps you grow" (via evolutionary biology), it seems a priori unlikely that fun activities will overlap with growthful ones. And we know that a general recipe for getting better at X is to do X, so if one wants to get better at directing one's attention to the most important tasks and goals, it seems very likely that one must practice directing one's attention. Furthermore, there is evidence that, specifically, willpower is a muscle. So the case for growing one's instrumental rationality through being distracted by an entertaining rationality blog is... awfully weak.
[3] What are the most important problems in the world? Who is working most effectively to fix them, and how can you help? Understanding existential risks is certainly not easy, and it is important to setting that portion of your goals that has to do with helping the world - though that is a minor part of most people's goals, which are mostly about their own lives and self-interest.
[4] I also believe the least effective form of debate is trying to get people to change their minds. Therefore, an extensive study and documentation to create a really good, solid argument trying to change the minds of LWers who don't quickly agree with my argument sketch would be a very low-return activity compared to getting together those who already agree and doing an experiment. And instrumental rationality is about maximizing the return on your activities, given your goals, so I try to avoid low-return activities.
[4.5] A number of commenters state that they consciously read LW during fun time, or read it to learn about biases and existential risk, not to become more rational, in which case it is likely of positive value. If you have successfully walled off your work from shiny distractions, then you are advanced in the ways of attention and may be able to use this particular drug without negative effects, and I congratulate you. If you are reading it to learn about topics of interest to rationalists and believe that you will stop there and not let it affect your productivity, just be warned that many an opiate addiction has begun with a legitimate use of painkillers.
Or to go back to Merlin's metaphor: If you buy a couch to sit on and watch TV, there's nothing wrong with that. You might even see a sports program on TV that motivates you to go jogging. Just don't buy the couch in order to further your goal of physical fitness. Or claim that couch-buyers are a community of people committed to becoming more fit, because they sometimes watch sports shows and sometimes get outside. Couch-buyers are a community of people who sit around - even if they watch sports programs. Real runners buy jogging shoes, sweat headbands, GPS route trackers, pedometers, stopwatches...
[5] 1.5 hrs so far. Time tracking is an important part of attention management - if you don't know how your time is spent, it's probably being spent badly.
[6] Specifically, I am not saying that growth is never fun, or that growth is proportional to effort, only that there are a very limited number of fun ways to grow (taking psychedelics at Burning Man with people you like and respect) and you've probably done them all already. If you haven't, sure, of course you should do them, and yes, of course discovering & cataloging such things is useful, but there really aren't very many, so if you want to continue to grow you need to stop fooling yourself that reading a blog will do it, and get ready to make some effort.
[7] Referencing Eliezer's great Money: The Unit of Caring, of course. I find it ironic that he understands basic economics intellectually so well as to make one of the most eloquent arguments for donating money instead of time that I've ever seen, yet seems to be trying to create a rationality improvement movement without, as far as I can tell, involving any specialists in the art of human change or growth. That is, without using the method that grownups use - what you do when you want something to actually get done: use money to employ full-time specialists.
[8] I haven't actually read this one yet, but their other book, Made To Stick, was an outstanding study of memetic engineering, so I think it very likely that their book on habit formation is good too.
[9] Indeed. I happen to have a background of living in and founding intentional communities (Tortuga!), and in fact currently rent rooms to LWers Divia and Nick Tarleton, so I can attest to the value of having one's social environment and personal growth goals synchronized. Benton House is likely an example as well. Groups of rationalists living together will automatically practice, and have that practice reinforced by their primate desire for status within the group; this is almost surely the fastest way to progress, although it is neither required nor suited to everyone.
[10] The next paragraph explains why I do my best not to spend much time here:
The awakening artist must be ruthless, not only with herself but with others. Once you make your break, you can’t turn around for your buddy who catches his trouser leg on the barbed wire. The best thing you can do for that friend (and he’d tell you this himself, if he really is your friend) is to get over the wall and keep motating.
Although I suppose I am violating the advice by turning around and giving a long speech about why everyone else should make a break too :). My theory is that by saying it right once, I can refrain from wasting any more time saying it again in the future, should this attempt not work. But that may just be rationalizing. On the other hand, doing things "well or not at all" is rational in situations where the return curve is steep. Given my low evaluation of LW's usefulness, I obviously think the early part of the return curve is basically flat zero. We will see if it is hubris to think the right post can really make a difference, and that I can make that post. Certainly plenty of opportunity for bias in both those statements.
[11] Note that helping people become personally more effective is a much easier meme to spread than helping people better understand how to contribute to public goods (i.e. efficient charity and existential risk). People have every incentive to do the former and little incentive to do the latter. So training people in general goal achievement (instrumental rationality) is likely to have far broader appeal and reach far more people than training them in the aspects of epistemic rationality that SIAI is most interested in. This large community, which has grown through the individually beneficial part of the philosophy, is then a great target market for the societally beneficial part of the philosophy. (A classic one-two punch used by spiritual groups, of course: provide value, then teach values. It works. If rationalists do what works...) I've been meaning to write a post on the importance of personal benefit to spreading memes for a while; this paragraph will have to do for now...
[12] And the ones with good memetic engineering, including use of the Dark Arts. Many difficult decisions will need to be made about what techniques are and aren't Dark Arts and which are worth using anyway. The fact remains that just like a sports MVP is almost certainly both more skilled and more lucky than his peers, a successful self-help movement is almost certainly both more effective at helping people and better memetically engineered than its peers. So copy - but filter.
Interesting - I don't know if it would work, but I'd like to hear about somebody who tried it.
Maybe a less invasive one would be software that just shows a description of the current program the user is running - if he's on the web, the top-level domain, and if not, the name of the document he's reading or working on (or, more likely, the name of the application and the contents of its title bar; anything deeper than that probably needs a lot more special coding). A rough sketch of such a logger appears below.
This exists. I tried it for a while, but my life doesn't revolve around computer use enough for it to be especially interesting for me. For someone who spends most of their productive time at a computer, it might well help.
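For the curious, here is a minimal sketch of the kind of activity logger described two comments up. It assumes a Linux/X11 desktop with the xdotool utility installed (an assumption on my part - any window-title-query tool would do), and simply records the title of the focused window whenever it changes. It's an illustration, not an endorsement of any particular tool.

```python
#!/usr/bin/env python3
"""Minimal active-window logger - a sketch, assuming Linux/X11 with xdotool installed."""
import subprocess
import time
from datetime import datetime

LOG_FILE = "attention_log.txt"   # hypothetical output file
POLL_SECONDS = 10                # how often to sample the active window

def active_window_title() -> str:
    """Return the title of the currently focused window via xdotool."""
    try:
        return subprocess.check_output(
            ["xdotool", "getactivewindow", "getwindowname"],
            text=True,
        ).strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        return "<unknown>"

def main() -> None:
    last_title = None
    while True:
        title = active_window_title()
        # Only log when focus changes, so the log stays readable.
        if title != last_title:
            stamp = datetime.now().isoformat(timespec="seconds")
            with open(LOG_FILE, "a") as f:
                f.write(f"{stamp}\t{title}\n")
            last_title = title
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()
```

Reviewing such a log at the end of the day is a crude but honest form of the time tracking mentioned in footnote [5].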