To teach people about a topic you've labeled "rationality", it helps for them to be interested in "rationality".  (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)

    And when people explain why they're not interested in rationality, one of the most commonly proffered reasons tends to be something like:  "Oh, I've known a couple of rational people and they didn't seem any happier."

    Who are they thinking of?  Probably an Objectivist or some such.  Maybe someone they know who's an ordinary scientist.  Or an ordinary atheist.

    That's really not a whole lot of rationality, as I have previously said.

    Even if you limit yourself to people who can derive Bayes's Theorem—which is going to eliminate, what, 98% of the above personnel?—that's still not a whole lot of rationality.  I mean, it's a pretty basic theorem.

    Since the beginning I've had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.

    But when I look around me in the real world, I don't see that.  Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, Tooby & Cosmides.  A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it's not cool to care that much.  I can see that they've found a rhythm, a unity that begins to pervade their arguments—

    Yet even that... isn't really a whole lot of rationality either.

    Even among those few who impress me with a hint of dawning formidability—I don't think that their mastery of rationality could compare to, say, John Conway's mastery of math.  The base knowledge that we drew upon to build our understanding—if you extracted only the parts we used, and not everything we had to study to find it—it's probably not comparable to what a professional nuclear engineer knows about nuclear engineering.  It may not even be comparable to what a construction engineer knows about bridges.  We practice our skills, we do, in the ad-hoc ways we taught ourselves; but that practice probably doesn't compare to the training regimen an Olympic runner goes through, or maybe even an ordinary professional tennis player.

    And the root of this problem, I do suspect, is that we haven't really gotten together and systematized our skills.  We've had to create all of this for ourselves, ad-hoc, and there's a limit to how much one mind can do, even if it can manage to draw upon work done in outside fields.

    The chief obstacle to doing this the way it really should be done is the difficulty of testing the results of rationality training programs, so that you can have evidence-based training methods.  I will write more about this, because I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.

    There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, "Make the students practice this for an hour, then test them two weeks later."  Not, "Run half the signups through version A of the three-month summer training program, and half through version B, and survey them five years later."  You can see, here, the implied amount of effort that I think would go into a training program for people who were Really Serious about rationality, as opposed to the attitude of taking Casual Potshots That Require Like An Hour Of Effort Or Something.

    Daniel Burfoot brilliantly suggests that this is why intelligence seems to be such a big factor in rationality—that when you're improvising everything ad-hoc with very little training or systematic practice, intelligence ends up being the most important factor in what's left.

    Why aren't "rationalists" surrounded by a visible aura of formidability?  Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought?  Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

    Of this there are several answers; but one of them, surely, is that they have received less systematic training of rationality, in a less systematic context, than a first-dan black belt gets in hitting people.

    I do not except myself from this criticism.  I am no beisutsukai, because there are limits to how much Art you can create on your own, and how well you can guess without evidence-based statistics on the results.  I know about a single use of rationality, which might be termed "reduction of confusing cognitions".  This I asked of my brain, this it has given me.  There are other arts, I think, that a mature rationality training program would not neglect to teach, which would make me stronger and happier and more effective—if I could just go through a standardized training program using the cream of teaching methods experimentally demonstrated to be effective.  But the kind of tremendous, focused effort that I put into creating my single sub-art of rationality from scratch—my life doesn't have room for more than one of those.

    I consider myself something more than a first-dan black belt, and less.  I can punch through brick and I'm working on steel along my way to adamantine, but I have a mere casual street-fighter's grasp of how to kick or throw or block.

    Why are there schools of martial arts, but not rationality dojos?  (This was the first question I asked in my first blog post.)  Is it more important to hit people than to think?

    No, but it's easier to verify when you have hit someone.  That's part of it, a highly central part.

    But maybe even more importantly—there are people out there who want to hit, and who have the idea that there ought to be a systematic art of hitting that makes you into a visibly more formidable fighter, with a speed and grace and strength beyond the struggles of the unpracticed.  So they go to a school that promises to teach that.  And that school exists because, long ago, some people had the sense that more was possible.  And they got together and shared their techniques and practiced and formalized and practiced and developed the Systematic Art of Hitting.  They pushed themselves that far because they thought they should be awesome and they were willing to put some back into it.

    Now—they got somewhere with that aspiration, unlike a thousand other aspirations of awesomeness that failed, because they could tell when they had hit someone; and the schools competed against each other regularly in realistic contests with clearly-defined winners.

    But before even that—there was first the aspiration, the wish to become stronger, a sense that more was possible.  A vision of a speed and grace and strength that they did not already possess, but could possess, if they were willing to put in a lot of work, that drove them to systematize and train and test.

    Why don't we have an Art of Rationality?

    Third, because current "rationalists" have trouble working in groups: of this I shall speak more.

    Second, because it is hard to verify success in training, or which of two schools is the stronger.

    But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.

    And conversely they don't look at the lack of visibly greater formidability, and say, "We must be doing something wrong."

    "Rationality" just seems like one more hobby or hobbyhorse, that people talk about at parties; an adopted mode of conversational attire with few or no real consequences; and it doesn't seem like there's anything wrong about that, either.


    Eliezer, I have recommended to you before that you read The Darkness That Comes Before and the associated trilogy. I repeat that recommendation now. The monastery of Ishual is your rationalist dojo, and Anasurimbor Kellhus is your beisutsukai surrounded by a visible aura of formidability. The book might even give you an idea or two.

    My only worry with the idea of these dojos is that I doubt the difference between us and Anasurimbor Kellhus is primarily a difference in rationality levels. I think it is more likely to be akrasia. Even an irrational, downright stupid person can probably think of fifty ways to improve his life, most of which will work very well if he only does them (quit smoking, quit drinking, study harder in school, go on a diet). And a lot of people with pretty well developed senses of rationality whom I know, don't use them for anything more interesting than winning debates about abortion or something. Maybe the reason rationalists rarely do that much better than anyone else is that they're not actually using all that extra brainpower they develop. The solution to that isn't more brainpower.

    Kellhus was able to sit down, enter the probability trance, decide on the be...

    I think the akrasia you describe and methods of combating it would come under the heading of "kicking", as opposed to the "punching" I've been talking about. It's an art I haven't created or learned, but it's an art that should exist.

    This "art of kicking" is what pjeby has been working toward, AFAICT. I haven't read much of his writing, though. But an "art of kicking" would be a great thing to mix in with the OB/LW corpus, if pjeby has something that works, which I think he has at least some of -- and if we and he can figure out how to hybridize kicking research and training with punching research and training.

    I'd also love to bring in more people from the entrepreneurship/sales/marketing communities. I've been looking at some of their better literature, and it has rationality techniques (techniques for not shooting yourself in the foot by wishful thinking, overconfidence, etc.) and get-things-done techniques mixed together. I love the sit-and-think math nerd types too, and we need sitting and thinking; the world is full of people taking action toward the wrong goals. But I'd expect better results from our rationalist community if we mixed in more people whose natural impulses were toward active experiments and short-term visible results.

    Pjeby's working on akrasia? I'll have to check out his site.

    That brings up a related question that I think Eliezer hinted at: what pre-existing bodies of knowledge can we search through for powerful techniques so that we don't have to re-invent the wheel? Entrepreneurship stuff is one. Lots of people have brought up pick-up artists and poker, so those might be others.

    I nominate a fourth that may be controversial: mysticism. Not the "summon demons" style of mysticism, but yoga and Zen and related practices. These people have been learning how to examine/quiet/rearrange their minds and sort out the useful processes from the useless processes for the past three thousand years. Even if they've been working off crazy metaphysics, it'd be surprising if they didn't come up with something. Eliezer talks in mystical language sometimes, but I don't know whether that's because he's studied and approves of mysticism or just likes the feel of it.

    What all of these things need is a testing process combined with people who are already high-level enough that they can sort through all the dross and determine which techniques are useful without going native or opening themselves up to the accusation that they're doing so; ie people who can sort through the mystical/pick-up artist/whatever literature and separate out the things that are useful to rationalists from the things specific to a certain worldview hostile to our own. I've seen a few good people try this, but it's a mental minefield and they tend to end up "going native".

    In the case of pickup literature, there is a lot to attract rationalists, but also a lot to inspire their ire.

    The first thing rationalists should notice about pickup is that it wins. There are no other resources in mainstream culture or psychology that are anywhere near as effective. Yet even after witnessing the striking ability of pickup theories to win, I am hesitant to say that they are actually true. For example, I acknowledge the fantastic success of notions like "women are attracted to Alpha Males," even though I don't believe that they are literally true, and I know that they are oversimplifications of evolutionary psychology. Consequently, I am an instrumentalist, not a realist, about pickup theories.

    If we started a project from scratch where we applied rationality to the domain of sex and relationships, and developed heuristics to improve ourselves in those areas, this project would have a considerable overlap with the teachings of the seduction community. At its best, pickup is "applied evolutionary psychology." Many of the common criticisms of pickup demonstrate an anger against the use of rationality and scientific thinking in the supposedly sacred...

    Also, since this particular community leans altruistic, I'd hope that such a project would emphasize the future happiness of potential partners more than does (correct me if I'm wrong) the current pickup community.

    For example, I acknowledge the fantastic success of notions like "women are attracted to Alpha Males," even though I don't believe that they are literally true, and I know that they are oversimplifications of evolutionary psychology.

    I tune out wherever I hear the term 'alpha male' in that sort of context. The original scientific concept has been butchered and abused beyond all recognition. Even more so the 'beta' concept. Beta males are the ones standing right behind the alpha ready to overthrow him and take control themselves. 'Omega' should be the synonym for 'pussy'.

    But I must admit the theory is at least vaguely in the right direction and works. Reasonably good as popular science for the general public. Better than what people believe about diet, showering, and dental hygiene.

    Many of the common criticisms of pickup demonstrate an anger against the use of rationality and scientific thinking in the supposedly sacred and mystical area of sex and romance.

    Actually, the best (and most common) criticisms I see are more due to the use of lies and manipulation in the area of sex and romance.

    The evo-psych stuff (and thereby any science and rationality) is perfectly fine by me.


    Yvain:

    You've hit on something that I have long felt should be more directly addressed here/at OB. Full disclosure is that I have already written a lot about this myself and am cleaning up some "posts" and chipping away here to get the karma to post them.

    It's tough to talk about meditation-based rationality because (a) the long history of truly disciplined mental practice comes out of a religious context that is, as you note, comically bogged down in superstitious metaphysics, (b) it is a more-or-less strictly internal process that is very hard to articulate, and (c) it has become a kind of catch-all category for sloppy new-age thinking about a great number of things (wrongheaded pop quantum theory, anyone?)

    Nevertheless, as Yvain notes, there is indeed a HUGE body of practice and tried-and-true advice, complete with levels of mastery and, if you have been lucky enough to know some of the masters, that palpable awesomeness Eliezer speaks of. I'm sure all of this sounds pretty slippery and poppish, but it doesn't have to be. One thing I would like to help get going here is a rigorous discussion, for my benefit and everyone's, about how we can apply the science of cognition to the practice of meditation and vice versa.

    Eliezer Yudkowsky: Think you've got enough karma to post already.
    anonym: There has been quite a bit of research in recent years on meditation, and the pace seems to be picking up. For a high-level survey of recent research on the two primary forms of Buddhist meditation, I'd recommend the following article: "Attention regulation and monitoring in meditation."
    olimay: Yvain, do check out pjeby's work. I have to admit at some points I found myself reading OB as a self-help attempt. I'm glad I kept up, but dirtsimple.org was the blog I was actually looking for. Your point about mysticism is interesting, because I find pjeby's perspective on personal action and motivation has a strange isomorphism to Zen thought, even though that doesn't seem to be the main intention. In fact, his emphasis seems to be de-mystifying. One of his main criticisms of existing psychological/self-help literature is that the relatively good stuff is incomprehensible to the people who need it most, because they'd need to already be in a successful, rational action mindset in order to implement what's being said. Anyway, I hope pjeby chimes in so he can offer something better than my incomplete summary...

    It doesn't take a formal probability trance to chart a path through everyday life - it was in following the results

    Couldn't agree more. Execution is crucial.

    I can come out of a probability trance with a perfect plan, an ideal path of least resistance through the space of possible worlds, but now I have to trick, bribe or force my messy, kludgy, evolved brain into actually executing the plan.

    A recent story from my experience. I had (and still have) a plan involving a relatively large chunk of work, around a month full-time. Nothing challenging, just a 'sit down and do it' sort of thing. But for some reason my brain is unable to see how this chunk of work will benefit my genes, so it just switches into procrastination mode when exposed to this work. I tried to force myself to do it, but now I get an absolutely real feeling of 'mental nausea' every time I approach this task – yes, I literally want to hurl when I think about it.

    For a non-evolved being, say an intelligently-designed robot, the execution part would be a non-issue – it gets a plan, it executes it as perfectly as it can, give or take some engineering inefficiencies. But for an evolved being trying to be rational, it's an entirely different story.

    If one had public metrics of success at rationality, the usual status seeking and embarrassment avoidance could encourage people to actually apply their skills.

    Vladimir_Golovin: Shouldn't a common-sense 'success at life' (money, status, free time, whatever) be the real metric of success at rationality? Shouldn't a rationalist, as a General Intelligence, succeed over a non-rationalist in any chosen orderly environment, according to any chosen metric of success -- including common metrics of that environment?

    No.

    • If "general intelligence" is a binary classification, almost everyone is one. If it's continuous, rationalist and non-rationalist humans are indistinguishable next to AIXI.
    • You don't know what the rationalist is optimizing for. Rationalists may even be less likely to value common-sense success metrics.
    • Even if those are someone's goals, growth in rationality involves tradeoffs - investment of time, if nothing else - in the short term, but that may still be a long time.
    • Heck, if "rationality" is defined as anything other than "winning", it might just not win for common-sense goals in some realistic environments.
    • People with the disposition to become rationalists may tend to also not be as naturally good at some things, like gaining status.
    Vladimir_Golovin: Point-by-point:

    1. Agreed. Let's throw away the phrase about General Intelligence -- it's not needed there.
    2. Obviously, if we're measuring one's reality-steering performance we must know the target region (and perhaps some other parameters like planned time expenditure etc.) in advance.
    3. The measurement should measure the performance of a rationalist at his/her current level, not taking into account time and resources he/she spent to level up. Measuring 'the speed or efficiency of leveling-up in rationality' is a different measurement.
    4. The definitions at the beginning of the original post will do.
    5. On one hand, the reality-mapping and reality-steering abilities should work for any activity, no matter whether the performer is hardware-accelerated for that activity or not. On the other hand, we should somehow take this into account -- after all, excelling at things one is not hardware-accelerated for is a good indicator. (If only we could reliably determine who is hardware-accelerated for what.)

    (Edit: cool, it does numeric lists automatically!)
    Annoyance: Public metrics aren't enough - society must also care about them. Without that, there's no status attached and no embarrassment risked. To get this going, you'd also need a way to keep society's standards on-track, or even a small amount of noise would lead to a positive feedback loop disrupting its conception of rationality.

    Everyone has at least a little bit of rationality. Why not simply apply yourself to increasing it, and finding ways to make yourself implement its conclusions? Just sit under the bodhi tree and decide not to move away until you're better at implementing.

    An idea on how to make the execution part trivial – a rational planner should treat his own execution module as a part of the external environment, not as a part of 'himself'. This approach will produce plans that take into account the inefficiencies of one's execution module and plan around them.

    thomblake: I hope you realize this is potentially recursive, if this 'execution module' happens to be instrumental to rationality. Not that that's necessarily a bad thing.
    Vladimir_Golovin: No, I don't (yet) -- could you please elaborate on this?
    Luke_A_Somers: Funny how this got rerun on the same day as EY posted about progress on Löb's problem.
    Yoav Ravid: What if you first calculate the most beneficial actions you can take (like Scott did), and then assess each of them using something like Piers Steel's procrastination equation? Then you know which ones you're most likely to achieve, and can choose more wisely. Also, doing the easiest first can sometimes be a good strategy for achieving all of them; Steel calls it a success spiral, where you succeed time after time and it increases your motivation.
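    Steel's procrastination equation is usually stated as Motivation = (Expectancy × Value) / (Impulsiveness × Delay). A minimal sketch of the ranking idea in the comment above -- the task names and scores are made-up illustrations, not anything from this thread:

```python
# Hedged sketch: ranking candidate actions with Piers Steel's
# procrastination equation.  All task names and numbers are invented
# purely for illustration.

def motivation(expectancy, value, impulsiveness, delay):
    """Temporal Motivation Theory score for a single task.

    expectancy:    perceived odds of success (0..1)
    value:         how rewarding the outcome feels
    impulsiveness: the person's sensitivity to delay
    delay:         time until the payoff (same units for every task)
    """
    return (expectancy * value) / (impulsiveness * delay)

tasks = {
    "write thesis chapter": motivation(0.6, 9.0, 1.0, 30.0),
    "answer easy emails":   motivation(0.95, 2.0, 1.0, 1.0),
}

# Rank candidate actions by how likely you are to actually start them;
# on these numbers the low-value but immediate task comes out on top.
for name, score in sorted(tasks.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```

    With these illustrative inputs the easy, immediate task scores far higher than the distant high-value one, which is the comment's point: knowing where motivation is actually available lets you choose (or sequence) goals more wisely.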
    Psy-Kosh: Well, ideally one considers the whole of oneself when doing the calculations, but that does make the calculations tricky. And it still doesn't answer exactly how to take it into account. I.e., "okay, I need to take into account the properties of my execution module, and find ways to actually get it to do stuff. How?"
    Nick_Tarleton: However, treating the execution module as external and fixed may demotivate attempts to improve it. (Related: Chaotic Inversion)
    roland: Yvain, you make a great point here. AFAIK it is common knowledge that a lot of great intellectuals were great procrastinators. Overcoming one's bad habits is key. But I wonder what can be done in that regard, since so much is defined by genetics.

    Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

    Because they don't win? Because they don't reliably steer reality into narrow regions other people consider desirable?

    I've met and worked with several irrationalists whose models of reality were, to put it mildly, not correlated to said reality, including one explicit, outspoken anti-rationalist with a totally weird, alien epistemology. All these people had a couple of interesting things in common.

    On one hand, they were often dismal at planning – they were unable to see obvious things, and they couldn't be convinced otherwise by any arguments appealing to 'facts' and 'reality' (they universally hated these words).

    On the other hand, they were surprisingly good at execution. All of them were very energetic people who didn't fear any work or situation at all, and I almost never saw any of them procrastinating. Could this be because...

    Rings_of_Saturn: Vladimir: It seems you are being respectful of the anonymity of these people, and very well, that. But you pique my curiosity... who were these people? What kind of group was it, and what was their explicit irrationality all about? I can think of a few groups that might fit this mold, but the peculiar way you describe them makes me think you have something very specific and odd in mind. Children of the Almighty Cthulhu?

    I'll describe the three most interesting cases.

    Number One is a Russian guy, now in his late 40s, with a spectacular youth. Among his trades were smuggling (during the Soviet era he smuggled brandy from Kazakhstan to Russia in the water system of a railway car), teaching in a ghetto college (where he inadvertently tamed a class of delinquents by hurling a wrench at their leader), leading a programming lab in an industrial institute, starting the first 3D visualization company in our city, reselling TV advertising time at a great margin (which he obtained by undercover deals involving key TV people and some outright gangsters), and saving the world by trying to find venture funding for a savant inventor who supposedly had a technology enabling geothermal energy extraction (I also worked together with them on this project). He was capable of totally crazy things, such as harpooning a wall portrait of a notorious Caucasus clanlord in a room full of his followers. He had lots of money during his successful periods, but was unable to convert this into a longer-term success.

    Number Two is a deaf-mute woman, now in her 40s, who owns and runs a web development company. Her speech is distorted, ...

    Thanks, Vladimir. You have interesting friends!

    Vladimir_Nesov: How do you translate that into a question of the definition of truth? The third guy is sufficiently rational to be successful; I guess he's got excellent native intelligence allowing him to correctly judge people or influence their decisions, and his verbal descriptions of his beliefs are mostly rationalization, not hurting his performance too much. If he were a rationalist, he'd probably be even more successful (or he'd find a different occupation).

    Yes, the guy is smart, swift-thinking and quick to act when it comes to getting projects up from the ground, connecting the right people and getting funding from nowhere (much less so when it comes to technical details and fine-grained planning). His actual decisions are effective, regardless of the stuff he has in the conscious part of his head.

    (Actually quite a lot of people whose 'spoken' belief systems are suboptimal or plain weird are perfectly able to drive cars, run companies, avoid tigers and otherwise deal with the reality effectively.)

    But can we call such 'hardware-accelerated' decisions rational? I don't know.

    Regarding your question. We had obvious disagreements with this guy, and I spent some time thinking about how we could resolve them. As a result, I decided that trying to resolve them (on a conscious level, of course) is futile unless we have an agreement about fundamental things -- what we define as truth, and which methods we can use to derive truths from other truths.

    I didn't think much about this issue before I met him (a scientific, or more specifically, Popperian worldview was enough for me), and this was the first time I had to consciously think about the issue. I even doubt I knew the meaning of the term 'epistemology' back then :)

    Vladimir_Golovin: Rings, what groups did you have in mind?
    Annoyance: I have also noticed that people who are good at manipulating and interacting with people are bad at manipulating and interacting with objective reality, and vice versa. The key difference is that the politicals are ultimately dependent on the realists, but not vice versa.

    Unfortunately, this hasn't aged very impressively.

    Despite the attempts to build the promised dojo (CFAR, Leverage/Paradigm, the EA Hotel, Dragon Army, probably several more that I'm missing), rationalists aren't winning in this way. The most impressive result so far is that a lot of mid-tier powerful people read Slate Star Codex, but I think most of that isn't about carrying on the values Eliezer is trying to construct in this sequence - Scott is a good writer on many topics, most of which are at best rationality-adjacent. The second most impressive result is the power of the effective altruism movement, but that's also not the same thing Eliezer was pointing at here. 

    The remaining positive results of the 2009 rationality community are a batch of happy group houses, and MIRI chugging along its climb (thanks to hard-to-replicate personalities like Eliezer and Nate).

    I think the "all you need is to try harder" stance is inferior to the "try to make a general postmortem of 'rationalist dojo' projects in general" stance, and I'd like to see a systematic attempt at the latter, assembling public information and interviewing people in all of these groups, and integrating all the data on why they failed to live up to their promises.

    Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

    I'm relatively new to rationality, but I've been a nihilist for nearly a decade. Since I started taking the development of my own morality seriously, I've put about 3500 hours of work into developing and strengthening my ethical framework. Looking back at myself when nihilism was just a hobbyhorse, I wasn't noticeably moral, and I certainly wasn't happy. I was a guy who knew things, but the things I knew never got put into practice. 5 years later, I'm a completely different person than I was when I started. I've made a few discoveries, but not nearly enough to account for the radical shifts in my behavior. My behavior is different because I practice.

    I know a few other nihilists. They post pictures of Nietzsche on Facebook, come up with clever arguments against religion, and have read "The Anti-Christ." They aren't more moral...

    While developing a rationality metric is obviously crucial, I have this nagging suspicion that what it may take is simply a bunch of committed wanna-be rationalists to just get together and, well, experiment and teach and argue, etc with each other in person regularly, try to foster explicit social rules that support rather than inhibit rationality, and so on.

    From there, at least use a fuzzy "this seems to work / not work" type metric, even if it's rather subjective and imprecise, as a STARTING POINT, until one can do it more precisely -- until one gets a better sense of exactly what to look for, explicitly.

    But, my main point is my suspicion that "do it, even if you're not entirely sure yet what you're doing, just do it anyways and try to figure it out on the fly" may actually be what it takes to get started. If nothing else, it'll produce a nice case study in failure that one can at least look at and say "okay, let's actually try to work out what we did wrong here."

    EDIT: hrm... maybe I ought to reconsider my position. Will leave this up, at least for now, but with the added note that now I'm starting to suspect myself of basically just trying to "solve the problem without having to, well, actually solve the problem."

    billswift (15y):
    Before you consider taking this down, you might want to read Thomas Sowell's "A Conflict of Visions" and "Knowledge and Decisions". Some (many) problems cannot be "solved" but must be continually "worked on"; I suspect most "self-improvement" programs are of this type. (Sowell doesn't address this directly, since his discussion and examples are all from economic and social problems, but I think they're applicable.)
    Regex (8y):
    I've been predicted! This almost exactly describes what I've been up to recently... (Will make a post for it later. Still far too rough to show off. Anyone encountering this comment in 2016 or later should see a link in my profile. Otherwise, message me.) Edit: Still very rough, and I ended up going in a slightly different direction than I'd hoped. Strange looking at how much my thoughts of it changed in a mere two months. Here it is

    Every dojo has its sensei. There is a need for a curriculum, but also for skilled teachers to guide the earnest student. LessWrong and Overcoming Bias have, to some extent, been the dojo in which the students train. I think that you may find a lot of value in just jumping into a project like this: starting a small school that meets twice a week to practice a particular skill of rationality. A key goal for the budding school is to train the next generation of teachers.

    One of my barriers to improving my rationality is little awareness of what the good reading and study m...

    pjeby (15y):
    More precisely, what is rationality's method for scoring matches? If you don't have that, you have no way to know whether the flying guillotine is any good, or whether you're even getting better at what you're doing within your own school. To me, the score worth caring about most is how many of your own irrational beliefs, biases, conditioned responses, etc., you can identify and root out, using verifiable criteria for their removal, as opposed to simply being able to tell that it would be a good idea to think differently about something. (Which is why I consider Eliezer "formidable", as opposed to merely "smart": his writing shows evidence of having done a fair amount of this kind of work.) Unfortunately, this sort of measurement is no good for scoring matches, unless the participants set out at the beginning of the "match" to prove that they were more wrong than their opponent! But then, neither is any other sort of competitive measurement any good, as far as I can see. If you use popularity, then you are subject to rhetorical effects, apparent intelligence, status, and other biasing factors. If you use some sort of reality-based contest, the result needn't necessarily correlate with rationality or thinking skills in general. And if you present a puzzle to be solved, how will you judge the solution, unless you're at least as "formidable" as the competitors?
    NancyLebovitz (14y):
    Any system of measurement is subject to Goodhart's Law. This is really rough when you're trying to engage with reality.

    For a nice literary description of what it means to have an "aura of awesome" try "The String Theory" by David Foster Wallace. Wallace writes of a mid-level pro tennis player: "The restrictions on his life have been, in my opinion, grotesque... But the radical compression of his attention and sense of himself have allowed him to become a transcendent practitioner of an art."

    Perhaps in the future humans will achieve the same level of excellence at the Art of Rationality as some currently do at the Art of Tennis.

    http://www.esquire.com/features/sports/the-string-theory-0796

    On a side note, we have religious schools where a religion, such as Christianity, is part of the curriculum. This indoctrinates young minds very early in their lives, and leaves them scarred and biased, in most cases for the rest of their existence.

    If we had, on the other hand, schools where even just the basics of rationality and related topics, such as game theory, economics, the scientific method, probability, biases, etc., were taught, what a difference it would make.

    The sooner you kickstart rationality in a person, the longer they have to learn and practice it, obv...

    taryneast (13y):
    To begin with, I'd like to see a set of things one can teach one's own kids to lead them to a more rational basis as they grow up. Kind of a rationality inoculation? :)
    [anonymous] (15y):

    deleted

    David_Gerard (13y):
    Add "chimpanzee tribal politics" and "large-scale human politics" and you might be onto a winner. The PPE curriculum is worth drawing on heavily, for example - PPE plus science and technology, perhaps. But we could easily end up with a ten-year undergraduate degree :-)

    I see what you're saying about rationality being trained in a pure fashion (where engineering, and the sciences in general, are, hopefully, "applied rationality"). One thing I don't see you mention here, though it was a theme in your Three Worlds story and is also a factor in martial arts training, is emotional management. That's crucial for rationality, since it will most likely be our feelings that lead us astray. Look at how the feeling of "trust" did in Madoff's investors. Muay Thai and Aikido deal with emotions differently, but e...

    khafra (13y):
    I wonder if this comment inspired Patrissimo's inaugural post on his new Rational Poker site.
    zaph (13y):
    I'll be happy to take a cut if the RP folks are so inclined :) But I think emotional management in poker and games in general is important to succeed in those arenas, and underscores the need for this component in rationality training.

    Just an observation: Few modern American karate schools ever let you hit someone, except when a lot of padding is involved. Fighting is not usually an element in exams below the blackbelt level. Competition is usually optional and not directly linked to advancement. I've seen students attain advanced belts without having any real-life fighting ability.

    (The term "dojo" is Japanese, and I think most Japanese martial artists study Judo or Aikido, which are not subject to these criticisms.)

    roland (15y):
    You are looking at the wrong art, Phil; go to a boxing or Muay Thai school and you will see real hitting. Btw, as a martial artist myself I don't consider karate a serious martial art, partly for the reasons you stated, although I think there is full-contact karate, which is a serious art. PS: If you are looking for a good martial art, look for one where the training involves a lot of realistic sparring. IMHO there should be sparring almost every time you train.
    Psy-Kosh (15y):
    Which criticism? If you mean to say Aikido is competitive, well, depending on which flavor, it often doesn't have much in the way of competition... as such. The training method involves people pairing up and basically taking turns attacking and defending, with the "defending" person being the one actually doing whatever technique is in question, but the "attacker" is supposed to allow it, or at least not overly resist or fight back. Or did I misunderstand?
    ABranco (14y):
    There's a question begging to be asked here: what is a good martial art? Is it one that brings inner calm and equilibrium in itself? Or one that is effective in keeping aggression away? Not that those aren't correlated, but some martial arts excel more in the former, and many of them evolved in the environment of feudal Japan. I doubt the exuberance and aesthetics of most of those arts prove effective, however, against the dangers of modern cities. In this sense, something much less choreographic and devoid of ancient philosophy, such as the straightforward and objective Israeli self-defense system krav maga, seems to be much more effective. What is curious here is: a great deal of krav maga training involves lots of restraint, since hitting "for real" would mean fractured necks or destroyed testes. So there's no competition, either. Can it be that in martial arts there's somehow an inverse correlation between the potential for real-life damage (and therefore effectiveness) and the realism with which the training is executed?
    Douglas_Knight (14y):
    No empty-handed martial arts are extant from feudal Japan. They were illegal then, thus secret.
    taryneast (13y):
    jujitsu is an empty-handed martial art of the Koryu (or traditional) school. (according to wikipedia) :)
    Douglas_Knight (13y):
    Yes, jiujitsu is an exception. I learned that sometime in the past two years, but failed to update my comment ;-) The precise statement is that samurai had a monopoly on force and it was illegal for others to learn martial arts. Thus extant feudal Japanese martial arts were for samurai. Sometimes samurai were unarmed, hence jiujitsu, though it assumes both combatants are heavily armored. What I really meant in my comment was that karate was imported around the end of the shogunate and that judo and aikido were invented around 1900. However, they weren't invented from scratch, but adapted from feudal jiujitsu. They probably have as much claim to that tradition as brand-name jiujitsu. In any event, jiujitsu probably wasn't static or monolithic in 1900, either.
    PhilGoetz (14y):
    Yes. Certainly for judo vs. most other martial arts. (Although I wouldn't call judo ineffective - it can be used in many situations where you wouldn't use other martial arts at all.)
    Mercurial (12y):
    I'd be really interested in hearing what those circumstances are. I usually make the same claim about Aikido (e.g., you probably don't want to crush Uncle Mortimer's trachea just because he happened to grab a knife in his drunken stupor).
    khafra (12y):
    I'd call the reality-joint-cleaving line the one between adrenaline-trigger training and adrenaline control training. Most training in traditional arts like Kuntao Silat and modern ones like the now-deprecated USMC LINE system involves using fear and stress as a trigger to start a sequence of techniques that end with disabling or killing the attacker. Most training in traditional arts like Tai Chi and (more) modern ones like Aikido involve retaining the ability to think clearly and act in situations where adrenaline would normally crowd out "system 2" thinking. Any art can be trained in either way. A champion boxer would probably be calm enough to use a quick, powerful jab and knock the knife out of Uncle Mortimer's hand in a safe direction. A Marine with PTSD might use the judo-like moves from the LINE system to throw him, break several bones, and stomp on his head before realizing what he was doing. A less discrete way to look at it adapts the No Free Lunch theorem: A fighting algorithm built for a specific environment like a ring with one opponent and a limited set of moves, or a field of combat with no legal repercussions and unskilled opponents, can do well in their specific setting. A more general fighting algorithm will perform more evenly across a large variety of environments, but will not beat a specialized algorithm in its own setting unless it's had a lot more training.
    Mercurial (12y):
    That is an excellent point. My father and I still sometimes get into debates that pivot on this. He says that in a real fight your fight-or-flight system will kick in, so you might as well train tense and stupid since that's what you'll be when you need the skills. But I've found that it's possible to make the sphere of things that don't trigger the fight-or-flight system large enough to encompass most altercations I encounter; it's definitely the harder path, but it seems to have benefits outside of fighting skill as well. Possibly! I think that in the end, what I most care about in my art is that I can defend myself and my family from the kinds of assaults that are most likely. I'm not likely to enter any MMA competitions anytime soon, so I'm pretty okay with the possibility that my survival skills can't compete with MMA-trained fighters in a formal ring.

    Isn't this a description of what a liberal arts education is supposed to provide? The skills of 'how to think', not 'what to think'? I'm not too familiar with the curriculum, since I did not attend a liberal arts college; instead I was conned into an overpriced private university. But if anyone has more info, please chime in.

    David_Gerard (13y):
    That's what a liberal arts curriculum was originally intended to teach, yes - it's just a bit out of date. An updated version would be worth working out and popularising.

    Has Eliezer made explicit updates about this? Maybe @Rob Bensinger knows. If he has, I'd like to see it posted prominently and clearly somewhere. Either way, I wonder why he doesn't mention it more often. Maybe he does, but only in fiction.

    [...] I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.

    Does this come up in the Dath Ilan stories?

    There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, "Make the students practice thi

    ...

    What kinds of tests or contests might we have? One that I can think of would be to have students try to create some visible, small scale effect in a society, with points for efficiency.

    Eliezer raises the issue of testing a rationality school. I can think of a simple way to at least approach this: test the students for well-understood cognitive biases. We have tests for plenty of biases; some of the tests don't work if you know about them, which surely these students will, but some do, and we can devise new tests.

    For example, you can do the classic test of confirmation bias where you give someone solid evidence both for and against a political position and see if they become more or less certain. Even people who know about this experiment often still fall prey to it; if they don't, they have demonstrated their ability to escape confirmation bias.

    As a thought, could it be that one of the major obstacles standing in the way of the creation of a "rationality dojo" is the public perception (however inaccurate) that such already exists in not just one but multiple forms? Take the average high school debate club as one example: participants are expected to learn to give a reasoned argument, and to avoid fallacious reasoning while recognizing it in their opponents. Another example would be maths classes, wherein people are expected to learn how to construct a sound mathematical proof. I very much doubt that most people would understand the distinction between these and the proposed "rationality dojo", which would make it very hard to establish one.

    ChristianKl (7y):
    In 2009 there were no rationality dojos, but today there are multiple ones in different cities. Debate in clubs like this is about finding good arguments; it's not about finding out which side is right.
    hcutter (7y):
    I'm new here, and I still have a great deal of content to catch up on reading, so it would be helpful if you could clarify: are you here referring to the Less Wrong meetup groups as "rationality dojos", or something else which has been created to fill this void since 2009? I thought I had been very careful to draw a clear distinction between what such clubs are about and actual rationality, while still contending that the perception of the average person (the non-rationalist) is that they are the same. Was I unclear? And, if so, how could I have been more clear than I was?
    ChristianKl (7y):
    I'm not referring to regular meetups. CFAR started having a weekly event they called a dojo. Given that blueprint, other cities have started similar groups. In Berlin we have a weekly dojo. Australia seems to have a monthly dojo in Melbourne and in Sydney. Ohio also has a dojo: http://rationality-dojo.com/ Okay, I should have been more clear. It doesn't matter what the average person thinks. A group doesn't need the average person to become a member to be successful. There just need to be enough people who care enough about the idea to become members. If a group provides value to its members and the members tell other people about it, it can grow.
    hcutter (7y):
    Thank you for clarifying. I wasn't aware of those, and to be honest they seem a bit difficult to find information about via Less Wrong as a new reader. Meetups are publicized in the sidebar, but there is nothing about these dojos, not even under the About section's extensive list of links. That surprises me, if the creation of these dojos was a goal of Eliezer's from his very first blog post here. If appealing to those who already care about rationality, followed by word-of-mouth advertising, is the approach the dojos have decided to take, rather than a more general appeal to the populace as part of raising the sanity waterline, then I concede the point.

    It's easy to define success in martial arts. Defining 'rationality' is harder. Have you done so yet, Eliezer?

    Even in martial arts, many of the schools of thoughts are essentially religions or cults, completely unconcerned with fighting proficiency and deeply concerned with mastering the arcane details of a sacred style passed on from teacher to student.

    Such styles often come with an unrealistic conviction that the style is devastatingly effective, but there is little concern with testing that.

    See also: http://www.toxicjunction.com/get.asp?i=V2741

    I've ...

    Eliezer Yudkowsky (15y):
    Already defined "rationality" in passing in the second sentence of the article, just in case someone came in who wasn't familiar with the prior corpus. You, of course, are familiar with the corpus and the amount of work I've already put into defining rationality; and so I have made free to vote down this comment, because of that little troll. I remind everyone that anything with a hint of trollishness is a fair target for downvoting, even if you happen to disagree with it.
    Lee_A_Arnold (15y):
    Eliezer, what do you say about someone who believed the world is entirely rational and then came to theism from a completely rational viewpoint, such as Kurt Gödel did?
    Eliezer Yudkowsky (15y):
    I'd say, "take it to the Richard Dawkins forum or an atheism IRC channel or something, LW is for advanced rationality, not the basics".
    Lee_A_Arnold (15y):
    Surely Gödel came to it through a very advanced rationality. But I'm trying to understand your own view. Your idea is that Bayesian theory can be applied throughout all conceptual organization?
    Eliezer Yudkowsky (15y):
    My view is that you should ask your questions of some different atheist on a different forum. I'm sure there will be plenty willing to debate you, but not here.
    Lee_A_Arnold (15y):
    I'm not a theist, and so you have made two mistakes. I'm trying to find out why formal languages can't follow the semantics of concepts through categorial hierarchies of conceptual organization. (Because if they had been able to do so, then there would be no need to train in the Art of Rationality -- and we could easily have artificial intelligence.) The reason I asked about Gödel is because it's a very good way to find out how much people have thought about this. I asked about Bayes because you appear to believe that conditional probability can be used to construct algorithms for semantics -- sorry if I've got that wrong.
    [anonymous] (15y):
    "Fat chance."
    Vladimir_Golovin (15y):
    Yes, I heard such stories as well (edit: and recently read an article discussing real-world performance of Chinese and Japanese soldiers in melee/H2H combat). This is one of the reasons why I think that performance in the real world is a better way to measure success at rationality than any synthetic metric.

    Why aren't rationalists more formidable? Because it takes more than rationality to be formidable. There's also intelligence, dedication, charisma, and other factors, which rationality can do little to improve. Also, formidability is subjective, and I suspect that more intelligent people are less likely to find others formidable. As for why there isn't an art of rationality, I think it's because people can be divided into two groups: those who don't think rationality is particularly important and don't see the benefits of becoming more rational, and those who see rationality as important but are already rational for the most part, and for them, additional rationality training isn't going to result in a significant improvement.
