
A Sense That More Is Possible

61 Post author: Eliezer_Yudkowsky 13 March 2009 01:15AM

Previously in series: Raising the Sanity Waterline
Followup to: Teaching the Unteachable

To teach people about a topic you've labeled "rationality", it helps for them to be interested in "rationality".  (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)

And when people explain why they're not interested in rationality, one of the most commonly proffered reasons is something like:  "Oh, I've known a couple of rational people and they didn't seem any happier."

Who are they thinking of?  Probably an Objectivist or some such.  Maybe someone they know who's an ordinary scientist.  Or an ordinary atheist.

That's really not a whole lot of rationality, as I have previously said.

Even if you limit yourself to people who can derive Bayes's Theorem—which is going to eliminate, what, 98% of the above personnel?—that's still not a whole lot of rationality.  I mean, it's a pretty basic theorem.
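
The theorem really is that basic; the whole update it describes fits in a few lines. A minimal sketch, using the standard textbook screening numbers as an assumed example (nothing here is specific to this post):

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes's Theorem: P(H|E) = P(E|H) P(H) / P(E),
    where P(E) = P(E|H) P(H) + P(E|~H) P(~H)."""
    p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_evidence

# Classic screening example: 1% base rate, 80% true-positive rate,
# 9.6% false-positive rate.
print(round(posterior(0.01, 0.80, 0.096), 3))  # -> 0.078
```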

Since the beginning I've had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.

But when I look around me in the real world, I don't see that.  Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, Tooby & Cosmides.  A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it's not cool to care that much.  I can see that they've found a rhythm, a unity that begins to pervade their arguments—

Yet even that... isn't really a whole lot of rationality either.

Even among those few who impress me with a hint of dawning formidability—I don't think that their mastery of rationality could compare to, say, John Conway's mastery of math.  The base knowledge that we drew upon to build our understanding—if you extracted only the parts we used, and not everything we had to study to find it—it's probably not comparable to what a professional nuclear engineer knows about nuclear engineering.  It may not even be comparable to what a construction engineer knows about bridges.  We practice our skills, we do, in the ad-hoc ways we taught ourselves; but that practice probably doesn't compare to the training regimen an Olympic runner goes through, or maybe even an ordinary professional tennis player.

And the root of this problem, I do suspect, is that we haven't really gotten together and systematized our skills.  We've had to create all of this for ourselves, ad-hoc, and there's a limit to how much one mind can do, even if it can manage to draw upon work done in outside fields.

The chief obstacle to doing this the way it really should be done is the difficulty of testing the results of rationality training programs, so that you can have evidence-based training methods.  I will write more about this, because I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.

There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, "Make the students practice this for an hour, then test them two weeks later."  Not, "Run half the signups through version A of the three-month summer training program, and half through version B, and survey them five years later."  You can see, here, the implied amount of effort that I think would go into a training program for people who were Really Serious about rationality, as opposed to the attitude of taking Casual Potshots That Require Like An Hour Of Effort Or Something.
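
The experiment described above can be sketched in code; this is a hypothetical illustration only, since no such training program or follow-up survey exists (all the names here are invented):

```python
import random

def compare_programs(signups, train_a, train_b, survey, seed=0):
    """Hypothetical sketch: randomly assign signups to version A or B
    of a training program, then compare each group's average surveyed
    outcome (in the real proposal, surveyed five years later)."""
    rng = random.Random(seed)
    shuffled = signups[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    group_a, group_b = shuffled[:half], shuffled[half:]
    for person in group_a:
        train_a(person)
    for person in group_b:
        train_b(person)
    mean = lambda group: sum(survey(p) for p in group) / len(group)
    return mean(group_a), mean(group_b)
```

Here `train_a`, `train_b`, and `survey` are stand-ins; the hard part the post points at is that `survey`, a valid measure of rationality years later, is precisely what nobody has yet.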

Daniel Burfoot brilliantly suggests that this is why intelligence seems to be such a big factor in rationality—that when you're improvising everything ad-hoc with very little training or systematic practice, intelligence ends up being the most important factor in what's left.

Why aren't "rationalists" surrounded by a visible aura of formidability?  Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought?  Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

Of this there are several answers; but one of them, surely, is that they have received far less systematic training in rationality than a first-dan black belt receives in hitting people.

I do not except myself from this criticism.  I am no beisutsukai, because there are limits to how much Art you can create on your own, and how well you can guess without evidence-based statistics on the results.  I know about a single use of rationality, which might be termed "reduction of confusing cognitions".  This I asked of my brain, this it has given me.  There are other arts, I think, that a mature rationality training program would not neglect to teach, which would make me stronger and happier and more effective—if I could just go through a standardized training program using the cream of teaching methods experimentally demonstrated to be effective.  But the kind of tremendous, focused effort that I put into creating my single sub-art of rationality from scratch—my life doesn't have room for more than one of those.

I consider myself something more than a first-dan black belt, and less.  I can punch through brick and I'm working on steel on my way to adamantine, but I have a mere casual street-fighter's grasp of how to kick or throw or block.

Why are there schools of martial arts, but not rationality dojos?  (This was the first question I asked in my first blog post.)  Is it more important to hit people than to think?

No, but it's easier to verify when you have hit someone.  That's part of it, a highly central part.

But maybe even more importantly—there are people out there who want to hit, and who have the idea that there ought to be a systematic art of hitting that makes you into a visibly more formidable fighter, with a speed and grace and strength beyond the struggles of the unpracticed.  So they go to a school that promises to teach that.  And that school exists because, long ago, some people had the sense that more was possible.  And they got together and shared their techniques and practiced and formalized and practiced and developed the Systematic Art of Hitting.  They pushed themselves that far because they thought they should be awesome and they were willing to put their backs into it.

Now—they got somewhere with that aspiration, unlike a thousand other aspirations of awesomeness that failed, because they could tell when they had hit someone; and the schools competed against each other regularly in realistic contests with clearly-defined winners.

But before even that—there was first the aspiration, the wish to become stronger, a sense that more was possible.  A vision of a speed and grace and strength that they did not already possess, but could possess, if they were willing to put in a lot of work, that drove them to systematize and train and test.

Why don't we have an Art of Rationality?

Third, because current "rationalists" have trouble working in groups: of this I shall speak more.

Second, because it is hard to verify success in training, or which of two schools is the stronger.

But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.

And conversely they don't look at the lack of visibly greater formidability, and say, "We must be doing something wrong."

"Rationality" just seems like one more hobby or hobbyhorse, that people talk about at parties; an adopted mode of conversational attire with few or no real consequences; and it doesn't seem like there's anything wrong about that, either.


Part of the sequence The Craft and the Community

Next post: "Epistemic Viciousness"

Previous post: "Raising the Sanity Waterline"

Comments (205)

Comment author: Vladimir_Golovin 13 March 2009 01:41:39PM *  30 points [-]

Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

Because they don't win? Because they don't reliably steer reality into narrow regions other people consider desirable?

I've met and worked with several irrationalists whose models of reality were, to put it mildly, not correlated with said reality, with one explicit, outspoken anti-rationalist with a totally weird, alien epistemology among them. All these people had a couple of interesting things in common.

On one hand, they were often dismal at planning – they were unable to see obvious things, and they couldn't be convinced otherwise by any arguments appealing to 'facts' and 'reality' (they universally hated these words).

On the other hand, they were surprisingly good at execution. All of them were very energetic people who didn't fear any work or situation at all, and I almost never saw any of them procrastinating. Could this be because their minds, due to their poor predictive ability, were unable to see the real difficulty of their tasks and thus avoided auto-switching into procrastination mode?

(And a third observation – all these people excelled in political environments. They tended to interpret their surroundings primarily in terms of who is kin to whom, who is a friend of whom, who is sexually attracted to whom, what others think of me, who is the most influential dude around here, etc. etc. What they lost due to their desynchronization with factual reality, they gained back thanks to their political aptness. Do rationalists excel in political environments?)

Comment author: Rings_of_Saturn 14 March 2009 12:16:36AM 4 points [-]


It seems you are being respectful of the anonymity of these people, and very well, that. But you pique my curiosity... who were these people? What kind of group was it, and what was their explicit irrationality all about?

I can think of a few groups that might fit this mold, but the peculiar way you describe them makes me think you have something very specific and odd in mind. Children of the Almighty Cthulhu?

Comment author: Vladimir_Golovin 14 March 2009 05:30:44PM *  26 points [-]

I’ll describe the three most interesting cases.

Number One is a Russian guy, now in his late 40s, with a spectacular youth. Among his trades were smuggling (during the Soviet era he smuggled brandy from Kazakhstan to Russia in the water system of a railway car), teaching in a ghetto college (where he inadvertently tamed a class of delinquents by hurling a wrench at their leader), leading a programming lab in an industrial institute, starting the first 3D visualization company in our city, reselling TV advertising time at a great margin (which he obtained by undercover deals involving key TV people and some outright gangsters), and saving the world by trying to find venture funding for a savant inventor who supposedly had a technology enabling geothermal energy extraction (I also worked together with them on this project). He was capable of totally crazy things, such as harpooning a wall portrait of a notorious Caucasus clanlord in a room full of his followers. He had lots of money during his successful periods, but was unable to convert this into a longer-term success.

Number Two is a deaf-mute woman, now in her 40s, who owns and runs a web development company. Her speech is distorted, she reads people by the lips, and I wouldn’t rate her particularly attractive – but despite all this she is able to consistently score excellent development / outsourcing contracts with top nationwide brands. Unfortunately, she often forces totally weird decisions upon the development team – and when they try to convince her that the decisions are rubbish by appealing to ‘facts’ and ‘reality’, she takes it as a direct attack on her status. A real example – she once actually imposed an official ban on criticizing her company, decisions of her company, employees and management of her company, partners of her company, products of her company and everything else directly or indirectly related to her company in all communication channels (Skype, bug trackers, IMs, phone conversations, forums etc.)!

Number Three is the most spectacular one – a Russian guy of Jewish descent, around 30, an avid status seeker with an alpha-male attitude who owns and runs several web / game outsourcing companies, plus has a high-level, high-status management/consultancy job in a well-known nationwide online company. He is almost always able to somehow secure funding for his companies and projects, including those which I personally wouldn’t consider marketable. He lives in several cities at once, is excellent at remote leadership and hiring, and is quick to act – when he learns of a talented programming or art team he could potentially partner with, he gets on a plane just to meet them in person.

It was this guy who made me seriously wonder about how immensely weird people’s worldviews can be. This guy constructs his worldview by cherry-picking pieces that appeal to his sense of truth from Abrahamic religions (excluding Islam and the Old Testament), Eastern teachings and, if I remember correctly, even fiction. He hates concepts like ‘logic’, ‘science’, ‘fact’ and ‘reality’ with a passion, and believes that they are evil concepts designed by Anglo-Saxons to corrode ‘good’ worldviews (such as the New-Testament Christian one), and he is actively protecting his worldview from being corroded by evil ideas. Here’s an actual example of his reasoning: “Dawkins is an Anglo-Saxon, and all Anglo-Saxons are evil liars, therefore all ideas Dawkins advocates are evil lies, therefore evolution is a lie and is evil.” He believes that The Enemy himself is sponsoring the evolutionary science by actually providing money, fame and other goods to its proponents. He is sincerely unable to understand how people can be genuinely altruistic without a religious upbringing (and of course, he doesn’t want to consider things like mirror neurons).

So, it was this guy who made me ask myself questions like ‘What is my definition of truth?’

Comment author: Rings_of_Saturn 14 March 2009 06:08:20PM 7 points [-]

Thanks, Vladimir. You have interesting friends!

Comment author: Vladimir_Nesov 14 March 2009 08:54:13PM 2 points [-]

How do you translate that into a question of definition of truth? The third guy is sufficiently rational to be successful, I guess he's got excellent native intelligence allowing him to correctly judge people or influence their decisions, and that his verbal descriptions of his beliefs are mostly rationalization, not hurting his performance too much. If he was a rationalist, he'd probably be even more successful (or he'd find a different occupation).

Comment author: Vladimir_Golovin 14 March 2009 09:39:02PM *  7 points [-]

Yes, the guy is smart, swift-thinking and quick to act when it comes to getting projects up from the ground, connecting the right people and getting funding from nowhere (much less so when it comes to technical details and fine-grained planning). His actual decisions are effective, regardless of the stuff he has in the conscious part of his head.

(Actually quite a lot of people whose 'spoken' belief systems are suboptimal or plain weird are perfectly able to drive cars, run companies, avoid tigers and otherwise deal with the reality effectively.)

But can we call such 'hardware-accelerated' decisions rational? I don't know.

Regarding your question. We had obvious disagreements with this guy, and I spent some time thinking about how we could resolve them. As a result, I decided that trying to resolve them (on a conscious level, of course) is futile unless we have an agreement about fundamental things -- what we define as truth, and which methods we can use to derive truths from other truths.

I didn't think much about this issue before I met him (a scientific, or more specifically, Popperian worldview was enough for me), and this was the first time I had to consciously think about the issue. I even doubt I knew the meaning of the term 'epistemology' back then :)

Comment author: Vladimir_Golovin 14 March 2009 06:20:01PM *  2 points [-]

I can think of a few groups that might fit this mold

Rings, what groups did you have in mind?

Comment author: Annoyance 13 March 2009 03:01:56PM 2 points [-]

I have also noticed that people who are good at manipulating and interacting with people are bad at manipulating and interacting with objective reality, and vice versa.

The key difference is that the politicals are ultimately dependent on the realists, but not vice versa.

Comment author: Yvain 13 March 2009 02:20:39AM *  72 points [-]

Eliezer, I have recommended to you before that you read The Darkness That Comes Before and the associated trilogy. I repeat that recommendation now. The monastery of Ishual is your rationalist dojo, and Anasurimbor Kellhus is your beisutsukai surrounded by a visible aura of formidability. The book might even give you an idea or two.

My only worry with the idea of these dojos is that I doubt the difference between us and Anasurimbor Kellhus is primarily a difference in rationality levels. I think it is more likely to be akrasia. Even an irrational, downright stupid person can probably think of fifty ways to improve his life, most of which will work very well if he only does them (quit smoking, quit drinking, study harder in school, go on a diet). And a lot of people with pretty well developed senses of rationality whom I know, don't use them for anything more interesting than winning debates about abortion or something. Maybe the reason rationalists rarely do that much better than anyone else is that they're not actually using all that extra brainpower they develop. The solution to that isn't more brainpower.

Kellhus was able to sit down, enter the probability trance, decide on the best course of action for the immediate future, and just go do it. When I tried this, I never found the problem was in the deciding - it doesn't take a formal probability trance to chart a path through everyday life - it was in following the results. Among the few Kellhus-worthy stories I've ever heard from reality was you deciding the Singularity was the most important project, choosing to devote your life to it, and not having lost that resolve fifteen years later. If you could bottle that virtue, it would be worth more than the entire Bayesian corpus combined. I don't doubt that it's positively correlated with rationality, but I do doubt it's a 1 or even .5 correlation.

Comment author: Eliezer_Yudkowsky 13 March 2009 05:05:34PM 22 points [-]

I think the akrasia you describe and methods of combating it would come under the heading of "kicking", as opposed to the "punching" I've been talking about. It's an art I haven't created or learned, but it's an art that should exist.

Comment author: AnnaSalamon 13 March 2009 07:06:19PM *  24 points [-]

This "art of kicking" is what pjeby has been working toward, AFAICT. I haven't read much of his writing, though. But an "art of kicking" would be a great thing to mix in with the OB/LW corpus, if pjeby has something that works, which I think he has at least some of -- and if we and he can figure out how to hybridize kicking research and training with punching research and training.

I'd also love to bring in more people from the entrepreneurship/sales/marketing communities. I've been looking at some of their better literature, and it has rationality techniques (techniques for not shooting yourself in the foot by wishful thinking, overconfidence, etc.) and get-things-done techniques mixed together. I love the sit-and-think math nerd types too, and we need sitting and thinking; the world is full of people taking action toward the wrong goals. But I'd expect better results from our rationalist community if we mixed in more people whose natural impulses were toward active experiments and short-term visible results.

Comment author: Yvain 13 March 2009 11:04:30PM *  19 points [-]

Pjeby's working on akrasia? I'll have to check out his site.

That brings up a related question that I think Eliezer hinted at: what pre-existing bodies of knowledge can we search through for powerful techniques so that we don't have to re-invent the wheel? Entrepreneurship stuff is one. Lots of people have brought up pick-up artists and poker, so those might be others.

I nominate a fourth that may be controversial: mysticism. Not the "summon demons" style of mysticism, but yoga and Zen and related practices. These people have been learning how to examine/quiet/rearrange their minds and sort out the useful processes from the useless processes for the past three thousand years. Even if they've been working off crazy metaphysics, it'd be surprising if they didn't come up with something. Eliezer talks in mystical language sometimes, but I don't know whether that's because he's studied and approves of mysticism or just likes the feel of it.

What all of these things need is a testing process combined with people who are already high-level enough that they can sort through all the dross and determine which techniques are useful without going native or opening themselves up to the accusation that they're doing so; ie people who can sort through the mystical/pick-up artist/whatever literature and separate out the things that are useful to rationalists from the things specific to a certain worldview hostile to our own. I've seen a few good people try this, but it's a mental minefield and they tend to end up "going native".

Comment author: HughRistik 14 March 2009 02:40:21AM *  22 points [-]

In the case of pickup literature, there is a lot to attract rationalists, but also a lot to inspire their ire.

The first thing rationalists should notice about pickup is that it wins. There are no other resources in mainstream culture or psychology that are anywhere near as effective. Yet even after witnessing the striking ability of pickup theories to win, I am hesitant to say that they are actually true. For example, I acknowledge the fantastic success of notions like "women are attracted to Alpha Males," even though I don't believe that they are literally true, and I know that they are oversimplifications of evolutionary psychology. Consequently, I am an instrumentalist, not a realist, about pickup theories.

If we started a project from scratch where we applied rationality to the domain of sex and relationships, and developed heuristics to improve ourselves in those areas, this project would have a considerable overlap with the teachings of the seduction community. At its best, pickup is "applied evolutionary psychology." Many of the common criticisms of pickup demonstrate an anger against the use of rationality and scientific thinking in the supposedly sacred and mystical area of sex and romance. Yet it falls prey to certain ideological notions that limit its general innovativeness and empirical exploration, and some of its techniques are morally questionable.

I would be happy to say more on the relationship between pickup and rationality at some point, and you can tell me how much I've "gone native."

Comment author: wedrifid 07 April 2011 06:02:01PM *  8 points [-]

For example, I acknowledge the fantastic success of notions like "women are attracted to Alpha Males," even though I don't believe that they are literally true, and I know that they are oversimplifications of evolutionary psychology.

I tune out whenever I hear the term 'alpha male' in that sort of context. The original scientific concept has been butchered and abused beyond all recognition. Even more so the 'beta' concept. Beta males are the ones standing right behind the alpha ready to overthrow him and take control themselves. 'Omega' should be the synonym for 'pussy'.

But I must admit the theory is at least vaguely in the right direction and works. Reasonably good as popular science for the general public. Better than what people believe about diet, showering, and dental hygiene.

Comment author: MBlume 14 March 2009 02:47:06AM 10 points [-]

Also, since this particular community leans altruistic, I'd hope that such a project would emphasize the future happiness of potential partners more than does (correct me if I'm wrong) the current pickup community.

Comment author: taryneast 07 April 2011 11:51:22AM 8 points [-]

Many of the common criticisms of pickup demonstrate an anger against the use of rationality and scientific thinking in the supposedly sacred and mystical area of sex and romance.

Actually, the best (and most common) criticisms I see are more due to the use of lies and manipulation in the area of sex and romance.

The evo-psych stuff (and thereby any science and rationality) is perfectly fine by me.

Comment author: Vaniver 07 April 2011 12:22:50PM *  -2 points [-]

This seems to me like criticizing the presence of lies in humor - that is, it's something normal and acceptable in practice but unsettling in theory.

Comment author: CuSithBell 07 April 2011 03:42:10PM 7 points [-]

We disagree.

You seem to be suggesting that lies and manipulation in pickup serve to lead the target to a desirable outcome they would not deliberately choose, as in humor. I and many others have repeatedly asserted here that this is not the case. There are pickup techniques that are simply not acceptable - attacking self-esteem, manufacturing breakups, etc.

You (collectively) need to abandon this soldier.

Comment author: wedrifid 07 April 2011 05:30:35PM 8 points [-]

You seem to be suggesting that lies and manipulation in pickup serve to lead the target to a desirable outcome they would not deliberately choose, as in humor. I and many others have repeatedly asserted here that this is not the case.

I assume you mean to include 'all' in there. Some pickup practitioners (and pickup strategies) do use lies and manipulation without consideration of whether the outcome is desirable (and the means appropriate.) That is a legitimate concern. It would certainly not be reasonable to assert this is the norm, which you didn't make clear in your declaration of repeated assertion.

There are pickup techniques that are simply not acceptable - attacking self-esteem

Here it is important to beware of other-optimizing. For the average Joe and Jane, a courtship protocol that involves attacking each other's self esteem would just be obnoxious and unpleasant. So I wouldn't 'accept', in that sense, self esteem lowering tactics on that kind of target. Yet for particularly high status folks within that kind of social game, self-esteem attacks are just how it is played - by both sexes. They attack the heck out of each other with social weapons to assure each other that they have the social prowess to handle each other. And they both love every minute of it. Of course, even if you take away 90% of their self esteem they probably still have more than enough left!

The biggest problem with self esteem attacking as a strategy comes when clumsy PUAs try to use a tactic that is appropriate for 10s on 6s and 7s (in terms of approximate rank in the dating social hierarchy). That is just unpleasant (not to mention ineffective). A related problem is confusing a gender-atypical girl with a gender-typical girl (often due to complete ignorance of the possibility of that kind of difference). Again, that will be unpleasant for the target in question - instead of exactly what she needs to facilitate a satisfying sexual encounter.

Rather than being 'simply not acceptable', pickup techniques that involve attacking self esteem are complexly not acceptable, depending on the context and parties involved.

manufacturing breakups

I am comfortable labelling individuals who do this as assholes, and will do anything possible to keep them out of my social circle and generally undermine their status.

You (collectively) need to abandon this soldier.

You collectively? Exactly which collective are you referring to here? It would be reasonable to level the gist of your objection at Vaniver - or at least his specific comment here. But if you mean to level it at the ancestor (by HughRistik) then you are totally missing the mark.

The biggest opportunity to improve discourse on these kinds of subjects - and to actually potentially benefit those participating in the dating game - is to abandon judgements on collectives.

Comment author: CuSithBell 07 April 2011 06:31:51PM 2 points [-]

I assume you mean to include 'all' in there. Some pickup practitioners (and pickup strategies) do use lies and manipulation without consideration of whether the outcome is desirable (and the means appropriate.) That is a legitimate concern. It would certainly not be reasonable to assert this is the norm, which you didn't make clear in your declaration of repeated assertion.

In context, I was responding to a generalization with a counter based on exceptions to a proposed rule. I agree there is variety within the pickup community. I disagree that it is uniformly a force for good - and thus that opposition to it is based on dislike for science.

Here it is important not to beware of other optimising. For the average Joe and Jane a courtship protocol that involves attacking each other's self esteem would just be obnoxious and unpleasant. [...]

You're right. I meant to indicate the case of attacking someone's self-esteem in order to make them feel bad (and become pliable), rather than to engage them in a duel of wits.

You collectively? Exactly which collective are you referring to here?

The posters on lesswrong who claim that opposition to pickup on lesswrong is due to women being uncomfortable with explicit analysis of social reality, or (relatedly) that pickup is a uniformly altruistic enterprise (wrt sexual partners).

It's only a judgment on a collective because it's a judgment on a position, and the collective is people who hold that position.

Comment author: Rings_of_Saturn 14 March 2009 12:50:06AM 9 points [-]


You've hit on something that I have long felt should be more directly addressed here/at OB. Full disclosure: I have already written a lot about this myself and am cleaning up some "posts" and chipping away here to get the karma to post them.

It's tough to talk about meditation-based rationality because (a) the long history of truly disciplined mental practice comes out of a religious context that is, as you note, comically bogged down in superstitious metaphysics, (b) it is a more-or-less strictly internal process that is very hard to articulate, and (c) it has become a kind of catch-all category for sloppy new-age thinking about a great number of things (wrongheaded pop quantum theory, anyone?)

Nevertheless, as Yvain notes, there is indeed a HUGE body of practice and tried-and-true advice, complete with levels of mastery and, if you have been lucky enough to know some of the masters, that palpable awesomeness Eliezer speaks of. I'm sure all of this sounds pretty slippery and poppish, but it doesn't have to be. One thing I would like to help get going here is a rigorous discussion, for my benefit and everyone's, about how we can apply the science of cognition to the practice of meditation and vice versa.

Comment author: Eliezer_Yudkowsky 16 March 2009 07:53:46AM 2 points [-]

Think you've got enough karma to post already.

Comment author: anonym 16 March 2009 07:08:53AM *  1 point [-]

There has been quite a bit of research in recent years on meditation, and the pace seems to be picking up. For a high level survey of recent research on the two primary forms of Buddhist meditation, I'd recommend the following article: Attention regulation and monitoring in meditation. PDF Here

Comment author: olimay 16 March 2009 05:49:57AM 6 points [-]

Yvain, do check out pjeby's work. I have to admit that at some points I found myself reading OB as a self-help attempt. I'm glad I kept up, but dirtsimple.org was the blog I was actually looking for.

Your point about mysticism is interesting, because I find pjeby's perspective on personal action and motivation has a strange isomorphism to Zen thought, even though that doesn't seem to be the main intention. In fact, his emphasis seems to be de-mystifying. One of his main criticisms of existing psychological/self-help literature is that the relatively good stuff is incomprehensible to the people who need it most, because they'd need to already be in a successful, rational action mindset in order to implement what's being said.

Anyway, I hope pjeby chimes in so he can offer something better than my incomplete summary...

Comment author: Vladimir_Golovin 13 March 2009 11:48:15AM *  9 points [-]

It doesn't take a formal probability trance to chart a path through everyday life - it was in following the results

Couldn't agree more. Execution is crucial.

I can come out of a probability trance with a perfect plan, an ideal path of least resistance through the space of possible worlds, but now I have to trick, bribe or force my messy, kludgy, evolved brain into actually executing the plan.

A recent story from my experience. I had (and still have) a plan involving a relatively large chunk of work, around a full-time month. Nothing challenging, just a 'sit down and do it' sort of thing. But for some reason my brain is unable to see how this chunk of work will benefit my genes, so it just switches into procrastination mode when exposed to this work. I tried to force myself to do it, but now I get an absolutely real feeling of 'mental nausea' every time I approach this task – yes, I literally want to hurl when I think about it.

For a non-evolved being, say an intelligently-designed robot, the execution part would be a non-issue – it gets a plan, it executes it as perfectly as it can, give or take some engineering inefficiencies. But for an evolved being trying to be rational, it's an entirely different story.

Comment author: Vladimir_Golovin 13 March 2009 12:48:17PM 13 points [-]

An idea on how to make the execution part trivial – a rational planner should treat his own execution module as a part of the external environment, not as a part of 'himself'. This approach will produce plans that take into account the inefficiencies of one's execution module and plan around them.
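One hypothetical way to picture this (all numbers invented for illustration): a planner that treats its own execution module as part of the environment discounts each candidate plan by how likely it is to actually be carried out, rather than assuming robot-perfect execution.

```python
# Hypothetical sketch (made-up numbers): treat your execution module as part
# of the environment by weighting each plan's payoff by the estimated chance
# you'll actually carry it out, instead of assuming perfect execution.

def expected_value(payoff, follow_through):
    """Payoff discounted by the odds the execution module cooperates."""
    return payoff * follow_through

plans = {
    "one heroic month of focused work": expected_value(100, 0.2),
    "same work in small, rewarding chunks": expected_value(80, 0.7),
}

# Pick the plan with the best execution-adjusted value.
best = max(plans, key=plans.get)
print(best)  # the humbler plan wins once execution is modeled
```

Under these assumed numbers, the ideal-but-unlikely plan loses to the humbler plan that the messy, evolved brain will actually follow through on.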

Comment author: thomblake 13 March 2009 09:15:22PM 4 points [-]

I hope you realize this is potentially recursive, if this 'execution module' happens to be instrumental to rationality. Not that that's necessarily a bad thing.

Comment author: Vladimir_Golovin 14 March 2009 06:32:52PM 3 points [-]

No, I don't (yet) -- could you please elaborate on this?

Comment author: Luke_A_Somers 24 March 2013 01:21:43AM 0 points [-]

Funny how this got rerun on the same day as EY posted about progress on Löb's problem.

Comment author: Psy-Kosh 13 March 2009 09:12:14PM 3 points [-]

Well, ideally one considers the whole of themselves when doing the calculations, but it does make the calculations tricky.

And that still doesn't answer exactly how to take it into account. ie, "okay, I need to take into account the properties of my execution module, find ways to actually get it to do stuff. How?"

Comment author: Nick_Tarleton 13 March 2009 09:25:02PM *  1 point [-]

However, treating the execution module as external and fixed may demotivate attempts to improve it.

(Related: Chaotic Inversion)

Comment author: RobinHanson 13 March 2009 01:23:57PM 14 points [-]

If one had public metrics of success at rationality, the usual status seeking and embarrassment avoidance could encourage people to actually apply their skills.

Comment author: Vladimir_Golovin 13 March 2009 01:52:38PM *  7 points [-]

Shouldn't a common-sense 'success at life' (money, status, free time, whatever) be the real metric of success at rationality? Shouldn't a rationalist, as a General Intelligence, succeed over a non-rationalist in any chosen orderly environment, according to any chosen metric of success -- including common metrics of that environment?

Comment author: Nick_Tarleton 13 March 2009 09:38:36PM 11 points [-]


  • If "general intelligence" is a binary classification, almost everyone is one. If it's continuous, rationalist and non-rationalist humans are indistinguishable next to AIXI.
  • You don't know what the rationalist is optimizing for. Rationalists may even be less likely to value common-sense success metrics.
  • Even if those are someone's goals, growth in rationality involves tradeoffs - investment of time, if nothing else - in the short term, but that may still be a long time.
  • Heck, if "rationality" is defined as anything other than "winning", it might just not win for common-sense goals in some realistic environments.
  • People with the disposition to become rationalists may tend to also not be as naturally good at some things, like gaining status.
Comment author: Vladimir_Golovin 14 March 2009 06:49:56PM *  5 points [-]


  1. Agreed. Let's throw away the phrase about General Intelligence -- it's not needed there.

  2. Obviously, if we're measuring one's reality-steering performance we must know the target region (and perhaps some other parameters like planned time expenditure etc.) in advance.

  3. The measurement should measure the performance of a rationalist at his/her current level, not taking into account time and resources he/she spent to level up. Measuring 'the speed or efficiency of leveling-up in rationality' is a different measurement.

  4. The definitions at the beginning of the original post will do.

  5. On one hand, the reality-mapping and reality-steering abilities should work for any activity, no matter whether the performer is hardware-accelerated for that activity or not. On the other hand, we should somehow take this into account -- after all, excelling at things one is not hardware-accelerated for is a good indicator. (If only we could reliably determine who is hardware-accelerated for what).

(Edit: cool, it does numeric lists automatically!)

Comment author: Annoyance 13 March 2009 03:14:10PM 4 points [-]

Public metrics aren't enough - society must also care about them. Without that, there's no status attached and no embarrassment risked.

To get this going, you'd also need a way to keep society's standards on-track, or even a small amount of noise would lead to a positive feedback loop disrupting its conception of rationality.

Everyone has at least a little bit of rationality. Why not simply apply yourself to increasing it, and finding ways to make yourself implement its conclusions?

Just sit under the bodhi tree and decide not to move away until you're better at implementing.

Comment author: roland 13 March 2009 07:16:05AM 2 points [-]


You make a great point here. AFAIK it is common knowledge that a lot of great intellectuals were great procrastinators. Overcoming one's bad habits is key. But I wonder what can be done in that regard, since so much is defined by genetics.

Comment author: Psy-Kosh 13 March 2009 03:59:25AM *  10 points [-]

While developing a rationality metric is obviously crucial, I have this nagging suspicion that what it may take is simply a bunch of committed wanna-be rationalists to just get together and, well, experiment and teach and argue, etc with each other in person regularly, try to foster explicit social rules that support rather than inhibit rationality, and so on.

From there, at least use a fuzzy 'this seems to work / this doesn't seem to work' type metric, even if it's rather subjective and imprecise, as a STARTING POINT, until one gets a better sense of exactly what to look for, explicitly.

But, my main point is my suspicion that "do it, even if you're not entirely sure yet what you're doing, just do it anyways and try to figure it out on the fly" may actually be what it takes to get started. If nothing else, it'll produce some nice case study in failure that at least one can look at and say "okay, let's actually try to work out what we did wrong here"

EDIT: hrm... maybe I ought to reconsider my position. Will leave this up, at least for now, but with the added note that now I'm starting to suspect myself of basically just trying to "solve the problem without having to, well, actually solve the problem"

Comment author: billswift 13 March 2009 05:11:34AM *  3 points [-]

Before you consider taking this down, you might want to read Thomas Sowell's "A Conflict of Visions" and "Knowledge and Decisions". Some (Many) problems cannot be "solved" but must be continually "worked on". I suspect most "self-improvement" programs are of this type. (Sowell doesn't address this, his discussion and examples are all from economic and social problems, but I think they're applicable.)

Comment author: Regex 11 October 2015 02:32:35AM *  0 points [-]

I've been predicted! This almost exactly describes what I've been up to recently... (Will make a post for it later. Still far too rough to show off. Anyone encountering this comment in 2016 or later should see a link in my profile. Otherwise, message me.)

Edit: Still very rough, and I ended up going in a slightly different direction than I'd hoped. Strange looking at how much my thoughts of it changed in a mere two months. Here it is

Comment author: Daniel_Burfoot 13 March 2009 04:21:16AM 6 points [-]

For a nice literary description of what it means to have an "aura of awesome" try "The String Theory" by David Foster Wallace. Wallace writes of a mid-level pro tennis player: "The restrictions on his life have been, in my opinion, grotesque... But the radical compression of his attention and sense of himself have allowed him to become a transcendent practitioner of an art."

Perhaps in the future humans will achieve the same level of excellence at the Art of Rationality as some currently do at the Art of Tennis.


Comment author: ryleah 28 February 2014 09:57:41PM *  4 points [-]

Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

I'm relatively new to rationality, but I've been a nihilist for nearly a decade. Since I've started taking developing my own morality seriously, I've put about 3500 hours of work into developing and strengthening my ethical framework. Looking back at myself when nihilism was just a hobbyhorse, I wasn't noticeably moral, and I certainly wasn't happy. I was a guy who knew things, but the things I knew never got put into practice. 5 years later, I'm a completely different person than I was when I started. I've made a few discoveries, but not nearly enough to account for the radical shifts in my behavior. My behavior is different because I practice.

I know a few other nihilists. They post pictures of Nietzsche on Facebook, come up with clever arguments against religion, and have read "The Anti-Christ." They aren't more moral just because they subscribe to an ethos that requires them to develop their own morality, and from that evidence I can assume that rationalists won't be more rational just because they subscribe to an ethos that demands they think more rationally. Changing your mind requires more than just reading smart things and agreeing with them. It requires practice.

In the spirit of put up or shut up, I'm going to make a prediction. My prediction is that if we keep track of how often we use a rationalist technique in the real world, we will find that frequency of use correlates with the frequency at which we visualize and act out using that technique. Once we start quantifying frequency of use, we'll be able to better understand how rationalism impacts our ability to reach our goals. Until we differentiate between enthusiasts and practitioners, we might as well be tracking whether liking a clever article on Facebook correlates with success.

Comment author: PhilGoetz 13 March 2009 05:30:07AM *  3 points [-]

Just an observation: Few modern American karate schools ever let you hit someone, except when a lot of padding is involved. Fighting is not usually an element in exams below the blackbelt level. Competition is usually optional and not directly linked to advancement. I've seen students attain advanced belts without having any real-life fighting ability.

(The term "dojo" is Japanese, and I think most Japanese martial artists study Judo or Aikido, which are not subject to these criticisms.)

Comment author: roland 13 March 2009 08:05:51AM *  4 points [-]

You are looking at the wrong art Phil, go to a boxing or Muay Thai school and you will see real hitting. Btw, as a martial artist myself I don't consider karate a serious martial art and part of that is for the reasons you stated. Although I think there is full-contact Karate which is a serious art.

PS: If you are looking for a good martial art look for one where the training involves a lot of realistic sparring. IMHO there should be sparring almost every time you train.

Comment author: Psy-Kosh 13 March 2009 06:22:28AM 2 points [-]

Which criticism? If you mean to say Aikido is competitive, well, depending on which flavor, it often doesn't have much in the way of competition... as such. The training method involves people pairing up and basically taking turns attacking and defending, with the "defending" person being the one actually doing whichever technique is in question, but the "attacker" is supposed to allow it, or at least not overly resist/fight back.

Or, did I misunderstand?

Comment author: ABranco 13 October 2009 04:03:55AM 1 point [-]

There's a question begging to be asked here: what is a good martial art? Is it one that brings inner calm and equilibrium in itself? Or one that is effective in keeping aggression away?

Not that those aren't correlated, but some martial arts excel more in the former and in the environment of feudal Japan. I doubt the exuberance and aesthetics of most of those arts prove effective, however, confronting the dangers of modern cities.

In this sense, something much less choreographic or devoid of ancient philosophy — such as the straightforward and objective Israeli self-defense krav maga — seems to be much more effective.

What is curious here is: a great deal of krav maga training involves lots of restraining, since hitting "for real" would mean fractured necks or destroyed testes. So there's no competition, either.

Can it be that in martial arts there's a somehow inverse correlation between the potential of real-life damage (and therefore effectiveness) and the realism by which the training is executed?

Comment author: Douglas_Knight 13 October 2009 04:15:39AM 0 points [-]

some martial arts excel more in the former and in the environment of feudal Japan.

No empty-handed martial arts are extant from feudal Japan. They were illegal then, thus secret.

Comment author: taryneast 07 April 2011 12:17:58PM *  2 points [-]

Jujitsu is an empty-handed martial art of the Koryu (or traditional) school (according to Wikipedia). :)

Comment author: Douglas_Knight 08 April 2011 12:28:41AM 1 point [-]

Yes, jiujitsu is an exception. I learned that sometime in the past two years, but failed to update my comment ;-)

The precise statement is that samurai had a monopoly on force and it was illegal for others to learn martial arts. Thus extant feudal Japanese martial arts were for samurai. Sometimes samurai were unarmed, hence jiujitsu, though it assumes both combatants are heavily armored.

What I really meant in my comment was that karate was imported around the end of the shogunate, and that judo and aikido were invented around 1900. However, they weren't invented from scratch, but adapted from feudal jiujitsu. They probably have as much claim to that tradition as brand-name jiujitsu. In any event, jiujitsu probably wasn't static or monolithic in 1900, either.

Comment author: PhilGoetz 13 October 2009 04:13:59AM 0 points [-]

Can it be that in martial arts there's a somehow inverse correlation between the potential of real-life damage (and therefore effectiveness) and the realism by which the training is executed?

Yes. Certainly for judo vs. most other martial arts. (Although I wouldn't call judo ineffective - it can be used in many situations where you wouldn't use other martial arts at all.)

Comment author: Mercurial 22 November 2011 01:36:48AM 1 point [-]

[Judo] can be used in many situations where you wouldn't use other martial arts at all.

I'd be really interested in hearing what those circumstances are. I usually make the same claim about Aikido (e.g., you probably don't want to crush Uncle Mortimer's trachea just because he happened to grab a knife in his drunken stupor).

Comment author: khafra 22 November 2011 09:28:12PM *  2 points [-]

I'd call the reality-joint-cleaving line the one between adrenaline-trigger training and adrenaline control training. Most training in traditional arts like Kuntao Silat and modern ones like the now-deprecated USMC LINE system involves using fear and stress as a trigger to start a sequence of techniques that end with disabling or killing the attacker. Most training in traditional arts like Tai Chi and (more) modern ones like Aikido involve retaining the ability to think clearly and act in situations where adrenaline would normally crowd out "system 2" thinking.

Any art can be trained in either way. A champion boxer would probably be calm enough to use a quick, powerful jab and knock the knife out of Uncle Mortimer's hand in a safe direction. A Marine with PTSD might use the judo-like moves from the LINE system to throw him, break several bones, and stomp on his head before realizing what he was doing.

A less discrete way to look at it adapts the No Free Lunch theorem: a fighting algorithm built for a specific environment, like a ring with one opponent and a limited set of moves, or a field of combat with no legal repercussions and unskilled opponents, can do well in its specific setting. A more general fighting algorithm will perform more evenly across a large variety of environments, but will not beat a specialized algorithm in its own setting unless it's had a lot more training.

Comment author: Mercurial 23 November 2011 04:22:08AM 3 points [-]

I'd call the reality-joint-cleaving line the one between adrenaline-trigger training and adrenaline control training.

That is an excellent point. My father and I still sometimes get into debates that pivot on this. He says that in a real fight your fight-or-flight system will kick in, so you might as well train tense and stupid since that's what you'll be when you need the skills. But I've found that it's possible to make the sphere of things that don't trigger the fight-or-flight system large enough to encompass most altercations I encounter; it's definitely the harder path, but it seems to have benefits outside of fighting skill as well.

A less discrete way to look at it adapts the No Free Lunch theorem...

Possibly! I think that in the end, what I most care about in my art is that I can defend myself and my family from the kinds of assaults that are most likely. I'm not likely to enter any MMA competitions anytime soon, so I'm pretty okay with the possibility that my survival skills can't compete with MMA-trained fighters in a formal ring.

Comment author: BrandonReinhart 13 March 2009 01:40:32AM *  9 points [-]

Every dojo has its sensei. There is a need for curriculum, but also skilled teachers to guide the earnest student. LessWrong and Overcoming Bias have, to some extent, been the dojo in which the students train. I think that you may find a lot of value in just jumping into a project like this: starting a small school that meets two times a week to practice a particular skill of rationality. A key goal to the budding school is to train the future's teachers.

One of my barriers to improving my rationality is little awareness of what the good reading and study material is. A curriculum of reading material -- rationalist homework -- would help me greatly. Furthermore, I have no friends that are similarly interested in the subject to bounce ideas off of or "train" with.

I train Jiu-Jitsu with several friends. We learn the same lessons, but learn at different rates. We discover different insights and share them. We practice techniques on each other and on opponents more and less skilled than ourselves. This dynamic is something rationalist dojos could benefit from.

Edit, Additional Comments:

The sense that a particular skill should be systematized and trained comes, in part, from the realization that the training conveys a measurable formidability. Problem #1 and Problem #2, then, are entangled: without a way to validate training, one cannot say "I have defeated my opponent because of my training," and it is the ability to demonstrate mastery that motivates others to become students.

Many readers of this site and Overcoming Bias are here because of the demonstration of budding formidability found in the insights in OB posts. We read insights and learn techniques -- sloppily, like learning to wrestle from a mail order program -- and we want to be able to produce similar insights and be similarly formidable.

And conversely they don't look at the lack of visibly greater formidability, and say, "We must be doing something wrong."

Is the "lack of visibly greater formidability" actually visible? The wandering master sees the local students' deficiencies that the students are blind to. It is only when those students' established leader is defeated by the wandering master that the greater formidability becomes apparent. Where is rationality's flying guillotine?

Comment author: pjeby 13 March 2009 04:09:01AM 4 points [-]

More precisely, what is rationality's method for scoring matches? If you don't have that, you have no way to know whether the flying guillotine is any good, or whether you're even getting better at what you're doing within your own school.

To me, the score worth caring about most, is how many of your own irrational beliefs, biases, conditioned responses, etc., you can identify and root out... using verifiable criteria for their removal... as opposed to simply being able to tell that it would be a good idea to think differently about something. (Which is why I consider Eliezer "formidable", as opposed to merely "smart": his writing shows evidence of having done a fair amount of this kind of work.)

Unfortunately, this sort of measurement is no good for scoring matches, unless the participants set out at the beginning of the "match" to prove that they were more wrong than their opponent!

But then, neither is any other sort of competitive measurement any good, as far as I can see. If you use popularity, then you are subject to rhetorical effects, apparent intelligence, status, and other biasing factors. If you use some sort of reality-based contest, the result needn't necessarily correlate with rationality or thinking skills in general. And if you present a puzzle to be solved, how will you judge the solution, unless you're at least as "formidable" as the competitors?

Comment author: NancyLebovitz 14 September 2010 05:12:49PM 3 points [-]

Any system of measurement is subject to Goodhart's Law. This is really rough when you're trying to engage with reality.

Comment author: infotropism 13 March 2009 02:28:49AM *  7 points [-]

On a side note, we have religious schools where a religion, such as Christianity, is part of the curriculum. This indoctrinates young minds very early in their life, and leaves them scarred, biased in most cases for the rest of their existence.

If we had, on the other hand, schools where even just basics of rationality and related topics, such as game theory, economics, scientific method, probabilities, biases, etc. were taught, what a difference it would make.

The sooner you kickstart rationality in a person, the longer they have to learn and practice it, obviously. But if those teachings are part of their formative experiences, from childhood to early adulthood, where their personalities, dreams and goals are being put together, how differently would they organize their lives ...

Comment author: taryneast 07 April 2011 12:09:21PM *  1 point [-]

To begin with, what I'd like to see is a set of things one can teach one's own kids to lead them to a more rational basis as they grow up. Kind of a rationality inoculation? :)

Comment author: MTGandP 01 November 2012 05:08:35PM 2 points [-]

Eliezer raises the issue of testing a rationality school. I can think of a simple way to at least approach this: test the students for well-understood cognitive biases. We have tests for plenty of biases; some of the tests don't work if you know about them, which surely these students will, but some do, and we can devise new tests.

For example, you can do the classic test of confirmation bias where you give someone solid evidence both for and against a political position and see if they become more or less certain. Even people who know about this experiment should often still fall prey to it—if they don't, they have demonstrated their ability to escape confirmation bias.
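As a hypothetical sketch of how such a test might be scored (the names, ratings, and the five-point threshold are all invented for illustration): each participant rates their certainty in a position before and after reading balanced evidence, and anyone whose certainty rises appreciably is flagged for attitude polarization, since rational updating on genuinely mixed evidence shouldn't increase certainty.

```python
# Minimal, hypothetical scoring sketch for the mixed-evidence test: each
# participant rates certainty in a position (0-100) before and after reading
# balanced evidence for and against it. A rise in certainty after balanced
# evidence is treated as a sign of attitude polarization (confirmation bias).

def polarization_score(before, after):
    """Certainty shift; positive values suggest polarization."""
    return after - before

def flag_polarized(ratings, threshold=5):
    """ratings: list of (name, before, after) tuples. Return the names
    whose certainty rose by more than `threshold` points."""
    return [name for name, before, after in ratings
            if polarization_score(before, after) > threshold]

# Invented data: two participants polarize; one updates toward uncertainty.
ratings = [("alice", 70, 85), ("bob", 60, 58), ("carol", 80, 90)]
print(flag_polarized(ratings))  # ['alice', 'carol']
```

A real instrument would need controls for regression to the mean and demand effects, but even a crude score like this makes the school's claim testable.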

Comment author: haig 13 March 2009 08:33:34AM *  2 points [-]

Isn't this a description of what a liberal arts education is supposed to provide? The skills of 'how to think', not 'what to think'? I'm not too familiar with the curriculum since I did not attend a liberal arts college; instead I was conned into an overpriced private university. But if anyone has more info, please chime in.

Comment author: David_Gerard 21 February 2011 02:15:16PM 3 points [-]

That's what a liberal arts curriculum was originally intended to teach, yes - it's just a bit out of date. An updated version would be worth working out and popularising.

Comment author: [deleted] 13 March 2009 11:14:09AM *  3 points [-]


Comment author: David_Gerard 21 February 2011 02:14:02PM -1 points [-]

If rationality is about winning by knowing the truth, and general intelligence is correlated with "positive life outcomes", then a training program should be based on the steps that are typically taken by smart people. So why not just train in probability theory, logic and science, and use regular exams as a measure of your "general rationality"?

Add "chimpanzee tribal politics" and "large-scale human politics" and you might be onto a winner. The PPE curriculum is worth drawing on heavily, for example - PPE plus science and technology, perhaps. But we could easily end up with a ten-year undergraduate degree :-)

Comment author: zaph 13 March 2009 10:24:25AM 3 points [-]

I see what you're saying about rationality being trained in a pure fashion (where engineering, the sciences in general, etc. are - hopefully - "applied rationality"). One thing I don't see you mention here, though it was a theme in your 3 Worlds story, and which is also a factor in martial arts training, is emotional management. That's crucial for rationality, since it will most likely be our feelings that lead us astray. Look at how the feeling of "trust" did in Madoff's investors. Muay Thai and Aikido deal with emotions differently, but each trains people to overcome their basic fear reactions with something else. An awesome rationalist, to me, would be someone who can maintain rationality when the situation is one of high emotion.

Comment author: khafra 08 April 2011 07:17:12PM 0 points [-]

I wonder if this comment inspired Patrissimo's inaugural post on his new Rational Poker site.

Comment author: zaph 09 April 2011 12:57:39AM 1 point [-]

I'll be happy to take a cut if the RP folks are so inclined :) But I think emotional management in poker and games in general is important to succeed in those arenas, and underscores the need for this component in rationality training.

Comment author: Annoyance 13 March 2009 02:57:37PM 0 points [-]

It's easy to define success in martial arts. Defining 'rationality' is harder. Have you done so yet, Eliezer?

Even in martial arts, many of the schools of thought are essentially religions or cults, completely unconcerned with fighting proficiency and deeply concerned with mastering the arcane details of a sacred style passed on from teacher to student.

Such styles often come with an unrealistic conviction that the style is devastatingly effective, but there is little concern with testing that.

See also: http://www.toxicjunction.com/get.asp?i=V2741

I've read a great many comments and articles by people talking about how karate black belts are being seriously beaten by people with real-world fighting experience - pimps, muggers, etc. Becoming skilled in an esoteric discipline is useful only if that discipline is useful.

Do not seek to establish yourself as a sensei. Do not seek to become a "master of the art". Instead, try to get better at fighting - or, in this case, thinking correctly - even if you don't get to wear a hood and chant about 'mysteries'.

Comment author: Vladimir_Golovin 13 March 2009 03:13:25PM *  5 points [-]

karate black belts are being seriously beaten by people with real-world fighting experience - pimps, muggers, etc

Yes, I heard such stories as well (edit: and recently read an article discussing real-world performance of Chinese and Japanese soldiers in melee/H2H combat). This is one of the reasons why I think that performance in the real world is a better way to measure success at rationality than any synthetic metric.

Comment author: Eliezer_Yudkowsky 13 March 2009 05:13:23PM 6 points [-]

Defining 'rationality' is harder. Have you done so yet, Eliezer?

Already defined "rationality" in passing in the second sentence of the article, just in case someone came in who wasn't familiar with the prior corpus.

You, of course, are familiar with the corpus and the amount of work I've already put into defining rationality; and so I have made free to vote down this comment, because of that little troll. I remind everyone that anything with a hint of trollishness is a fair target for downvoting, even if you happen to disagree with it.

Comment author: Lee_A_Arnold 13 March 2009 10:57:05PM 0 points [-]

Eliezer, what do you say about someone who believed the world is entirely rational and then came to theism from a completely rational viewpoint, such as Kurt Gödel did?

Comment author: Eliezer_Yudkowsky 13 March 2009 11:00:50PM 3 points [-]

I'd say, "take it to the Richard Dawkins forum or an atheism IRC channel or something, LW is for advanced rationality, not the basics".

Comment author: Lee_A_Arnold 13 March 2009 11:20:30PM 0 points [-]

Surely Gödel came to it through a very advanced rationality. But I'm trying to understand your own view. Your idea is that Bayesian theory can be applied throughout all conceptual organization?

Comment author: Eliezer_Yudkowsky 13 March 2009 11:32:10PM -1 points [-]

My view is that you should ask your questions of some different atheist on a different forum. I'm sure there will be plenty willing to debate you, but not here.

Comment author: Lee_A_Arnold 14 March 2009 12:56:30AM 1 point [-]

I'm not a theist, and so you have made two mistakes. I'm trying to find out why formal languages can't follow the semantics of concepts through categorial hierarchies of conceptual organization. (Because if they had been able to do so, then there would be no need to train in the Art of Rationality -- and we could easily have artificial intelligence.) The reason I asked about Gödel is because it's a very good way to find out how much people have thought about this. I asked about Bayes because you appear to believe that conditional probability can be used to construct algorithms for semantics -- sorry if I've got that wrong.

Comment author: ikrase 13 December 2012 08:55:13PM 1 point [-]

What kinds of tests or contests might we have? One that I can think of would be to have students try to create some visible, small scale effect in a society, with points for efficiency.

Comment author: blacktrance 28 February 2014 11:04:05PM -1 points [-]

Why aren't rationalists more formidable? Because it takes more than rationality to be formidable. There's also intelligence, dedication, charisma, and other factors, which rationality can do little to improve. Also, formidability is subjective, and I suspect that more intelligent people are less likely to find others formidable. As for why there isn't an art of rationality, I think it's because people can be divided into two groups: those who don't think rationality is particularly important and don't see the benefits of becoming more rational, and those who see rationality as important but are already rational for the most part, and for them, additional rationality training isn't going to result in a significant improvement.