So the second factor is that most people are rational enough for their own purposes. Oh, they go on wild flights of fancy when discussing politics or religion or philosophy, but when it comes to business they suddenly become cold and calculating. This relates to Robin Hanson on Near and Far modes of thinking. Near Mode thinking is actually pretty good at a lot of things, and Near Mode thinking is the thinking whose accuracy gives us practical benefits.
Seems to me that most of us make predictably dumb decisions in quite a variety of contexts, and that by becoming extra bonus sane (more sane/rational than your average “intelligent science-literate person without formal rationalist training”), we really should be able to do better.
Some examples of the “predictably dumb decisions” that an art of rationality should let us improve on:
I don't think you need the art of rationality much for that stuff. I think just being reminded is almost as good, if not better. Who do you think would do better on them: someone who read all of LW/OB except this post, or someone who read this post only? Now consider that reading all of LW/OB would take at least 256 times longer.
Learning about rationality won't necessarily help you realize where you're being irrational. If you've got a general method for doing that, I'd be interested, but I don't think it's been discussed much on this blog.
Don't try this on a date! (no lukeprog allowed)
Why not? Lukeprog's mistake, assuming you're talking about what I think you're talking about, seems to have been quite the opposite of trying to explain the benefits of an option from the other person's point of view:
So I broke up with Alice over a long conversation that included an hour-long primer on evolutionary psychology in which I explained how natural selection had built me to be attracted to certain features that she lacked.
I imagine he'd have had better luck, or at least not become the butt of quite so many relationship jokes on LW, if he'd gone with something like "you deserve someone who appreciates you better". Notice that from Alice's perspective, this describes exactly the same situation -- but in terms of what it means to her.
Imagine a world where the only way to become really rich is to win the lottery (and everybody is risk-averse or at least risk-neutral). With an expected return of less than $1 per $1 spent on tickets, rational people don't buy lottery tickets. Only irrational people do that. As a result, all the really rich people in this world must be irrational.
In other words, it is possible to have situations where being rational increases your expected performance, but at the same time reduces your chances of being a super achiever. Thus, the claim that "rationalists should win" is not necessarily true, even in theory, if "winning" is taken to mean being among the top performers. A more accurate statement would be, "In a world with both rational and irrational agents, the rational agents should perform better on average than the population average."
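A toy Monte Carlo sketch makes this concrete (the population model and every number in it are invented for illustration):

```python
import random

random.seed(0)

N = 100_000            # agents per group (invented population size)
TICKET_SPEND = 1_000   # lifetime lottery spend for the irrational group
JACKPOT = 1_000_000
P_WIN = 1 / 2_000      # expected payout $500 per $1,000 spent, i.e. < $1 back per $1

def wealth(buys_tickets: bool) -> float:
    # Ordinary life outcomes are identical for both groups.
    base = random.gauss(100_000, 20_000)
    if not buys_tickets:
        return base
    # Ticket-buyers pay up front; a tiny fraction hit the jackpot.
    won = random.random() < P_WIN
    return base - TICKET_SPEND + (JACKPOT if won else 0)

rational = [wealth(False) for _ in range(N)]
irrational = [wealth(True) for _ in range(N)]

print(f"mean wealth, rational:    {sum(rational) / N:>12,.0f}")
print(f"mean wealth, irrational:  {sum(irrational) / N:>12,.0f}")
print(f"richest rational agent:   {max(rational):>12,.0f}")
print(f"richest irrational agent: {max(irrational):>12,.0f}")
```

The rational group ends up wealthier on average, yet the very richest agents are all ticket-buyers, which is exactly the asymmetry described above.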
There's an extent to which we live in such a world. Many people believe you can achieve your wildest dreams if you only try hard enough, because by golly, all those people on the TV did it!
But many poor/middle-class people also believe that they can never become rich (except for the lottery) because the only ways to become rich are crime, fraud, or inheritance. And this leads them to underestimate the value of hard work, education, and risk-taking.
The median rationalist will perform better than these cynics. And his average wealth will also be higher, assuming he accurately assesses his chances of becoming successful.
And this is why I am not so impressed by Eliezer's claim that an x-rationality instructor should be successful in their non-rationality life. Yes, there probably are some x-rationalists who will also be successful people. But again, correlation 0.1. Stop saying only practically successful people could be good x-rationality teachers! Stop saying we need to start having huge real-life victories or our art is useless! Stop calling x-rationality the Art of Winning! Stop saying I must be engaged in some sort of weird signalling effort for saying I'm here because I like mental clarity instead of because I want to be the next Bill Gates! It trivializes the very virtues that brought most of us to Overcoming Bias, and replaces them with what sounds a lot like a pitch for some weird self-help cult...
I think the truth is non-symmetrical: rationalism is the art of not failing, of not being stupid. I agree with you that "rationalists should win big" is not true in the sense Eliezer claims. However, rationalists should be generally above average by virtue of never failing big, never losing too much, e.g. not buying every vitamin at the health food store, not in cults, not bemoaning ancient relationships, etc.
I'm not sure if it was your intent to point this out by contrast, but I would like to point out that a reasonable art of "kicking" would not rely on you making conscious decisions, let alone explicitly rational ones. Rather, it would rely on you ensuring that your subconscious has been freed from sources of bias ahead of time, and is therefore able to safely leap to conclusions in its usual fashion. An art that requires you to think at the time things are actually happening is not much of an art.
Case in point: when reading "Stuck In The Middle With Bruce", I became aware of a subconsciously self-sabotaging behavior I'd done recently. So I "kicked" it out by crosslinking the behavior with its goal-satisfaction state. It would be crazy to wait until the next occasion for that behavior to strike, and then try to reason my way around it, when I can just fix the bloody thing in the first place. (Interestingly, I mentioned the story to my wife, and described how it related to my own behavior... and she thought of a different sort of self-sabotage she was doing, and applied the same mindhack. So, as of now, I'd say that story was one of the top 5 most ...
I voted this up, but I'm replying because I think it's a critical point.
Our brains are NOT designed to make conscious decisions about every thing that crosses our path. Trying to do that is like trying to walk everywhere instead of driving: it's technically possible, but it will take you forever and will be exhausting.
Our brains seem to work more like this: they process whatever it is we're doing at the time, and then feed that processed data into our subconscious for use later. Sure, the conscious mind jumps in every once in a while for something important, but generally it sits back and lets your subconscious do the driving.
Rationality should be about putting the best processed information down into your subconscious, so it works the way you'd like it to. Trying to do everything consciously is a poor use of your brain, as it 1) ignores the way your brain is designed to function and 2) forgoes the use of the powerful subconscious circuitry that makes up an enormous part of it.
What does "crosslinking the behavior with its goal-satisfaction state" mean? Specifically, I'm unable to guess what you mean by "crosslinking" and "the goal-satisfaction state" (of a behavior).
The fact that everything I can find on the web carefully avoids giving details and instead takes the form "We have these fantastic techniques that can solve most of your problems; sign up for our seminars and we'll teach them to you" is ... not promising.
Promising the world, giving few details, and insisting on being paid before saying anything more, seems to me to be strongly correlated with dishonesty and cultishness. Since pjeby seems like a valuable member of this community, I hope this case happens to be different; but I'd like to see some evidence.
And for this post, I use "benefits" or "practical benefits" to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.
In my life, I've used rationality to tackle some pretty tough practical problems. The type of rationality I have been successful with hasn't been the debiasing program of Overcoming Bias, yet I have been applying scientific thinking, induction, and heuristics to certain problems in ways that are atypical for the category of people you are calling normal rationalists. I don't know whether to call this "x-rationality" or not, partly because I'm not sure the boundaries between rationality and x-rationality are always obvious, but it's certainly more advanced rationality than what people usually apply in the domains below.
On a general level, I've been studying how to get good (or at least, dramatically better) at things. Here are some areas where I've been successful using rationality:
I am highly familiar with the seduction community, and I've learned a lot from it. It's like extra-systemized folk psychology. It has certain elements of a scientific community, yet it is vulnerable to ideologies developing out of:
(a) bastardized versions of evolutionary psychology being thrown around like the proven truth, often leading to cynical and overgeneralized views of female behavior and preferences and/or overly narrow views of what works,
(b) financial biases,
(c) lack of rigor, because controlled experiments are not yet possible in this field (though I would never suggest that people wait until science catches up and gives us rigorous empirical knowledge before trying to improve their dating lives... who knows how long we will have to wait).
Yet there is promise for the community, because it's beholden to real-world results. Its descriptions and prescriptions seem to have been improving, and it has gone through a couple of paradigm shifts since the mid-80s.
Or from the general OB/LW picture, where inference is a thing that happens in material systems, and that yields true conclusions, when it does, for non-mysterious reasons that we can investigate and can troubleshoot?
One problem with interfacing formal/mathematical rationality with any "art that works", whether it's self-help or dating, is that when people are involved, there are feed-forward and feed-back effects, similar to Newcomb's problem, in a sense. What you predict will happen makes a difference to the outcome.
One of the recent paradigm shifts that's been happening in the last few years in the "seduction community" is the realization that using routines and patterns leads to state-dependence: that is, to a guy's self-esteem depending on the reactions of the women he's talked to on a given night. This has led to the rise of the "natural" movement: copying the beliefs and mindsets of guys who are naturally good with women, rather than the external behaviors of guys who are good with women.
Now, I'm not actually involved in the community; I'm quite happily married. However, I pay attention to developments in that field because it has huge over...
I'm not sure whether it's about being an epistemic vs. an instrumental rationalist, or about tagging your words so we can follow what you mean.
Both people interested in deep truths, and people interested in immediate practical mileage, can make use of both "true models" and "models that are pragmatically useful but that probably aren't fully true".
You know how a map of north America gives you good guidance for inferences about where cities are, and yet you shouldn't interpret its color scheme as implying that the land mass of Canada is uniformly purple? Different kinds of models/maps are built to allow different kinds of conclusions to be drawn. Models come with implicit or explicit use-guidelines. And the use-guidelines of “scientific generalizations that have been established for all humans” are different than the use-guidelines of “pragmatically useful self-models, whose theoretical components haven’t been carefully and separately tested”. Mistake the latter for the former, and you’ll end up concluding that Canada is purple.
When you try to share techniques with LW, and LW balks... part of the problem is that most of us LW-ers aren’t as practiced in contact-with-th...
One common theme is recognizing when your theories aren't working and updating in light of new evidence. Many people are so sure that their beliefs about what 'should' work when it comes to dating are correct that they will keep trying and failing without ever considering that maybe their underlying theory is wrong. A common exercise used in the community to break out of these incorrect beliefs is to force yourself to go out and try things that 'can't possibly work' 10 times in a day, and then every day for a week or a month, until the false belief is banished.
I actually think the LW crowd could learn something from this approach - sometimes all the argument in the world is not as convincing as repeated confrontations with real world results. When it comes to changing behaviour (a key aspect of allowing rationality to improve results in our lives), rational argument is not usually the most effective technique. Rational argument may establish the need for change and the pattern for new behaviour but the most effective way to change behavioural habits is to just start consciously doing the new behaviour until it becomes a habit.
In any rational art of dating in which I would be interested, "winning" would be defined to include, indeed to require, respect for the happiness, well-being, and autonomy of the pursued. I don't know enough about these sub-communities to say whether they share that concern -- what is the impression you've gotten?
roland:
So you have to be aware that there is a fundamental difference in the objectives of the two which will make it extremely difficult or impossible to make BOTH happy at the same time.
ciphergoth:
my experience very much contradicts what you say here.
That's because it's a great example of theory being used to persuade people to take a certain set of "actions that work". There are other theories that contradict those theories, that are used to get other people to take action... even though the specific actions taken may be quite similar!
People self-select their schools of dating and self-help based on what theories appeal to them, not on the actual actions those schools recommend taking. ;-)
In this case, the theory roland is talking about isn't theory at all: it's a sales pitch that attracts people who feel that dating is an unfair situation. They like what they hear, and they want to hear more. So they read more and maybe buy a product. The writer or speaker then gradually moves from this ev-psych "hook" to other theories that guide the reader to take the actions the author recommends.
That people confuse these sales pitches with actual theory is...
If in 1660 you'd asked the first members of the Royal Society to list the ways in which natural philosophy had tangibly improved their lives, you probably wouldn't have gotten a very impressive list.
Looking over history, you would not have found any tendency for successful people to have made a formal study of natural philosophy.
It would be overconfident for me to say rationality could never become useful. My point is just that we are acting like it's practically useful right now, without very much evidence for this beyond our hopes and dreams. Thus my last sentence - that "crossing the Pacific" isn't impossible, but it's going to take a different level of effort.
If in 1660, Robert Boyle had gone around saying that, now that we knew Boyle's Law of gas behavior, we should be able to predict the weather, and that that was the only point of discovering Boyle's Law and that furthermore we should never trust a so-called chemist or physicist except insofar as he successfully predicted the weather - then I think the Royal Society would be making the same mistake we are.
Boyle's Law is sort of helpful in understanding the weather, sort of. But it's step one of ten million steps, used alone it doesn't work nearly as well as just eyeballing the weather and looking for patterns, and any attempt to judge applicants to the Royal Society on their weather prediction abilities would have excluded some excellent scientists. Any attempt to restrict gas physics itself to things that were directly helpful in predicti...
I'm confused about this article. I agree with most of what you've said, but I'm not sure what the point is, exactly. I thought the entire premise of this community was that more is possible, but that we're only "less wrong" at the moment. I didn't think there was any promise of results from the current state of the art. Is this post a warning, or am I overlooking this trend?
I agree we shouldn't see x-rationality as practically useful now. You don't rule out rationality becoming the superpower Eliezer portrays in his fiction. That is certainly a long ways off. Boyle's Law and weather prediction is an apt analogy. Just trying harder to apply our current knowledge won't go very far, but there should be some productive avenues.
I think I'd understand your purpose better if you could answer these questions: In your mind, how likely is it that x-rationality could be practically useful in, say, 50 years? What approaches are most likely to get us to a useful practice of rationality? Or is your point that any advances that are made will be radically different from our current lines of investigation?
Just trying to understand.
The above would be component 1 of my own reply.
Component 2 would be (to say it again) that I developed the particular techniques that are to be found in my essays, in the course of solving my problem. And if you were to try to attack that or a similar problem you would suddenly find many more OB posts to be of immensely greater use and indeed necessity. The Eliezer of 2000 and earlier was not remotely capable of getting his job done.
What you're seeing here is the backwash of techniques that seem like they ought to have some general applicability (e.g. Crisis of Faith) but which are not really a whole developed rationalist art, nor made for the purpose of optimizing everyday life.
Someone faced with the epic Challenge Of Changing Their Mind may use the full-fledged Crisis of Faith technique once that year. How much benefit is this really? That's the question, but I'm not sure the cynical answer is the right one.
What I am hoping to see here is others, having been given a piece of the art, taking that art and extending it to cover their own problems, then coming back and describing what they've learned in a sufficiently general sense (informed by relevant science) that I can actually absorb it. As for what has been developed to address e.g. akrasia outside the rationalist line, I have found myself unable to absorb it.
But you're not a good test case to see whether rationality is useful in everyday life. Your job description is to fully understand and then create a rational and moral agent. This is the exceptional case where the fuzzy philosophical benefits of rationality suddenly become practical.
One of the fundamental lessons of Overcoming Bias was "All this stuff philosophers have been debating fruitlessly for centuries actually becomes a whole lot clearer when we consider it in terms of actually designing a mind." This isn't surprising; you're the first person who's really gotten to use Near Mode thought on a problem previously considered only in Far Mode. So you've been thinking "Here's this nice practical stuff about thinking that's completely applicable to my goal of building a thinking machine", and we've been thinking, "Oh, wow, this helps solve all of these complicated philosophical issues we've been worrying about for so long."
But in other fields, the rationality is domain-specific and already exists, albeit without the same thunderbolt of enlightenment and awesomeness. Doctors, for example, have a tremendous literature on evidence and decision-making as t...
An x-rationalist who becomes a doctor would not, I think, necessarily be a significantly better doctor than the rest of the medical world, because the rest of the medical world already has an overabundance of great rationality techniques and methods of improving care that the majority of doctors just don't use.
Evidence-based medicine was developed by x-rationalists. And to this day, many doctors ignore it because they are not x-rationalists.
...huh. That comment was probably more helpful than you expected it to be. I'm pretty sure I've identified part of my problem as having too high a standard for what makes an x-rationalist. If you let the doctors who developed evidence-based medicine in...yes, that clears a few things up.
One thinks particularly of Robyn Dawes - I don't know him from "evidence-based medicine" per se, but I know he was fighting the battle to get doctors to acknowledge that their "clinical experience" wasn't better than simple linear models, and he was on the front lines against psychotherapy that had been shown to perform no better than talking to any bright person.
If you read "Rational Choice in an Uncertain World" you will see that Dawes is pretty definitely on the level of "integrate Bayes into everyday life", not just Traditional Rationality. I don't know about the historical origins of evidence-based medicine, so it's possible that a bunch of Traditional Rationalists invented it; but one does get the impression that probability theorists trying to get people to listen to the research about the limits of their own minds, were involved.
After thinking on this for a while, here are my thoughts. This should probably be a new post but I don't want to start another whole chain of discussions on this issue.
I had the belief that many people on Less Wrong believed that our currently existing Art of Rationality was sufficient or close to sufficient to guarantee practical success or even to transform its practitioner into an ubermensch like John Galt. I'm no longer sure anyone believes this. If they do, they are wrong. If anyone right now claims they participate in Less Wrong solely out of a calculated program to maximize practical benefits and not because they like rationality, I think they are deluded.
Where x-rationality is defined as "formal, math-based rationality", there are many cases of x-rationality being used for good practical effect. I missed these because they look more like three percent annual gains in productivity than like Brennan discovering quantum gravity or Napoleon conquering Europe. For example, doctors can use evidence-based medicine to increase their cure rate.
The doctors who invented evidence-based medicine deserve our praise. Eliezer is willing to consider them x-rationalists. But th
The Eliezer of 2000 and earlier was not remotely capable of getting his job done.
Are you more or less capable of that now? Do you have evidence that you are? Is the job tangibly closer to being completed?
nerds, scientists, skeptics and the like who like to describe their membership in terms of rationality are [not] noticeably better than average at behavioral rationality, as opposed to epistemic rationality where they are obviously better than average but still just hideously bad.
Simply applying "ordinary rationality" to behavior is already extreme: people don't use reason to decide whether fashion is important, they just copy. Eliezer's Secret Identities post seems to make a very similar point, which largely matches this post. One point was to get rationality advice from people who actually found it useful, rather than from ordinary nerds who fetishize it.
An understanding of 'x-rationality' has helped me find the world a little less depressing and a little less frustrating. Previously, when observing world events, politics, and some behaviours in social interactions that seemed incomprehensible without assuming depressing levels of stupidity, incompetence, or malice, I despaired at the state of humanity. An appreciation of human biases and evolutionary psychology (some of which stems from an interest in both going back well before I ever started reading OB) gives me a framework in which to understand events in the world which I find a lot more productive and optimistic.
An example from politics: it is hard to make any rational sense of drug prohibition when looking at the evidence of the costs and benefits. This would tend to lead to an inevitable conclusion that politicians and the voting public are either irredeemably stupid or actively seeking negative outcomes. Understanding how institutional incentives to maintain the status quo, confirmation bias and signaling effects (politicians and voters needing to be 'seen to care' and/or 'seen to disapprove') can lead to basically intelligent and well meaning people maintaining catastrophical...
I’m partly echoing badger here, but it’s worth distinguishing between three possible claims:
(1) An “art of rationality” that we do not yet have, but that we could plausibly develop with experimentation, measurements, community, etc., can help people.
(2) The “art of rationality” that one can obtain by reading OB/LW and trying to really apply its contents to one’s life, can help people.
(3) The “art of rationality” that one is likely to accidentally obtain by reading articles about it, e.g. on OB/LW, and seeing what happens to rub off, can help people.
There are also different notions of “help people” that are worth distinguishing. I’ll share my anticipations for each separately. Yvain or others, tell me where your anticipations match or differ.
Regarding claim (3):
My impression is that even the art of rationality one obtains by reading articles about it for entertainment, does have some positive effects on the accuracy of peoples’ beliefs. A couple people reported leaving their religions. Many of us have probably discarded random political or other opinions that we had due to social signaling or happenstance. Yvain and others report “clarity-of-mind benefits”. I’d give reasonab...
By "decision", I don't mean the decision to get up in the morning, I mean the sort that's made on a conscious level and requires at least a few seconds' serious thought.
Consider yourself lucky if that doesn't describe getting up in the morning for you.
Anyway, not that this counts at all (availability bias), but I made a rational decision a couple of days ago to get some sleep instead of working later into the night on homework. I did exactly that.
In fact, I just made a rational decision-- just now-- to quit reading the article I was reading, work on homework for a few minutes and then go to bed. I haven't gotten to bed yet. Otherwise, that's going well.
...but you will disagree with me. And we are both aspiring rationalists, and therefore we resolve disagreements by experiments. I propose one.
I'm surprised you expected most of your readers to disagree. I think it's pretty clear that the techniques we work on here aren't making us much more successful than most people.
Humans aren't naturally well equipped to be extreme rationalists. The techniques themselves may be correct, but that doesn't mean we can realistically expect many people to apply them. To use the rationality-as-martial art metaphor, if you taught Shaolin kung fu to a population of fifty year old couch potatoes, they would not be able to perform most of the techniques correctly, and you should not expect to hear many true accounts of them winning fights with their skills.
Perhaps with enough work we could refine the art of human instrumental rationality into something much better than what we've got, maybe achieve a .3 correlation with success rather than a .1, but while a fighting style developed explicitly for 50 year old couch potatoes might give your class better results than other styles, you can only expect so much out of it.
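As a rough illustration of what those correlations would buy (a minimal sketch under an assumed Gaussian model; only the 0.1 and 0.3 figures come from the comment above):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # simulated population

def big_success_rate(r: float, cutoff_sigma: float = 2.0) -> float:
    """Fraction of the top-10%-rationality group whose 'success' exceeds
    cutoff_sigma, when success correlates with rationality at r."""
    rationality = rng.standard_normal(n)
    # Scale the noise so 'success' stays standard normal with correlation r.
    success = r * rationality + np.sqrt(1 - r ** 2) * rng.standard_normal(n)
    trained = rationality > np.quantile(rationality, 0.9)
    return float((success[trained] > cutoff_sigma).mean())

base = float((rng.standard_normal(n) > 2.0).mean())   # population base rate, ~2.3%
print(f"base rate of big success:        {base:.2%}")
print(f"top-decile rationality, r = 0.1: {big_success_rate(0.1):.2%}")
print(f"top-decile rationality, r = 0.3: {big_success_rate(0.3):.2%}")
```

Under this model, even landing in the top decile of rationality only nudges the odds of two-sigma success from roughly 2% to 3% at r = 0.1, and to around 6% at r = 0.3: better, but nothing like a guarantee of winning big.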
Sometimes, people do worse when they try to be rational because they have a poor model of rationality.
One error I commonly see is the belief that rationality means using logic, and that logic means not believing things unless they are proven. So someone tries to be "rational" by demanding proof of X before changing their behavior, even in a case where neither priors nor utilities favor not X. The untrained person may be doing something as naive as argument-counting (how many arguments in favor of X vs. not X), and is still likely to come out ahead of the person who requires proof.
A related error is using Boolean models where they are inappropriate. The most common error of this type is believing that a phenomenon, or a class of phenomena, can have only one explanation.
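A worked example of that first error, with invented numbers: even when no proof of X is available, priors and utilities can clearly favor acting on it.

```python
# Invented numbers throughout. X = some unproven but probable claim.
p_x = 0.7  # prior that X is true; no proof available either way

payoffs = {
    ("act",  True):  100,   # act as if X, and X turns out true
    ("act",  False): -50,   # act as if X, and X turns out false
    ("wait", True):    0,   # demand proof first: forgo the gain
    ("wait", False):   0,
}

def expected_utility(action: str) -> float:
    return p_x * payoffs[(action, True)] + (1 - p_x) * payoffs[(action, False)]

for action in ("act", "wait"):
    print(f"EU({action}) = {expected_utility(action):6.1f}")
# EU(act) = 0.7*100 + 0.3*(-50) = 55.0, EU(wait) = 0.0:
# the proof-demander leaves 55 units of expected value on the table.
```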
Here's one example of a change I've made recently, which I think qualifies as x-rationality. When I need to make a decision that depends on a particular piece of data, I now commit to a decision threshold before I look at the data. (I feel like I took this strategy from a LW article, but I don't remember where now.)
For example, I recently had to decide whether it would be worth the potential savings in time and money to commute by motorcycle instead of by car. I set a threshold for what I considered an appropriate level of risk beforehand, and then looked up the accident statistics. The actual risk turned out to be several times larger than that.
Had I looked at the data first, I would have been tempted to find an excuse to go with my gut anyway, which simply says that motorcycles are cool. (I'm a 23-year-old guy, after all.) A high percentage of motorcyclists experience a serious or even fatal accident, so there's a decent chance that x-rationality saved me from that.
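The procedure itself is simple enough to write down; the threshold and the looked-up statistic below are placeholders, not the actual figures from the decision described above:

```python
# Precommitment sketch: fix the decision rule before looking at the data,
# so the data can't be rationalized afterward. All figures are placeholders.

# Step 1 (before seeing any data): choose the maximum acceptable risk.
MAX_ACCEPTABLE_RISK = 1e-7        # e.g. fatalities per mile, picked in advance

# Step 2 (only afterwards): look up the statistic.
observed_risk = 5e-7              # placeholder for the looked-up accident data

# Step 3: apply the precommitted rule mechanically.
if observed_risk <= MAX_ACCEPTABLE_RISK:
    print("Within threshold: commute by motorcycle.")
else:
    print(f"Stick with the car: observed risk is "
          f"{observed_risk / MAX_ACCEPTABLE_RISK:.0f}x the precommitted threshold.")
```

The point of the design is that step 1 happens before step 2, so the "motorcycles are cool" gut feeling never gets a chance to move the goalposts.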
Am I the only one who isn't entirely positive towards the heavy use of language identifying the LW community as "rationalists", including terms like "rationalist training" etc.? (Though he is by far the heaviest user of this kind of language, I'm not really talking about Eliezer here; his language use is a whole topic on its own. I'm restricting this particular concern to other people, to the general LW non-Eliezer jargon.) Is strongly self-identifying as a "rationalist" really such a good thing? Does it really help you solve problems? (I second the questions raised by Yvain.) Though perhaps small, isn't there still a risk that the focus becomes too much on "being a rationalist" instead of on actually solving problems?
Of course, this is a blog about rationality and not about specific problems, so this kind of language is not surprising and sometimes might even be necessary. I'm just a bit hesitant towards it when the community hasn't actually shown that it's better at solving problems than people who don't self-identify as rationalists and haven't had "rationalist training", or shown that the techniques fostered here have such high cross-domain applicability as seems to be assumed. Maybe after it has been shown that "rationalists" do better than other people, people who just solve problems, I would feel better about this kind of jargon.
Would Newton have gone even further if he'd known Bayes theory? Probably it would've been like telling the world pool champion to try using more calculus in his shots: not a pretty sight.
An interesting choice of example, given that Bayesian probability theory as we know it (inverse inference) was more or less invented by Laplace and used to address specific astronomical controversies surrounding the introduction of Newton's Laws, having to do with combining multiple uncertain observations.
This experiment seems easy to rig4; merely doing it should increase your level of conscious rational decisions quite a bit. And yet I have been trying it for the past few days, and the results have not been pretty. .... [O]ne way to fail your Art is to expect more of it than it can deliver.... Perhaps there are developments of the Art of Rationality or its associated Arts that can turn us into a Kellhus or a Galt, but they will not be reached by trying to overcome biases really really hard.
To make a somewhat uncharitable paraphrase: you read many artic...
I accept that to some degree my results say more negative things about me than about rationality, but insofar as I may be typical we need to take them into account when considering how we're going to benefit from rationality.
...my inability to communicate clearly continues to be the bane of my existence. Let me try a strained metaphor.
Christianity demands its adherents "love thy enemy", "turn the other cheek", "judge not lest ye be judged", "give everything to the poor", and follow many other pieces of excellent moral advice. Any society that actually followed them all would be a very nice place to live.
Yet real-world Christian societies are not such nice places to live. And Christians say this is not because there is anything wrong with Christianity, but because Christians don't follow their religion enough. As the old saying goes, "Christianity has not been tried and found wanting, it has been found difficult and left untried." There's some truth to this.
But it doesn't excuse Christianity's failure to make people especially moral. If Christianity as it really exists can't translate its ideals into action, then it's gone wrong som...
Only a handful responded
I am reserving my judgment for a couple of years. See how I'm doing then.
I have yet to hear what anyone even means by "rationalism" or "rationalist," let alone "x-rationality." People often refer to the "techniques" or "Art of rationality" (a particularly irksome phrase), though as best I can tell, these consist of Bayes theorem and a half-dozen or so logical fallacies that were likely known since the time of Aristotle. Now, I've had an intuitive handle on Bayes theorem since learning of it in high school pre-calc, and spotting a logical fallacy isn't particularly tough for anyo...
If I was rational enough to pick only stocks that would go up, I'd become successful regardless of how little willpower I had, as long as it was enough to pick up the phone and call my broker.
Rationality is not enough to pick the right stocks. You also need the willpower to read the vast amount of material that would enable you to make that pick.
Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to "kicking" to our "punching". I'm not sure why he considers them to be the same Art rather than two related Arts.
Remember your post on haunted rationalists, and Eliezer’s reply about how it’s possible to successfully work to accept rational beliefs even with the not-so-conscious, not-so-verbal parts of oneself that might continue to believe in ghosts after one rationally understands the arguments against?
It sounds like maybe you mean “rationa...
Extreme rationality is for important decisions, not for choosing your breakfast cereal. Really important decisions - by which I mean those that you'd sleep on, and allocate more than ten minutes of thought - typically coincide with changes in habits and routine, which don't happen more often than once in several months. For more common decisions, we only have time and energy for ordinary rationality.
Practice creates facility. Facility lowers the bar to practice. Repeat. There is no time at which rationality may not be applied, and without practice at small things, how will you apply it to big things?
But besides, isn't it altogether just more fun to think clearly? When I notice myself not doing so, it is as painful as watching a beautiful machine labouring with leaking pipes and rust.
I don't keep fit just to catch trains or eke out a few more years from the meat.
Thank you for pressing me for concrete details.
Some of what follows goes way back before OB, which is one of various things I have studied or done -- a major one, but there are others -- on the matter of how to think better. The first, for example, I describe as inside vs. outside view, because that is what it is. The practice goes back longer; OB gave it a name.
I. Getting out of bed in the morning. That may seem a trifle, but there is no time at which rationality does not matter, and an hour a day is more than a trifle. The inside view whispers seductively to just laze on half-awake, or drift off to sleep again. The outside view reminds me that it has been my invariable experience that lazing on does not wake me up, that the only thing that does is getting up and moving around, and that twenty minutes after getting up (my typical boot time for both mind and body) I will be more satisfied with myself for having gotten up sooner.
The more clearly I can contemplate the outside view, the easier it becomes to make a move. I can't claim expert proficiency in this: I still get up much faster, the moment the wristwatch pinger goes off, when I have a specific three-alarm-clock reason.
II. I beg...
Yes, yes, yes, yes, yes. And also yes.
I had a similar reaction to the fictional rationalist initiation ceremony.
That said, on further consideration, I'm not sure the "Bayesian Conspiracy" has a choice, given its goals.
It's possible that, even though these sorts of policies do turn away perfectly competent rationalists, they are the only way to avoid ending up with a comfortable community of one-or-two-sigmas-above-the-mean rationalists rather than an ultra-elite x-rationality club that can bootstrap itself into the sort of excellence that we e...
If I was rational enough to pick only stocks that would go up, I'd become successful regardless of how little willpower I had, as long as it was enough to pick up the phone and call my broker.
The availability of investing does not disprove that akrasia is the complete explanation. Successful investing is rationality + financial education + a lot of work (Buffett is rumored to read an incredible number of accounting statements), and is hence subject to akrasia.
Better decisions are clearly one possible positive outcome of rationality training. But another significant positive outcome is reaching the same decision faster. In my work, there are a number of rationality techniques that I have learned that have not necessarily changed the end result I have come to, but that have contributed to me spending less time confused, and getting to the right result more quickly than I otherwise would have.
Anything that frees up time in this way has real, positive, and measurable effects on my life. (Also, confusion and things-not-working are frustrating and stressful, so the less time I spend confused, the better.)
I can't think of any people who started out merely above-average, developed an interest in x-rationality, and then became smart and successful because of that x-rationality.
I'm working on this.
In the spirit of concrete reductions, I have a question for everyone here:
Let's say we took a random but very large sample of students from prestigious colleges, split them into two groups, and made Group A take a year-long class based on Overcoming Bias, in which students read the posts and then (intelligent, engaging) professors explained anything the students didn't understand. Wherever a specific technique was mentioned, students were asked to try that technique as homework.
Group B took a placebo statistics class similar to every other college statisti...
The most effective way for you to internally understand the world and make good decisions is to be super rational. However, the most effective way to get other people to aid you on your quest for success is to practice the dark arts. The degree to which the latter matters is determined by the mean rationality of the people you need to draw support from, and how important this support is for your particular ambitions.
I strongly suspect that it is unreasonable to expect people to actively apply x-rationality on a frequent, conscious basis--to do so would be to fight against human cognitive architecture, and that won't end well.
Most of our decisions are subconscious. We won't be changing this. The place of x-rationality is not to make on-the-spot decisions, it's to provide a sanity check on those decisions and, as necessary, retrain the subconscious decision making processes to better approximate rationality.
I think you are right that x-rationality doesn't help an individual win much on a day to day basis. But there are some very important challenges that humanity as a whole is failing for lack of x-rationality.
The current depression. The fact that we aren't adequately protecting the earth from asteroids. DDT being banned. Nobody getting frozen. Religion. First-past-the-post elections. Most wars.
I think one reason might be that the vast majority of the decisions we make are not going to make a significant difference as to our overall success by themselves; or rather, not as significant a difference as chance or other factors (e.g., native talent) could. For example, take the example about not buying into a snake-oil health product lessdazed uses above: you've benefited from your rationality, but it's still small potatoes compared to the amount of benefit you could get from being in the right place at the right time and becoming a pop star... or ge...
[W]e should generally expect more people to claim benefits than to actually experience them.
I don't think this claim is supported. There are reasons (some presented) why we should expect this. There are also reasons (a few listed below) why we should expect the opposite. I don't see at all why we should expect either set to dominate.
Reasons I might not post a benefit I've accrued:
1) I'm too busy out enjoying my improved life.
2) The self-congratulatory thread smells too much of an affective death spiral.
3) I am unsure how much of the benefit was act...
study evolutionary psychology in some depth, which has been useful in social situations
Could you elaborate on this?
I doubt that it directly told you anything useful, but it was more likely helpful in telling you to pay attention and not to interpret things through your usual beliefs.
X-Rationality can help you succeed. But so can excellent fashion sense. It's not clear in real-world terms that x-rationality has more of an effect than fashion. And don't dismiss that with "A good x-rationalist will know if fashion is important, and study fashion." A good normal rationalist could do that too; it's not a specific advantage of x-rationalism, just of having a general rational outlook.
Yet many highly intelligent people with normal rationality have terrible fashion sense, particularly males, at least in my anecdotal experience. Di...
In the case of Hubbard, preaching irrationality and being irrational are different things. Hubbard went genuinely crazy in his later years, but he knew what he was doing when he invented Scientology. He even said in an interview once, "I'm tired of writing for a penny a page. If a man really wanted to make a million dollars, he would invent a religion."
Reading OB/LW forced me to look hard at my contradictory beliefs about politics, and admit that I no longer believed certain things I used to believe, particularly about the market. If I don't get anything else out of it, that alone would be a large bonus.
Even before reading, I had formulated an explanation for myself: "all people who are too stupid not to jump off the roof will simply die out". Market mechanisms and natural selection will remove all the really destructive consequences of everyday stupidity available to them and will collect all the low-hanging fruit, so the study of rationality will help only on issues that are rare or individually weak in their negative consequences. On average, rationalists will be more successful than non-rationalists, but differences between individuals will be greater than differences betwe...
I find my personal experience in accord with the evidence from Vladimir's thread. I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines1, I can't think of any.
Well, it did ultimately help you make SlateStarCodex and Astral Codex Ten successful, which provided a haven for non-extremist thought to thousands of people. And since the latter earned hundreds of thousands in annual revenue, you were able to create the ACX grants program that will probably make the world a...
1: Specifically, reading Overcoming Bias convinced me to study evolutionary psychology in some depth, which has been useful in social situations. As far as I know. I'd probably be biased into thinking it had been even if it hadn't, because I like evo psych and it's very hard to measure.
Oooh! I realize this is an old post, but I'm desperately curious for some concrete examples of this.
"Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to "kicking" to our "punching". I'm not sure why he considers them to be the same Art rather than two related Arts."
I don't understand the implications of seeing it as part of the same art or a different art altogether.
as robin has pointed out on numerous occasions, in many situations it is in our best interest to believe, or profess to believe, things that are false. because we cannot deceive others very well, and because we are penalized for lying about our beliefs, it is often in our best interest to not know how to believe things more likely to be true. refusing to believe popular lies forces you to either lie continually or to constantly risk your relative status within a potentially useful affiliative network by professing contrarian beliefs or, almost as bad, no b...
May I humbly suggest changing the title to "Extreme Rationality: It's Not That Great"? (This will not break any links!)
It actually just occurred to me that the intelligence professions might benefit greatly from some x-rationality. We may not have to derive gravity from an apple, but the closer we come to that ideal, the less likely failures of intelligence become.
Intelligence professionals are constantly engaged in a very Bayesian activity, incorporating new data into estimates of probabilities and patterns. An ideal Bayesian would be a fantastic analyst.
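For concreteness, here is the textbook update such an idealized analyst would run; the hypothesis and likelihoods are invented for illustration:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H|E) from the prior P(H) and the two likelihoods."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Invented example: H = "the facility is a weapons site".
p = 0.10  # analyst's prior
evidence_stream = [
    ("unusual truck traffic",   0.6, 0.2),
    ("new perimeter fence",     0.5, 0.3),
    ("routine civilian visits", 0.1, 0.4),  # this one cuts against H
]
for name, p_eh, p_enh in evidence_stream:
    p = bayes_update(p, p_eh, p_enh)
    print(f"after {name!r}: P(H) = {p:.2f}")
```

Note that the third item correctly pulls the estimate back down; the discipline is in letting disconfirming evidence count just as mechanically as confirming evidence.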
If people typically found great personal benefits from reading OB/LW type material, then we would not be such a minority.
We hope that rationality is increasing, and it could be, but I don't have much confidence that 30 years from now people, even people in positions of power, will be much more rational than they are now.
2: Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to "kicking" to our "punching". I'm not sure why he considers them to be the same Art rather than two related Arts.
Winning.
Your post is a great improvement on mine. Thanks, esp. for the "limiting factor" riff.
Am I alone in thinking the word "akrasia" doesn't quite describe our problem? Isn't it more like "apathy"? Some people wish to be able to do the things they want; lucky them! Me, I just wish to want to do the things I'm able to do.
I will list the only example that comes to my mind: better x-rationality techniques have actually helped me get my university diploma. More than once they got me out of a difficult situation, where I used what I knew of heuristics, biases, and the limits and usual mistakes of normal rationality (how one can sound rational regardless of whether he really is) to give off, at little cost, that impressive aura of someone who knows what he's doing. To sound rational when facing an audience.
In my defense, I actually faked the cues and tells of my rationality, skill...
...but you will disagree with me. And we are both aspiring rationalists, and therefore we resolve disagreements by experiments.
Your suggested experiment is good. But in this particular case, let's also try to employ the power of positivist thinking on your thesis as a whole. That is, let's break it up into a bunch of specific anticipations, and see what parts there is and isn't disagreement on, before we try to resolve those disagreements. I'll take my own stab at this with a number of short comments in a moment.
For me, the core of the rationality project is something like a determination to make your beliefs completely subservient to reality, at all costs, fighting your natural instincts for defending your beliefs, trying to win debates, etc. Not trusting your beliefs just because they are yours. Approaching the most controversial and divisive subjects with a curiosity mindset. Looking forward to changing your mind.
Most "normally rational" people can do this in technical and scientific matters, but in other domains, such as politics, philosophy, society, economic...
I think the problem with practising rationality as on Less Wrong is that people end up not performing rational actions and strategies whose rationale they have not understood or had explained to them (usually people pick those up from their environment without the explanation attached). Furthermore, intelligence (e.g. the ability to search a big solution space for solutions) is a key requirement as well, and intelligence is hard to improve with training, especially for already well-trained individuals.
I want to master x-rationality because I want to teach it. I value rational behavior in my fellow human because the historical record is clear: rational behavior is correlated with increased safety, health, and wealth of a society. I want to live in an increasingly safe, healthy, and wealthy society. I understand that "rational" behavior has a saturating plateau, or that it is only so effective, but the masters of rationality must continue to exist in every society, scientific skeptics must be cultivated. I enjoy working with the rational art...
5: In which case it will have ceased to be an experiment and become a technique instead. I've noticed this happening a lot over the past few days, and I may continue doing it.
It would still be an experiment, just a test of x-rationality (including the new technique) rather than a test of x-rationality (excluding the new technique)
And why would you want to test a version of x-rationality less than the best you have?
"techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training."
Here is my definition; I attempted to rely less on CapitalNames in it.
X-Rationality:
Ability to behave according to the dictates of rationality in situations where such behavior would be highly discomforting/counter-intuitive.

As an aside, the discomforting part provides an escape hatch for people not to be rational, because they can always claim a high utility value for being emotionally comfortable.
X-rationality is the kind you do with math, and humans are crap at casual math, so no surprise it becomes a weapon of last resort. (We ought to be using a damn sight more math for more or less everything - the fact our cognitive architecture doesn't support it will not persuade the universe to let us off lightly).
(Edit: I removed the second half of this comment because if after a day of thinking I can't pin down what I thought I was referring to, then I'm talking nonsense. Sorry. Next time: engage brain, open mouth, in that order.)
As far as I can tell, epistemic and instrumental rationality are two related Arts, even to EY, but are both under the banner of "Rationality" because they both work towards the same goal, that of optimal thinking (I can't cite any specific examples right now, but I'll throw it out there anyway).
Also, another reason for the comparative inefficiency of x-rationality could be lack of information. Epistemic rationality is the Art of filtering/modifying information for greater accuracy. Instrumental rationality is the Art of using all available inform...
We simply don't have the time and computing power to use full rigor on our individual decisions, so we need an alternative strategy. As I understand it, the human brain operates largely on caching. X-rationality allows us to clean and maintain our caches more thoroughly than does traditional rationality. At first, it seems reasonable to expect this to yield higher success rates.
However, our intuition vastly underestimates the size of our personal caches. Furthermore, traditional rationality is simply faster at cleaning, even if it leaves a lot of junk behi...
You say that rationality only slightly correlates with winning. I think that's because incremental increases in rationality don't necessarily lead to incremental increases in winning. Winning is governed by lots of factors, and sometimes you have to get over a critical threshold of rationality to see the results you want.
I am curious about a finer distinction of what you expect to see as evidence of instrumental benefits of x-rationality:
"Extreme rationality" is good - but "x-rationality" is pretty jargon-dense.
Yes, it's shorter - but x-sports and x-programming are not common for a good reason - nobody will know what you are talking about. I recommend caution with the use of "x-rationality".
...And this is why I am not so impressed by Eliezer's claim that an x-rationality instructor should be successful in their non-rationality life. Yes, there probably are some x-rationalists who will also be successful people. But again, correlation 0.1. Stop saying only practically successful people could be good x-rationality teachers! Stop saying we need to start having huge real-life victories or our art is useless! Stop calling x-rationality the Art of Winning! Stop saying I must be engaged in some sort of weird signalling effort for saying I'm here becau
he could have explained that the kids were doing some very impressive mathematics on a subconscious level
But he would have been wrong. It really is in the arm more than in the head.
EDIT: Okay, there's evidence it's largely in the brain. But talking about a 'subconscious level' isn't helpful.
A guy takes some golf lessons. Convinced he's got the mechanics of the swing down, he takes on a pro at a golf course, and has his ass handed to him. "Those golf lessons did me no good", he says. "Do golf lessons even correlate with being good at the sport?".
Check out Gladwell's new book - Outliers. Our success cannot be attributed to our individuality to the degree that most Americans think it can. There are huge cultural influences, arbitrary society rules, birth year, etc... There's a chapter on why high IQ only matters up to a certain point. Once you're "intelligent enough", practical wisdom takes over in determining success. I don't think akrasia has that much to do with it. We live in a world of lower intelligence and have to play by those rules. It pays to be ONE step ahead of the mob,...
Related to: Individual Rationality is a Matter of Life and Death, The Benefits of Rationality, Rationality is Systematized Winning
But I finally snapped after reading: Mandatory Secret Identities
Okay, the title was for shock value. Rationality is pretty great. Just not quite as great as everyone here seems to think it is.
For this post, I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training." It seems pretty uncontroversial that there are massive benefits from going from a completely irrational moron to the average intelligent person's level. I'm coining this new term so there's no temptation to confuse x-rationality with normal, lower-level rationality.
And for this post, I use "benefits" or "practical benefits" to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.
So, what are these "benefits" of "x-rationality"?
A while back, Vladimir Nesov asked exactly that, and made a thread for people to list all of the positive effects x-rationality had on their lives. Only a handful responded, and most responses weren't very practical. Anna Salamon, one of the few people to give a really impressive list of benefits, wrote:
There have since been a few more people claiming practical benefits from x-rationality, but we should generally expect more people to claim benefits than to actually experience them. Anna mentions the placebo effect, and to that I would add cognitive dissonance - people spent all this time learning x-rationality, so it MUST have helped them! - and the same sort of confirmation bias that makes Christians swear that their prayers really work.
I find my personal experience in accord with the evidence from Vladimir's thread. I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines1, I can't think of any.
Looking over history, I do not find any tendency for successful people to have made a formal study of x-rationality. This isn't entirely fair, because the discipline has expanded vastly over the past fifty years, but the basics - syllogisms, fallacies, and the like - have been around much longer. The few groups who made a concerted effort to study x-rationality didn't shoot off an unusual number of geniuses - the Korzybskians are a good example. In fact, as far as I know, the only follower of Korzybski to turn his ideas into a vast personal empire of fame and fortune was (ironically!) L. Ron Hubbard, who took the basic concept of techniques to purge confusions from the mind, replaced the substance with a bunch of attractive flim-flam, and founded Scientology. And like Hubbard's superstar followers, many of this century's most successful people have been notably irrational.
There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal evidence against it. The evidence in favor of the proposition right now seems to be its sheer obviousness. Rationality is the study of knowing the truth and making good decisions. How the heck could knowing more than everyone else and making better decisions than them not make you more successful?!?
This is a difficult question, but I think it has an answer. A complex, multifactorial answer, but an answer.
One factor we have to once again come back to is akrasia2. I find akrasia in myself and others to be the most important limiting factor to our success. Think of that phrase "limiting factor" formally, the way you'd think of the limiting reagent in chemistry. When there's a limiting reagent, it doesn't matter how much more of the other reagents you add, the reaction's not going to make any more product. Rational decisions are practically useless without the willpower to carry them out. If our limiting reagent is willpower and not rationality, throwing truckloads of rationality into our brains isn't going to increase success very much.
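To make the limiting-reagent analogy concrete, here is a toy sketch (mine, not the post's; the min() functional form and the numbers are purely illustrative): if output is capped by whichever input is scarcer, piling more rationality on top of a willpower bottleneck changes nothing.

```python
# Toy model of willpower as a "limiting reagent" (illustrative only).
def success(willpower: float, rationality: float) -> float:
    """Output is capped by the scarcer of the two inputs."""
    return min(willpower, rationality)

# With willpower stuck at 2, extra rationality adds nothing:
for r in (2, 5, 10, 100):
    print(f"rationality={r:>3} -> success={success(2, r)}")
# rationality=  2 -> success=2
# rationality=  5 -> success=2
# rationality= 10 -> success=2
# rationality=100 -> success=2
```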
This is a very large part of the story, but not the whole story. If I were rational enough to pick only stocks that would go up, I'd become successful regardless of how little willpower I had, as long as it was enough to pick up the phone and call my broker.
So the second factor is that most people are rational enough for their own purposes. Oh, they go on wild flights of fancy when discussing politics or religion or philosophy, but when it comes to business they suddenly become cold and calculating. This relates to Robin Hanson on Near and Far modes of thinking. Near Mode thinking is actually pretty good at a lot of things, and Near Mode thinking is the thinking whose accuracy gives us practical benefits.
And - when I was young, I used to watch The Journey of Allen Strange on Nickelodeon. It was a children's show about this alien who came to Earth and lived with these kids. I remember one scene where Allen the Alien was watching the kids play pool. "That's amazing," Allen told them. "I could never calculate differential equations in my head that quickly." The kids had to convince him that "it's in the arm, not the head" - that even though the movement of the balls is governed by differential equations, humans don't actually calculate the equations each time they play. They just move their arm in a way that feels right. If Allen had been smarter, he could have explained that the kids were doing some very impressive mathematics on a subconscious level that produced their arm's perception of "feeling right". But the kids' point still stands; even though in theory explicit mathematics will produce better results than eyeballing it, in practice you can't become a good pool player just by studying calculus.
A lot of human rationality follows the same pattern. Isaac Newton is frequently named as a guy who knew no formal theories of science or rationality, who was hopelessly irrational in his philosophical beliefs and his personal life, but who is still widely and justifiably considered the greatest scientist who ever lived. Would Newton have gone even further if he'd known Bayesian theory? Probably it would've been like telling the world pool champion to try using more calculus in his shots: not a pretty sight.
Yes, yes, beisutsukai should be able to develop quantum gravity in a month and so on. But until someone on Less Wrong actually goes and does it, that story sounds a lot like when Alfred Korzybski claimed that World War Two could have been prevented if everyone had just used more General Semantics.
And then there's just plain noise. Your success in the world depends on things ranging from your hairstyle to your height to your social skills to your IQ score to cognitive constructs psychologists don't even have names for yet. X-Rationality can help you succeed. But so can excellent fashion sense. It's not clear in real-world terms that x-rationality has more of an effect than fashion. And don't dismiss that with "A good x-rationalist will know if fashion is important, and study fashion." A good normal rationalist could do that too; it's not a specific advantage of x-rationalism, just of having a general rational outlook. And having a general rational outlook, as I mentioned before, is limited in its effectiveness by poor application and akrasia.
I no longer believe mastering all these Overcoming Bias and Less Wrong techniques will turn me into Anasûrimbor Kellhus or John Galt. I no longer even believe mastering all these Overcoming Bias techniques will turn me into Eliezer Yudkowsky (who, as his writings from 2001 indicate, had developed his characteristic level of awesomeness before he became interested in x-rationality at all)3. I think it may help me succeed in life a little, but I think the correlation between x-rationality and success is probably closer to 0.1 than to 1. Maybe 0.2 in some businesses like finance, but people in finance tend to know this and use specially developed x-rationalist techniques on the job already without making it a lifestyle commitment. I think it was primarily a Happy Death Spiral around how wonderfully super-awesome x-rationality was that made me once think otherwise.
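For a sense of scale (my own back-of-the-envelope sketch, not anything from the post): a correlation of 0.1 means an r-squared of 0.01, so x-rationality would explain only about 1% of the variance in success. A quick simulation of what that looks like, with illustrative numbers:

```python
# What "correlation 0.1" between x-rationality and success could look
# like. Illustrative numbers only; statistics.correlation needs
# Python 3.10+.
import random
import statistics

random.seed(0)
n = 100_000
rationality = [random.gauss(0, 1) for _ in range(n)]
# success = a small rationality term plus lots of independent noise
success = [0.1 * r + random.gauss(0, 0.995) for r in rationality]

print(f"correlation ~ {statistics.correlation(rationality, success):.2f}")

# Among the top 1% of successes, how rational is the average person?
top = sorted(range(n), key=lambda i: success[i], reverse=True)[: n // 100]
top_mean = statistics.mean(rationality[i] for i in top)
print(f"mean rationality of top 1%: {top_mean:.2f} sd above average")
```

Under these toy assumptions, even the top 1% of performers come out only about a quarter of a standard deviation more rational than average - being a super achiever is mostly the other factors.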
And this is why I am not so impressed by Eliezer's claim that an x-rationality instructor should be successful in their non-rationality life. Yes, there probably are some x-rationalists who will also be successful people. But again, correlation 0.1. Stop saying only practically successful people could be good x-rationality teachers! Stop saying we need to start having huge real-life victories or our art is useless! Stop calling x-rationality the Art of Winning! Stop saying I must be engaged in some sort of weird signalling effort for saying I'm here because I like mental clarity instead of because I want to be the next Bill Gates! It trivializes the very virtues that brought most of us to Overcoming Bias, and replaces them with what sounds a lot like a pitch for some weird self-help cult...
...
...
...but you will disagree with me. And we are both aspiring rationalists, and therefore we resolve disagreements by experiments. I propose one.
For the next time period - a week, a month, whatever - take special note of every decision you make. By "decision", I don't mean the decision to get up in the morning, I mean the sort that's made on a conscious level and requires at least a few seconds' serious thought. Make a tick mark, literal or mental, so you can count how many of these there are.
Then note whether you make that decision rationally. If yes, also record whether you made that decision x-rationally. I don't just mean you spent a brief second thinking about whether any biases might have affected your choice. I mean one where you think there's a serious (let's arbitrarily say 33%) chance that using x-rationality instead of normal rationality actually changed the result of your decision.
Finally, note whether, once you came to the rational conclusion, you actually followed it. This is not a trivial matter. For example, before writing this blog post I wondered briefly whether I should use the time studying instead, used normal (but not x-) rationality to determine that yes, I should, and then proceeded to write this anyway. And if you get that far, note whether your x-rational decisions tend to turn out particularly well.
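If you'd rather keep the tally in something sturdier than literal or mental tick marks, a minimal sketch of the bookkeeping might look like this (the stage names are my own shorthand, not part of the proposed experiment):

```python
# Minimal tally for the proposed experiment. A decision is only scored
# for x-rationality or follow-through if it was made rationally at all.
# Stage names are my own shorthand, not from the post.
from collections import Counter

log = Counter()

def record(rational=False, x_rational=False, followed=False):
    """Tick off one conscious decision and how far down the chain it got."""
    log["decisions"] += 1
    if rational:
        log["rational"] += 1
        if x_rational:
            log["x_rational"] += 1
        if followed:
            log["followed"] += 1

# e.g. rationally concluding you should study instead of blogging,
# then writing the blog post anyway:
record(rational=True, x_rational=False, followed=False)
print(dict(log))  # {'decisions': 1, 'rational': 1}
```

Note that the nesting mirrors the procedure above: x-rationality and follow-through are only assessed for decisions that reached the rational-conclusion stage in the first place.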
This experiment seems easy to rig4; merely doing it should increase your level of conscious rational decisions quite a bit. And yet I have been trying it for the past few days, and the results have not been pretty. Not pretty at all. Not only do I make fewer conscious decisions than I thought, but I rarely apply even the slightest modicum of rationality to the ones I do make; when I do apply rationality, it's practically never x-rationality; and when I apply everything I've got, I don't seem to follow those decisions too consistently.
I'm not so great a rationalist anyway, and I may be especially bad at this. So I'm interested in hearing how different your results are. Just don't rig it. If you find yourself using x-rationality twenty times more often than you were when you weren't performing the experiment, you're rigging it, consciously or otherwise5.
Eliezer writes:
Yet one way to fail your Art is to expect more of it than it can deliver. No matter how good a swimmer you are, you will not be able to cross the Pacific.
This is not to say crossing the Pacific is impossible. It just means it will require a different sort of thinking than the one you've been using thus far. Perhaps there are developments of the Art of Rationality or its associated Arts that can turn us into a Kellhus or a Galt, but they will not be reached by trying to overcome biases really really hard.
Footnotes:
1: Specifically, reading Overcoming Bias convinced me to study evolutionary psychology in some depth, which has been useful in social situations. As far as I know. I'd probably be biased into thinking it had been even if it hadn't, because I like evo psych and it's very hard to measure.
2: Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to "kicking" to our "punching". I'm not sure why he considers them to be the same Art rather than two related Arts.
3: This is actually an important point. I think there are probably quite a few smart, successful people who develop an interest in x-rationality, but I can't think of any people who started out merely above-average, developed an interest in x-rationality, and then became smart and successful because of that x-rationality.
4: This is a badly controlled experiment, and the only way its data can be meaningfully interpreted at all is through what one of my professors called the "ocular trauma test" - when the data hits you between the eyes. If people claim they always follow their rational decisions, I think I will be more likely to interpret it as a lack of enough cognitive self-consciousness to notice when they're doing something irrational than as an honest lack of irrationality.
5: In which case it will have ceased to be an experiment and become a technique instead. I've noticed this happening a lot over the past few days, and I may continue doing it.