If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.

You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.

Take their famous anchoring experiment, in which they showed that the spin of a roulette wheel affected people's estimates of how many African countries are in the UN. The idea wasn't that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists claiming we're just rational maximizers.

Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the stuff done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)

But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.

Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal of his than he has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.

No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."

So if you're interested in closing the gap, it seems like the skills to prioritize aren't things like the commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", "when you're trying something new and risky, read the For Dummies book about it first", etc. For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive, and "biases" is kind of a negative way of looking at it).

What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.

70 comments

Basically, the problem is that K&T-style insights about cognitive biases -- and, by extension, the whole OB/LW folklore that has arisen around them -- are useless for pretty much any question of practical importance. This is true both with regards to personal success and accomplishment (a.k.a. "instrumental rationality") and pure intellectual curiosity (a.k.a. "epistemic rationality").

From the point of view of a human being, the really important questions are worlds apart from anything touched by these neat academic categorizations of biases. Whom should I trust? What rules are safe to break? What rules am I in fact expected to break? When do social institutions work as advertised, and when is there in fact conniving and off-the-record tacit understanding that I'm unaware of? What do other people really think about me? For pretty much anything that really matters, the important biases are those that you have about questions of this sort -- and knowing about the artificial lab scenarios where anchoring, conjunction fallacies, etc. are observable won't give you any advantage there.

Note that this applies to your biases about abstract intellectual topics just ...

I really like this post. Could you make the link go both ways?
That said, I think you are overstating your case.
Also, if you figure out what local social norms are and that the stories are BS, you can accommodate the norms and ignore the stories internally. You can also optimize separate internal stories and external ones, or alternatively, drop out of the official story entirely and just be some guy who hangs around and is fun to talk to and mysteriously seems to always have enough money for his needs (the secret being largely that one's needs turn out to be very cheap to fulfill, even extravagantly, if optimized for directly, and money is likewise easy to get if optimized for directly). If you aren't dependent on others, don't compete, don't make demands, and are helpful and pleasant, you can get away with not conforming.

Vladimir_M · 12y · 3 points
Sure.
[anonymous] · 12y · 0 points
If this isn't a joke, how does it balance VM's overstatement?
MichaelVassar · 12y · -1 point
It's an alternative to having a well-calibrated bias towards conformity.
CarlShulman · 12y · 9 points
I agree for most topics, but there are applied cases of clear importance. Investment behavior provides particularly concrete and rich examples, which are a major focus for the K&T school, and "libertarian paternalists" inspired by them: index funds as preferable to overconfident trading by investors, setting defaults of employee investment plans to "save and invest" rather than "nothing," and so forth. Now, you can get these insights packaged with financial advice in books and the like, and I think that tends to be more useful than general study of biases, but the insights are nonetheless important to the tune of tens or hundreds of thousands of dollars over a lifetime.
Will_Newsome · 12y · 7 points
Worse than useless: they give the illusion of insight. (And I feel like many comments on this post are sort of exemplary of that problem—as you put it in a different context, the equivalent of magic healing crystals are being talked about in a frighteningly serious manner.)
whowhowho · 11y · 0 points
By Django, that needed saying!

Here are some tentative guesses about this whole rationality and success business.

Let's set aside "rationality" for a minute and talk about mental habits. Everyone seems to agree that having the right habits is key to success, perhaps most famously the author of 7 Habits of Highly Effective People. But if you look at the 7 habits that Covey identifies ("Be Proactive", "Begin with the End in Mind", "Put First Things First", "Think Win/Win", "Seek First to Understand, Then Be Understood", "Synergize", and "Sharpen the Saw") they don't look too much like what gets discussed on Less Wrong. So what gives?

I think part of the problem is the standard pattern-matching trap. Perhaps books like Covey's genuinely do address the factors that the vast majority of people need to work on in order to be more successful. But analytical folks tend not to read these books because

  • they're part of a genre that's sullied its reputation by overpromising
  • even when they don't overpromise, analytical people are rarely part of the target audience, and the books do things like give incorrect folk explanations for stuff tha
...

I really enjoyed The Seven Habits of Highly Effective People. (By contrast, I tried reading some @pjeby stuff yesterday and it had all the problems you describe cranked up to 11 and I found it incredibly difficult to keep reading.)

I don't think the selection bias thing would be a problem if the community was focused on high-priority instrumental rationality techniques, since at any level of effectiveness becoming more effective should be a reasonably high priority. (By contrast, if the community is focused on low-priority techniques it's not that big a deal (that was my attitude toward OvercomingBias at the beginning) and when it gets focused on stuff like cryo/MWI/FAI I find that an active turnoff.)

I think there's a decent chance epistemic rationality, ceteris paribus, makes you less likely to be traditionally successful. My general impression from talking to very successful people is that very few of them are any good at figuring out what's true; indeed, they often seem to have set up elaborate defense mechanisms to make sure no one accidentally tells them the truth.

pjeby · 12y · 5 points
Technically, John was describing the problems of analytical readers, rather than the problems of self-help writers. ;-) I have noticed, though, that some of my early writing (e.g. 2010 and before) is very polarizing in style: people tend to either love it or hate it, and the "hate it" contingent seems larger on LW than anywhere else. However, most of the people who've previously said on LW that they hate my writing, seemed to enjoy this LW post, so you may find something of use there.

It's the ...

INFOMERCIAL STYLE!

... of formatting. Doesn't work for everyone ;-)

timtyler · 12y · 4 points
Of course, instrumental rationality is not a perfect predictor of success either. There are always stochastic factors with the potential to lead to bad outcomes. How strong a predictor it is depends on the size of such factors.

Eliezer noted (in a comment on a blog post I made about Nonprofit Kit For Dummies) that he did in fact buy the book and try to apply it. This suggests the difference was in fact Luke, and that we need How To Be Lukeprog For Dummies, which he is of course posting piecemeal ;-)

Eliezer's comment doesn't say he tried to apply the lessons in Nonprofit Kit for Dummies, though some of it he clearly did — e.g. filing the necessary paperwork to launch a 501c3!

Anyway, reading a how-to book doesn't help much unless you actually do what the book recommends. That's why it's such an important intervention to figure out How To Actually Do The Stuff You Know You Should Be Doing — also known as How to Beat Procrastination.

But the anti-akrasia techniques we've uncovered so far don't work for everyone, and there are other factors at play. For example, since a young age Eliezer has become cognitively exhausted rather quickly. He has spent years trying different things (diet, exercise, context changes, vitamins, etc.) but still hasn't found an intervention that lets him do cognitive work for as long as I can. (Luckily, the value of an hour of cognitive work from Eliezer is much higher than the value of an hour of cognitive work from me.)

Also, there was no time in history when it made sense for Eliezer Yudkowsky to spend his time doing Nonprofit Kit for Dummies stuff. (But it would have made sense, I think, for Eliezer to try harder to find someone who could do non-profit m...

Pablo · 12y · 8 points
I'm confused. You seem to be suggesting that procrastination is one of the main "biases" we need to overcome (or, as I would put it, that the ability to beat procrastination is one of the main "practical skills" we need to develop). But aaronsw disagrees that this is what you yourself believe: "As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, 'it was a gap in general rationality.'" (emphasis added) Could you clarify?

But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude the issue is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences.

I very often conclude that people are suffering from the planning fallacy.

Not falling prey to the planning fallacy is the most obvious and quantifiable result from applying rationality techniques in my day to day life.

Decius · 12y · 6 points
How often is that the reason that they aren't making progress toward their goals?
Dreaded_Anomaly · 12y · 9 points
Very often. Any project that goes "over budget" - that's the planning fallacy. On a smaller scale, any meeting which goes too long or has too many scheduled presentations (90% of the meetings I've attended) - that's the planning fallacy. The people who plan meetings or budget projects are aiming for the meetings to end on time and the projects to be completed within their budgets, but they're not meeting those goals.
Decius · 12y · 2 points
So... if there is a 10% chance that there will be a 25% cost overrun, and a 90% chance that the unexpected expenses will fall within the contingency budget, should the budget be 125% of projected cost, or 102.5% of projected cost? If there are 6 items on the agenda, and a 95% chance that each of them will take 5 minutes but a 5% chance that they will take 15 minutes, how long should the meeting be scheduled for? Keep in mind that meetings will expand to fill the allocated time, even if they are completed before then, and projects will tend to use their entire budget if possible.

Granted, some people budget without a 'contingency' line item, but budgeting for the expected serious cost increase doesn't significantly reduce the odds of going over budget, because the frequency of a serious overrun is so low that the expected cost is much smaller than the actual cost should one occur. Expecting all projects to complete on time and within budget, now that IS a planning fallacy.
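For concreteness, here is a quick sketch of the expected-value arithmetic behind these two questions, using only the numbers above (Python, purely illustrative):

```python
# Expected-value arithmetic for the two examples above.
# All probabilities and costs are the ones given in the comment.

# Budget example: 90% of the time overruns stay inside the contingency line,
# 10% of the time there is a 25% cost overrun.
p_overrun = 0.10
overrun_fraction = 0.25
expected_budget = 1.0 + p_overrun * overrun_fraction   # 1.025 -> 102.5% of projected cost
worst_case_budget = 1.0 + overrun_fraction             # 1.25  -> 125% of projected cost

# Meeting example: 6 agenda items, each taking 5 minutes with 95% probability
# and 15 minutes with 5% probability.
items = 6
expected_item_minutes = 0.95 * 5 + 0.05 * 15              # 5.5 minutes per item
expected_meeting_minutes = items * expected_item_minutes  # 33 minutes

# Chance that at least one item runs long (so a 30-minute slot overruns):
p_at_least_one_long = 1 - 0.95 ** items                   # about 0.26

print(expected_budget, worst_case_budget)
print(expected_meeting_minutes, p_at_least_one_long)
```

The point it makes concrete is that the expected budget (102.5%) and the worst case (125%) are very different numbers, and a meeting scheduled for its expected length will still overrun about a quarter of the time.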
handoflixue · 12y · 9 points
Which is entirely the wrong way to go about the problem. If this project is critical, and its failure will sink the company, you really, really want to be in a position to handle the 25% cost overrun. If you have ten other identically-sized, identically-important projects, then the 102.5% estimate is probably going to give you enough of a contingency to handle any one of them going over budget (but what is your plan if two go over budget?)

Thinking in terms of statistics, without any actual details attached, is one of the BIG failure modes I see from rationalists - and one that laypeople seem to avoid just fine, because to them the important thing is that Project X will make or break the company.

I'd suggest that this is a solvable problem - I've worked in multiple offices where meetings routinely ended early. Having everyone stand helps a lot. So does making them a quick and daily occurrence (it becomes routine to show up on time). So does having a meeting leader who keeps things on-topic, understands when an issue needs to be "taken offline" or researched and brought up the next day, etc.
sakranut · 12y · 6 points
So, to refine Decius' formula from above, you'd want to add in a variable which represents expected marginal utility of costs. I don't think the problem here is thinking in terms of statistics; I think that the problem is attempting to use a simple model for a complicated decision. [edited for grammar]
handoflixue · 12y · 0 points
Both geeks and laypeople seem to use overly simple models, but (in my experience) they simplify in DIFFERENT ways: geeks/"rationalists" seem to over-emphasize numbers, and laypeople seem to under-emphasize them. Geeks focus on hard data, while laypeople focus on intuition and common sense.
fubarobfusco · 12y · 7 points
"Intuition and common sense" sound more like styles of thought process, not models. The models in question might be called "folklore" and "ordinary language" — when thinking "intuitively" with "common sense", we expect the world to fit neatly into the categories of ordinary language, and for events to work out in the way that we would find plausible as a story.
Decius · 12y · 1 point
If you have a project which will bankrupt the company if it fails, then it does not have a budget. It has costs. If you have multiple such projects, such that if any one of them fails the company goes bankrupt, then they all have costs instead of budgets.

Note that I'm assigning so large a negative value to bankruptcy that it is trivially worse to be bankrupt with a large amount of debt than to be bankrupt with a smaller amount of debt. If the sunk cost fallacy applies, then there is a fate significantly worse than cancelling the project due to cost overruns: funding the project more and having it fail.

Tricks to avoid long meetings are different than figuring out how long a meeting will last.
handoflixue · 12y · 3 points
Hence my point was in response to the idea that meetings will expand to fill their schedule - if you don't solve that, then scheduling is that much less reliable.

Yes it does; even if the budget is "100% of the company resources", that's still a constraint. Given that the odds of success probably drop drastically if you stop providing payroll, paying rent, etc., it's constrained further. It might also be the case that spending (say) 10% of your resources elsewhere will double your profits on success, but you have a corresponding 10% chance of failure because of it. A 90% chance of a major success vs. a 10% chance of bankruptcy is not necessarily a trivial decision.
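To make that last point concrete, here is a small sketch with made-up utilities; whether the gamble is worth taking depends almost entirely on how negative a value you assign to bankruptcy:

```python
# Made-up utilities illustrating the 90%-success / 10%-bankruptcy trade-off.
# The answer flips depending on how bad you consider bankruptcy to be.

def expected_value(p_success, payoff_success, payoff_failure):
    return p_success * payoff_success + (1 - p_success) * payoff_failure

keep_everything = expected_value(1.0, 1.0, 0.0)   # spend nothing elsewhere: profit 1, no extra risk
gamble_mild = expected_value(0.9, 2.0, -1.0)      # bankruptcy valued at merely losing the company
gamble_severe = expected_value(0.9, 2.0, -20.0)   # bankruptcy valued as catastrophic

print(keep_everything, gamble_mild, gamble_severe)  # roughly 1.0, 1.7, -0.2
```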
[anonymous] · 12y

I really don't like the word 'bias', especially in combination with 'overcoming'. It implies that there's an ideal answer being computed by your brain, but with a bias added to it, which you can overcome to get the correct answer. Much more plausible is that you do not have the answer, and you substitute some more or less flawed heuristic. And if you just overcome this heuristic you will get dumber.

Swimmer963 (Miranda Dixon-Luinenburg) · 12y · 8 points
I think your point is a good one. However, I don't think you're disagreeing with the main point of the post, since 'flawed heuristics' are more an example of the traditional 'biases' studied by researchers, which aaronsw is indeed claiming aren't that important for improving general rationality.

Not getting around to reading a 'nonprofits for dummies' book isn't an example of overcoming a heuristic and becoming dumber; it's an example of having a knowledge/common sense gap and not applying any kind of heuristic at all. "Always read the relevant 'for dummies' book first if you want to start working on a project" is a heuristic, which is probably biased in itself, but which most people don't follow when they would be better off following it.

Also, I think there is more subtlety to 'overcoming bias' than just not using that heuristic anymore (and maybe being dumber). Heuristics exist because they are useful in most circumstances, but they occasionally fail massively when subjected to new and unexpected types of situations. Realizing that thinking happens in the form of heuristics, and then trying to notice when you're in a situation where you wouldn't expect the heuristic to apply, can help with the problem of being overconfident on a given problem. Recognized ignorance is preferable to being very certain of an answer that is likely wrong, in terms of not making decisions that will blow up in your face.
[anonymous] · 12y · 3 points
There may be more subtlety in the ideal, but I fail to see it in practice, and least of all do I see any sign of lower overconfidence.

The difference in optimisation targets between LW and H&B researchers is an important thing to point out, and probably the main thing I'll take away from this post.

Biases can:

  • Be interesting to learn about
  • Serve an academic/political purpose to research
  • Give insight into the workings of human cognition
  • Be fun to talk about
  • Actually help you achieve your goals, once you understand them

And the correlations between any 2 of these things need not be strong or positive.

Is it the halo effect if we assume that a more interesting bias will better help us achieve our goals?

No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends.

There's a similar principle that I use sometimes when solving physics problems, and when building anything electronic. It's called "Do it the Right Way."

Most of the time, I take shortcuts. I try things that seem interesting. I want to rely on myself rather than on a manual. I don't want to make a list of things to do, but instead want to do things as I think of them.

This is usually fine - it's certainly fast when it works, and it's usually easy to check my answers. But as I was practicing physics problems with a friend, I realized that he was terrible at doing things my way. Instead, he did things the right way. He "used the manual." Made a mental list. Followed the list. Every time he made a suggestion, it was always the Right Way to do things.

With physics, these two approaches aren't all that far apart in terms of usefulness - though it's good to be able to do both. But if you want to do carpentry or build electronics, you have to be able to do things the Right Way.

Robert Miles · 11y · 4 points
To add to that, if you want to Do Things The Right Way, don't use a mental list, use a physical list. Using a checklist is one of the absolute best improvements you can make in terms of payoff per unit of effort. The famous example is Gawande, who tested using a "safe surgery" checklist for surgeons, which resulted in a 36% reduction in complications and a 47% fall in deaths.

Gerald Weinberg is a celebrated author of computer and management books. And for many years he was a management consultant. Often he would get a consulting gig at an organization he had consulted for in the past. The unhealthy organizations, he observed, had the same (crushing) worst problem during his second gig that they had during his first gig, whereas the better organizations tended to have lots of little problems, which he took as a sign that the organization was able to recognize their worst problems and slowly or quickly shrink them.

I am not sure because I do not have access to the book, but that is probably from the chapter or section "Rudy’s Law of Rutabagas" from Weinberg's book The Secrets of Consulting: A Guide to Giving and Getting Advice Successfully.

What Weinberg did after he stopped doing management consulting, by the way, was to run workshops on improving what we would call individual and team rationality, and he maintained that people learned the skills he taught a lot better in the right kind of interpersonal situations (e.g., workshops) than they do from written materials.

Hope that helps someone.

Use direct replies to this comment for suggesting things about tackling practical biases.

Carol Dweck's Mindset. While unfortunately it has the cover of a self-help book, it's actually a summary of some fascinating psychology research which shows that a certain way of conceptualizing self-improvement tends to be unusually effective at it.

Dorikka · 12y · 2 points
Reviews seem to indicate that the book can and should be condensed into a couple quality insights. Is there any reason to buy the actual book?

The main insight of the book is very simple to state. However, the insight was so fundamental that it required me to update a great number of other beliefs I had, so I found being able to read a book's worth of examples of it being applied over and over again was helpful and enjoyable. YMMV.

Pablo · 12y · 2 points
I took a look at Mindset. The book seemed to me extremely repetitive and rambling. Its teachings could be condensed in an article ten or fifteen times shorter. Fortunately, this Stanford Magazine piece seems to accomplish something close to that. So, read the piece, and forget the book.

Set a ten-minute timer and make a list of all the things you could do that would make you regret not doing them sooner. And then do those things.

I have a pretty long list like this that I try to look at every day, but I can't post it for the next two weeks for a complicated, boring reason.

aaronsw · 12y · 9 points
It's been two weeks. Can you post it now?
[anonymous] · 12y · 4 points
Indeed I can, and thank you for reminding me: D_Malik d_livers. I didn't see your comment earlier because I switched accounts to stop using a pseudonym (as you can see), and I haven't been browsing the internet much lately because I'm doing my anki backlog, which I have because I was away from home for three weeks doing SPARC and other things, which, together with the fact that my anki "ideas" deck was corrupt (because I copied it over to my ipad before first closing anki) and the fact that I couldn't de-corrupt it on my ipad and didn't have my laptop with me, made me unable to post it at the time of the grandparent comment.
aaronsw · 12y · 8 points
lukeprog's writings, especially Build Small Skills in the Right Order.
D_Malik · 12y · 6 points
Buy some nicotine gum and chew that while doing useful stuff, like working out, doing SRS reviews, thinking really hard about important things, etc. Of course you should read up on nicotine gum before you do this. Start here.
niceguyanon · 12y · 6 points
I am curious about using nicotine as a low-cost way to improve performance and build positive habits for exercise. However, as an ex-tobacco smoker (4 years), I am very wary of my interest in nicotine, because I suspect that my interest is based on latent cravings. After reading about the positive effects of nicotine, all I could think about was taking a pull of an e-cig; I didn't give any thought at all to gums or patches, which should be a warning sign. I am quite conflicted about this — I am very certain I would not go back to smoking tobacco, but I see myself using e-cigs as a daily habit rather than to promote habit learning for skills and activities that I want.
[anonymous] · 12y · 2 points
I think you should be careful and stick to gum or lozenges (or maybe patches) if you do nicotine at all. Chewing a 4mg nicorette (gradually, as per the instructions) produces blood concentrations of nicotine about 2/3 that of a cigarette. If you cut a 4mg nicorette into 4 pieces like I do, and only take 1 piece per 30 minutes, that's even less. It's not enough to produce any sense of visceral pleasure for me (in a study on gwern's page, people couldn't distinguish between 1mg nicotine and placebo), but I think it's still enough to form habits. I don't think you should use nicotine as a way of "rewarding" things (by producing noticeable pleasure). Maybe you could get someone else to dish out nicotine only when you're doing things you want to reinforce? That way it'll be harder for you to relapse. (I'm D_Malik; I didn't see your post earlier because I changed usernames.)
niceguyanon · 11y · 6 points
Update: I eventually purchased Walgreens-branded 21mg 24-hour release patches, which I cut into 4 equal doses. I use them on days when I go weight lifting or bouldering. I feel a noticeable alertness when I use the patches. I did not notice any increased desire to smoke and have no noticeable cravings on days when I am off the patch. I decided to stay away from any instant forms of nicotine such as e-cigs or gum.
aaronsw · 12y · 5 points
Ray Dalio's "Principles". There's a bunch of stuff in there that I disagree with, but overall he seems pretty serious about tackling these issues -- and apparently has been very successful.

I think that you are using the word 'bias' somewhat idiosyncratically here, and that this might be causing some people to have a hard time understanding the main point of this post, which (if I may) I would summarize as follows:

Many people in this community seem to believe that, when we do not get what we want, this is primarily because we are afflicted by one or more cognitive biases, such as anchoring or scope insensitivity. But this is not so. The main source of practical irrationality is lack of certain practical skills or habits, like "figuring out what your goals really are" or "looking at your situation objectively and listing the biggest problems". What are the best ways to develop these skills?

I can vouch that the June CFAR minicamp covered a lot of material on figuring out what your goals really are.

Pablo · 12y · 0 points
You may want to move this comment to the appropriate thread.
Cyan · 12y · 0 points
The material is still a work-in-progress, so minicampers have been asked not to make it public.
Viliam_Bur · 12y · 4 points
More meta: "Believing that not achieving our goals is caused by cognitive biases, when it is actually caused by a lack of skills and habits" is a cognitive bias, isn't it? It only needs some name that would make it easier to remember. Something like: "Nerd Over-Thinking Fallacy".
wedrifid · 12y · 3 points
Nerd over thinking is a different problem and occurs even when the nerds in question don't necessarily believe that the overthinking is useful.
nshepperd · 12y · 1 point
Nope.

As much as I think lukeprog vs. EY comparisons are informative, I wonder if the difference is just different amounts of energy. I hear that lukeprog is working 60-hour weeks and that EY had trouble being productive for more than 4 hours a day, and looking for citations I noticed this down the page.

That can't explain everything - there's another comment that comes to mind that I'm having difficulty finding, in which one of lukeprog's hacks dramatically increased EY's writing output - but it seems like it's part of a complete explanation.

MileyCyrus · 12y · 2 points
So how does one get more energy?
Vaniver · 12y · 3 points
Various diet and exercise changes seem to improve energy, but like mood and intelligence I suspect the range one can inhabit varies based on biological factors that are mostly beyond individual control.

Indeed. Conscientiousness is a pretty durable personality trait (as are all of the Big Five, and to make things worse, they tend to be 20-50% heritable too!). This is why I've spent so much time looking into stimulants: 'use the Try Harder, Luke!' doesn't work very well. (Unless your last name is Muehlhauser, I suppose.)

My guess would be that risk analysis and mitigation would be one of the more useful positive techniques in practical rationality. I wish every organization with executive officers had a CRO (chief risk officer) position. Of course, a person like that would be highly unpopular, as they would be constantly asking some very hard questions. Imagine that it is you against Murphy. What can go wrong? What are the odds of its going wrong? What are the odds of you mis-estimating that it will go wrong? What has gone wrong in the past? What are the potential mitigation steps? What are the odds of the mitigation steps themselves going wrong?

Basically, a CRO would ensure that an organization is (almost) never blindsided, except maybe for true black swans. Otherwise the most that can happen is "a failure mode we described has occurred, and we should now review, possibly update, and implement the risk mitigation steps outlined". The standard business plan is certainly not a substitute for something like that.

Most companies do not do nearly enough risk analysis and management, possibly because the CEOs are required to be optimistic, and neither the CEO nor the board are personally responsible for failures. The worst that can happen is that they are booted out and get a golden parachute.

My top 2....

Looking at unlikely happenings more sensibly. Remembering that whenever something really unlikely happens to you, it's not a sign from the heavens. I must remember to take into account the number of other unlikely things that might have happened instead that I would also have noticed, and the number of things that happen in a typical time. In a city of a million people, meeting a particular person might seem like a one in a million chance. But if I know a thousand people in the city, and walk past a thousand people in an hour, the chance of bum...
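A rough sketch of the arithmetic this comment points at, using only the numbers it gives (a city of a million people, a thousand acquaintances, a thousand passersby in an hour):

```python
# A single passerby being someone you know really is roughly a one-in-a-thousand
# event, but over an hour the chance of at least one such meeting is large.

city_population = 1_000_000
people_you_know = 1_000
passersby_per_hour = 1_000

p_single = people_you_know / city_population               # 0.001
p_at_least_one = 1 - (1 - p_single) ** passersby_per_hour  # about 0.63

print(p_single, p_at_least_one)
```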

http://lesswrong.com/lw/ahz/cashing_out_cognitive_biases_as_behavior/ may be of relevance. The single strongest correlation with various unhappy behaviors or outcomes (the DOI) in Bruine de Bruin 2007 (weaker than the overall correlation with succumbing to various fallacies, though!) was 'Applying Decision Rules':

Applying Decision Rules asks participants to indicate, for hypothetical individual consumers using different decision rules, which of five DVD players they would buy (e.g., “Lisa wants the DVD player with the highest average rating across featu

...
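For concreteness, here is a toy sketch of what "applying a decision rule" means in items like the one quoted above; the DVD players and ratings are invented for illustration, not taken from Bruine de Bruin 2007:

```python
# Toy example of "applying a decision rule": Lisa's rule is to buy the DVD
# player with the highest average rating across features. The players and
# ratings below are made up.

players = {
    "Player A": [3, 4, 2, 5],
    "Player B": [4, 4, 4, 4],
    "Player C": [5, 2, 3, 3],
}

def highest_average(options):
    """Return the option whose feature ratings have the highest mean."""
    return max(options, key=lambda name: sum(options[name]) / len(options[name]))

print(highest_average(players))  # "Player B" (average rating 4.0)
```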

As I recall, experiments show that people who learn about anchoring are still susceptible to anchoring, but people who learn about the sunk cost fallacy are less likely to throw good money after bad.

gwern · 11y · 3 points
Sunk cost training has mixed results: http://www.gwern.net/Sunk%20cost#fnref37
novalis · 11y · 0 points
Thanks. I guess it should be no surprise that people's behavior on quizzes bears very little relation to their behavior in real life. This shows that CFAR probably ought to find some way to test the effectiveness of their training other than by a written test.
Error · 11y · 2 points
Supportive anecdote: Since I started reading here, I've started to consciously see and avoid the sunk costs fallacy. (as opposed to several other biases that I now see and recognize, but still don't avoid, apparently because I am an idiot.)

This obviously doesn't help right here and now, but I would like to point out that CFAR is in a good position to investigate this question experimentally. We'll have to wait a while to be sure, but it looks like they have developed decent debiasing procedures and life outcome measures. I'm also guessing that they can't train people against every bias in a single retreat. Thus they can include different biases in different curricula and compare their practical effects.

It's a good point that academics are likely to focus on those biases that are likely to be easy to prove, not those that are likely to be important to fix. But I'd expect the most important biases to also manifest themselves in a big way and in lots of different scenarios, and therefore be relatively easy to prove.

[anonymous] · 10y

''"A large body of evidence[1][2][3][4][5][6][7][7][8][9][10] has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning, so even those aware of the existence of the phenomenon are unable to detect, let alone mitigate, their manifestation via awareness only."''

AFAIK, currently, none of them. The entire effort is futile, and the introductory paragraph to LessWrong appears self-defeating in light of this. I think there is far more to this place than cognitive bias mitigation.