Basically, the problem is that K&T-style insights about cognitive biases -- and, by extension, the whole OB/LW folklore that has arisen around them -- are useless for pretty much any question of practical importance. This is true both with regard to personal success and accomplishment (a.k.a. "instrumental rationality") and with regard to pure intellectual curiosity (a.k.a. "epistemic rationality").
From the point of view of a human being, the really important questions are worlds apart from anything touched by these neat academic categorizations of biases. Whom should I trust? What rules are safe to break? What rules am I in fact expected to break? When do social institutions work as advertised, and when is there in fact conniving and off-the-record tacit understanding that I'm unaware of? What do other people really think about me? For pretty much anything that really matters, the important biases are those that you have about questions of this sort -- and knowing about the artificial lab scenarios where anchoring, conjunction fallacies, etc. are observable won't give you any advantage there.
Note that this applies to your biases about abstract intellectual topics just ...
I really like this post. Could you make the link go both ways?
That said, I think you are overstating your case.
Also, if you figure out what local social norms are and that the stories are BS, you can accommodate the norms and ignore the stories internally. You can also optimize separate internal stories and external ones, or alternatively, drop out of the official story entirely and just be some guy who hangs around and is fun to talk to and mysteriously seems to always have enough money for his needs (the secret being largely that one's needs turn out to be very cheap to fulfill, even extravagantly, if optimized for directly, and money is likewise easy to get if optimized for directly). If you aren't dependent on others, don't compete, don't make demands, and are helpful and pleasant, you can get away with not conforming.
Here are some tentative guesses about this whole rationality and success business.
Let's set aside "rationality" for a minute and talk about mental habits. Everyone seems to agree that having the right habits is key to success, perhaps most famously the author of 7 Habits of Highly Effective People. But if you look at the 7 habits Covey identifies ("Be Proactive", "Begin with the End in Mind", "Put First Things First", "Think Win/Win", "Seek First to Understand, Then Be Understood", "Synergize", and "Sharpen the Saw"), they don't look much like what gets discussed on Less Wrong. So what gives?
I think part of the problem is the standard pattern-matching trap. Perhaps books like Covey's genuinely do address the factors that the vast majority of people need to work on in order to be more successful. But analytical folks tend not to read these books, because they pattern-match them to the genre of vapid self-help.
I really enjoyed The Seven Habits of Highly Effective People. (By contrast, I tried reading some @pjeby stuff yesterday and it had all the problems you describe cranked up to 11 and I found it incredibly difficult to keep reading.)
I don't think the selection bias thing would be a problem if the community was focused on high-priority instrumental rationality techniques, since at any level of effectiveness becoming more effective should be a reasonably high priority. (By contrast, if the community is focused on low-priority techniques it's not that big a deal (that was my attitude toward OvercomingBias at the beginning) and when it gets focused on stuff like cryo/MWI/FAI I find that an active turnoff.)
I think there's a decent chance epistemic rationality, ceteris paribus, makes you less likely to be traditionally successful. My general impression from talking to very successful people is that very few of them are any good at figuring out what's true; indeed, they often seem to have set up elaborate defense mechanisms to make sure no one accidentally tells them the truth.
Eliezer's comment doesn't say he tried to apply the lessons in Nonprofit Kit for Dummies, though some of it he clearly did — e.g. filing the necessary paperwork to launch a 501c3!
Anyway, reading a how-to book doesn't help much unless you actually do what the book recommends. That's why it's such an important intervention to figure out How To Actually Do The Stuff You Know You Should Be Doing — also known as How to Beat Procrastination.
But the anti-akrasia techniques we've uncovered so far don't work for everyone, and there are other factors at play. For example, since a young age Eliezer has become cognitively exhausted rather quickly. He has spent years trying different things (diet, exercise, context changes, vitamins, etc.) but still hasn't found an intervention that lets him do cognitive work for as long as I can. (Luckily, the value of an hour of cognitive work from Eliezer is much higher than the value of an hour of cognitive work from me.)
Also, there was no time in history when it made sense for Eliezer Yudkowsky to spend his time doing Nonprofit Kit for Dummies stuff. (But it would have made sense, I think, for Eliezer to try harder to find someone who could do non-profit management.)
But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude the issue is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences.
I very often conclude that people are suffering from the planning fallacy.
Not falling prey to the planning fallacy is the most obvious and quantifiable result from applying rationality techniques in my day to day life.
I really don't like the word 'bias', especially in combination with 'overcoming'. It implies that there's an ideal answer being computed by your brain, with a bias added on top, which you can overcome to recover the correct answer. Much more plausible is that you do not have the answer at all, and you substitute some more or less flawed heuristic. And if you just overcome this heuristic you will get dumber.
The difference in optimisation targets between LW and H&B researchers is an important thing to point out, and probably the main thing I'll take away from this post.
Biases can differ in how easy they are to demonstrate in the lab, how interesting they are to study, and how important they are to overcome in practice. And the correlations between any two of these things need not be strong or positive.
Is it the halo effect if we assume that a more interesting bias will better help us achieve our goals?
No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends.
There's a similar principle that I use sometimes when solving physics problems, and when building anything electronic. It's called "Do it the Right Way."
Most of the time, I take shortcuts. I try things that seem interesting. I want to rely on myself rather than on a manual. I don't want to make a list of things to do, but instead want to do things as I think of them.
This is usually fine - it's certainly fast when it works, and it's usually easy to check my answers. But as I was practicing physics problems with a friend, I realized that he was terrible at doing things my way. Instead, he did things the right way. He "used the manual." Made a mental list. Followed the list. Every time he made a suggestion, it was always the Right Way to do things.
With physics, these two approaches aren't all that far apart in terms of usefulness - though it's good to be able to do both. But if you want to do carpentry or build electronics, you have to be able to do things the Right Way.
Gerald Weinberg is a celebrated author of computer and management books, and for many years he was a management consultant. Often he would get a consulting gig at an organization he had consulted for in the past. The unhealthy organizations, he observed, had the same (crushing) worst problem during his second gig that they had during his first gig, whereas the better organizations tended to have lots of little problems -- which he took as a sign that those organizations were able to recognize their worst problems and, slowly or quickly, shrink them.
I am not sure because I do not have access to the book, but that is probably from the chapter or section "Rudy’s Law of Rutabagas" in Weinberg's book The Secrets of Consulting: A Guide to Giving and Getting Advice Successfully.
What Weinberg did after he stopped doing management consulting, by the way, was to run workshops on improving what we would call individual and team rationality, and he maintained that people learned the skills he taught a lot better in the right kind of interpersonal situations (e.g., workshops) than they do from written materials.
Hope that helps someone.
The main insight of the book is very simple to state. However, the insight was so fundamental that it required me to update a great number of other beliefs I had, so I found being able to read a book's worth of examples of it being applied over and over again was helpful and enjoyable. YMMV.
Set a ten-minute timer and make a list of all the things you could do that would make you regret not doing them sooner. And then do those things.
I have a pretty long list like this that I try to look at every day, but I can't post it for the next two weeks for a complicated, boring reason.
I think that you are using the word 'bias' somewhat idiosyncratically here, and that this might be causing some people to have a hard time understanding the main point of this post, which (if I may) I would summarize as follows:
Many people in this community seem to believe that, when we do not get what we want, this is primarily because we are afflicted by one or more cognitive biases, such as anchoring or scope insensitivity. But this is not so. The main source of practical irrationality is lack of certain practical skills or habits, like "figuring out what your goals really are" or "looking at your situation objectively and listing the biggest problems". What are the best ways to develop these skills?
I can vouch that the June CFAR minicamp covered a lot of material on figuring out what your goals really are.
As much as I think lukeprog-EY comparisons are informative, I wonder if the difference is just different amounts of energy. I hear that lukeprog is working 60-hour weeks and that EY had trouble being productive for more than 4 hours a day, and while looking for citations I noticed this further down the page.
That can't explain everything -- there's another comment that comes to mind that I'm having difficulty finding, in which one of lukeprog's hacks dramatically increased EY's writing output -- but it seems like it's part of a complete explanation.
Indeed. Conscientiousness is a pretty durable personality trait (as are all of the Big Five, and to make things worse, they tend to be 20-50% heritable too!). This is why I've spent so much time looking into stimulants: the 'Try Harder, Luke!' approach doesn't work very well. (Unless your last name is Muehlhauser, I suppose.)
My guess would be that risk analysis and mitigation would be one of the more useful positive techniques in practical rationality. I wish every organization with executive officers had a CRO (chief risk officer) position. Of course, a person like that would be highly unpopular, as they would be constantly asking some very hard questions. Imagine that it is you against Murphy. What can go wrong? What are the odds of it going wrong? What are the odds that you are mis-estimating those odds? What has gone wrong in the past? What are the potential mitigation steps? What are the odds of the mitigation steps themselves going wrong? Basically, a CRO would ensure that an organization is (almost) never blindsided, except maybe by true black swans. Otherwise the most that can happen is "a failure mode described in the risk analysis has occurred; we should now review, possibly update, and implement the risk mitigation steps outlined there". The standard business plan is certainly not a substitute for something like this.
Most companies do not do nearly enough risk analysis and management, possibly because CEOs are required to be optimistic, and neither the CEO nor the board is personally responsible for failures. The worst that can happen is that they are booted out and get a golden parachute.
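Since the parent comment frames the CRO's job entirely as questions, here is a minimal sketch (in Python, with invented risk names, probabilities, and costs -- nothing below comes from any real risk process) of the bookkeeping those questions feed into: a toy risk register comparing expected loss with and without mitigation.

```python
# Hypothetical toy risk register. All risk names, probabilities and costs are invented
# for illustration; a real register would be maintained and re-estimated continuously.
risks = [
    # (description, p_occurs, cost_if_occurs, p_mitigation_fails, mitigation_cost)
    ("key supplier goes under",         0.10,   500_000, 0.20, 20_000),
    ("lead engineer quits",             0.15,   200_000, 0.30, 10_000),
    ("regulatory change blocks launch", 0.05, 1_000_000, 0.50, 50_000),
]

for name, p, cost, p_mit_fails, mit_cost in risks:
    expected_loss_unmitigated = p * cost
    # With mitigation we pay its cost up front, and still eat the full loss
    # in the cases where the risk occurs AND the mitigation itself fails.
    expected_loss_mitigated = mit_cost + p * p_mit_fails * cost
    decision = "mitigate" if expected_loss_mitigated < expected_loss_unmitigated else "accept"
    print(f"{name}: unmitigated {expected_loss_unmitigated:,.0f}, "
          f"mitigated {expected_loss_mitigated:,.0f} -> {decision}")
```

The arithmetic is trivial; the value is that writing the register down at all forces the "what can go wrong, and how confident are we in that estimate?" questions to be asked before the failure mode actually shows up.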
My top 2....
Looking at unlikely happenings more sensibly. Remembering that whenever something really unlikely happens to you, it's not a sign from the heavens. I must remember to take into account the number of other unlikely things that might have happened instead that I would also have noticed, and the number of things that happen in a typical time. In a city of a million people, meeting a particular person might seem like a one in a million chance. But if I know a thousand people in the city, and walk past a thousand people in an hour, the chance of bumping into someone I know is actually pretty high.
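To put rough numbers on that last example (treating each passerby, crudely, as an independent draw from the city's population):

```python
# Back-of-the-envelope version of the example above. The figures are the ones from
# the comment (1,000 acquaintances, city of 1,000,000, 1,000 passersby per hour);
# independence is a simplifying assumption, not a claim about real foot traffic.
known, population, passersby = 1_000, 1_000_000, 1_000

p_each = known / population                      # 0.001 chance any given passerby is known
p_at_least_one = 1 - (1 - p_each) ** passersby   # ~0.63
print(f"P(bumping into someone I know this hour) = {p_at_least_one:.2f}")
```

So the "one in a million" encounter is actually more likely than not over an hour of walking, before even counting all the other unlikely-seeming coincidences that would also have caught my attention.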
http://lesswrong.com/lw/ahz/cashing_out_cognitive_biases_as_behavior/ may be of relevance. The single strongest correlation with various unhappy behaviors or outcomes (the DOI, Decision Outcome Inventory) in Bruine de Bruin 2007 (weaker than the overall correlation with succumbing to various fallacies, though!) was 'Applying Decision Rules':
...Applying Decision Rules asks participants to indicate, for hypothetical individual consumers using different decision rules, which of five DVD players they would buy (e.g., “Lisa wants the DVD player with the highest average rating across features”).
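For concreteness, one of those items amounts to something like the following, with invented ratings (the test simply checks whether you can apply the stated rule correctly):

```python
# Hypothetical data in the style of the "Applying Decision Rules" items:
# five DVD players, each rated 1-5 on a few features. Ratings are made up.
players = {
    "A": {"picture": 4, "sound": 3, "price": 2},
    "B": {"picture": 3, "sound": 5, "price": 4},
    "C": {"picture": 5, "sound": 2, "price": 3},
    "D": {"picture": 2, "sound": 4, "price": 5},
    "E": {"picture": 3, "sound": 3, "price": 3},
}

def average_rating(ratings):
    return sum(ratings.values()) / len(ratings)

# "Lisa wants the DVD player with the highest average rating across features."
lisa_pick = max(players, key=lambda name: average_rating(players[name]))
print(lisa_pick)  # prints "B" (average 4.0), the highest mean rating
```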
As I recall, experiments show that people who learn about anchoring are still susceptible to anchoring, but people who learn about the sunk cost fallacy are less likely to throw good money after bad.
This obviously doesn't help right here and now, but I would like to point out that CFAR is in a good position to investigate this question experimentally. We'll have to wait a while to be sure, but it looks like they have developed decent debiasing procedures and life outcome measures. I'm also guessing that they can't train people against every bias in a single retreat. Thus they can include different biases in different curricula and compare their practical effects.
It's a good point that academics are likely to focus on those biases that are likely to be easy to prove, not those that are likely to be important to fix. But I'd expect the most important biases to also manifest themselves in a big way and in lots of different scenarios, and therefore be relatively easy to prove.
''"A large body of evidence[1][2][3][4][5][6][7][7][8][9][10] has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning, so even those aware of the existence of the phenomenon are unable to detect, let alone mitigate, their manifestation via awareness only."''
AFAIK, currently, none of them. The entire effort is futile, and the introductory paragraph to Lesswrong appears self-defeating in light of this. I think there is far more to this place than cognitive bias mitigation.
If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.
You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.
Take their famous anchoring experiment, in which they showed that the spin of a roulette wheel affected people's estimates of the percentage of African countries in the UN. The idea wasn't that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists claiming we're just rational maximizers.
Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the stuff done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)
But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.
Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal of his than he has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.
No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."
So if you're interested in closing the gap, it seems like the skills to prioritize aren't things like avoiding the commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", "when you're trying something new and risky, read the For Dummies book about it first", etc. For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive, and "biases" is kind of a negative way of looking at it).
What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.