If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.
You might start with the most famous lessons, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.
Take their famous anchoring experiment, in which they showed that the spin of a roulette wheel affected people's estimates of how many African countries are in the UN. The idea wasn't that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists who claimed we're just rational maximizers.
Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the work done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these rationality tests come straight from K&T experiments!)
But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.
Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal than Eliezer himself has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.
No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."
So if you're interested in closing the gap, it seems like the skills to prioritize aren't things like the commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", "when you're trying something new and risky, read the For Dummies book about it first", and so on. For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive, and "biases" is kind of a negative way of looking at it).
What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.
Basically, the problem is that K&T-style insights about cognitive biases -- and, by extension, the whole OB/LW folklore that has arisen around them -- are useless for pretty much any question of practical importance. This is true with regard both to personal success and accomplishment (a.k.a. "instrumental rationality") and to pure intellectual curiosity (a.k.a. "epistemic rationality").
From the point of view of a human being, the really important questions are worlds apart from anything touched by these neat academic categorizations of biases. Whom should I trust? What rules are safe to break? What rules am I in fact expected to break? When do social institutions work as advertised, and when is there in fact conniving and off-the-record tacit understanding that I'm unaware of? What do other people really think about me? For pretty much anything that really matters, the important biases are those that you have about questions of this sort -- and knowing about the artificial lab scenarios where anchoring, conjunction fallacies, etc. are observable won't give you any advantage there.
Note that this applies to your biases about abstract intellectual topics just as much as to your practical life. Whatever you know about any such topic, you know largely ad verecundiam from the intellectual authorities you trust, so chances are you have inherited their biases wholesale. (An exception is material that stands purely on rigorous internal logical evidence, like mathematical proofs, but there isn't much you can do with that beyond pure math.) And to answer the question of what biases might be distorting the output of the official intellectual authorities in the system you live under, you need to ask hard questions about human nature and behavior akin to those listed above, and accurately detect biases far more complex and difficult than anything within the reach of simplistic behavioral economics.
Of course, the problem you ultimately run into is that such analysis, if done consistently and accurately, will produce results that clash with the social norms you live under. Which leads to the observation that some well-calibrated instinctive bias towards conformity is usually good for you.
By Django, that needed saying!