If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.
You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.
Take their famous anchoring experiment, in which the spin of a rigged wheel of fortune affected people's estimates of the percentage of African countries in the United Nations. The idea wasn't that spinning wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue this behavior was somehow rational. K&T thereby scored a decisive blow for psychology against economists claiming we're just rational maximizers.
Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the stuff done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)
But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.
Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal of his than he has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.
No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."
So if you're interested in closing the gap, it seems like the skills to prioritize aren't avoiding things like the commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", and "when you're trying something new and risky, read the For Dummies book about it first". For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive, and "biases" is kind of a negative way of looking at it).
What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.
My guess would be that risk analysis and mitigation is one of the more useful positive techniques in practical rationality. I wish every organization with executive officers had a CRO (chief risk officer) position. Of course, a person like that would be highly unpopular, since they would constantly be asking some very hard questions. Imagine that it is you against Murphy. What can go wrong? What are the odds of its going wrong? What are the odds that you've mis-estimated those odds? What has gone wrong in the past? What are the potential mitigation steps? What are the odds of the mitigation steps themselves going wrong? Basically, a CRO would ensure that an organization is (almost) never blindsided, except maybe by true black swans. Otherwise the worst that can happen is "a failure mode described in the risk analysis has occurred; we should now review, possibly update, and implement the mitigation steps outlined there". The standard business plan is certainly not a substitute for something like this.
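The review loop a CRO runs can be caricatured in a few lines of code: keep a register of failure modes with probability and cost estimates, and revisit them worst-first. Everything below (the example risks, the numbers, and the `expected_loss` helper) is a hypothetical illustration of the idea, not a real risk-management methodology:

```python
# A minimal sketch of a risk register, the kind of artifact a CRO might
# maintain. All entries and figures are made-up illustrations.

risks = [
    # (description, probability of occurring, estimated cost if it occurs,
    #  mitigation step)
    ("key employee quits",          0.30, 100_000, "cross-train a backup"),
    ("funding round falls through", 0.10, 500_000, "keep 12 months of runway"),
    ("server outage during launch", 0.20,  50_000, "rehearse a failover drill"),
]

def expected_loss(prob, cost):
    """Crude expected value of a risk. A real analysis would also model
    the uncertainty in the probability estimate itself ("what are the
    odds that you've mis-estimated those odds?")."""
    return prob * cost

# Review the register in order of expected loss, worst first.
for desc, prob, cost, mitigation in sorted(
        risks, key=lambda r: expected_loss(r[1], r[2]), reverse=True):
    print(f"{desc}: expected loss ${expected_loss(prob, cost):,.0f}"
          f" -> mitigation: {mitigation}")
```

Even a toy version like this makes the point: ranking by expected loss forces the uncomfortable questions into the open on a schedule, rather than waiting for Murphy to ask them for you.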
Most companies do not do nearly enough risk analysis and management, possibly because CEOs are required to be optimistic, and neither the CEO nor the board is personally responsible for failures. The worst that can happen to them is that they are booted out with a golden parachute.