If you're interested in learning rationality, where should you start? Remember, instrumental rationality is about making decisions that get you what you want -- surely there are some lessons that will help you more than others.
You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren't trying to help people be more rational; they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but on the ones that were easiest to prove.
Take their famous anchoring experiment, in which they showed that the spin of a roulette wheel affected people's estimates of the percentage of African countries in the UN. The idea wasn't that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists claiming we're just rational maximizers.
Most academic work on irrationality has followed in K&T's footsteps. And, in turn, much of the stuff done by LW and CFAR has followed in the footsteps of this academic work. So it's not hard to believe that LW types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)
But if you look at the average person and ask why they aren't getting what they want, very rarely do you conclude their biggest problem is that they're suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in the sequences. Usually their biggest problems are far more quotidian and commonsensical.
Take Eliezer. Surely he wanted SIAI to be a well-functioning organization. And he's admitted that lukeprog has done more to achieve that goal of his than he has. Why is lukeprog so much better at getting what Eliezer wants than Eliezer is? It's surely not because lukeprog is so much better at avoiding Sequence-style cognitive biases! lukeprog readily admits that he's constantly learning new rationality techniques from Eliezer.
No, it's because lukeprog did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As lukeprog himself says, it wasn't lack of intelligence or resources or akrasia that kept Eliezer from doing these things, "it was a gap in general rationality."
So if you're interested in closing the gap, it seems like the skills to prioritize aren't things like commitment effect and the sunk cost fallacy, but stuff like "figure out what your goals really are", "look at your situation objectively and list the biggest problems", "when you're trying something new and risky, read the For Dummies book about it first", etc. For lack of better terminology, let's call the K&T stuff "cognitive biases" and this stuff "practical biases" (even though it's all obviously both practical and cognitive and biases is kind of a negative way of looking at it).
What are the best things you've found on tackling these "practical biases"? Post your suggestions in the comments.
Which is entirely the wrong way to go about the problem. If this project is critical, and its failure will sink the company, you really, really want to be in a position to handle the 25% cost overrun. If you have ten other identically-sized, identically-important projects, then the 102.5% estimate is probably going to give you enough of a contingency to handle any one of them going over budget (but what is your plan if two go over budget?)
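To put a rough number on "what if two go over budget": padding each of ten project budgets by 2.5% pools a contingency equal to one 25% overrun, which covers exactly one project blowing up. A minimal sketch, assuming (hypothetically) that each project independently has a 25% chance of overrunning, shows how likely it is that two or more do:

```python
from math import comb

n = 10      # number of projects
p = 0.25    # assumed per-project chance of a cost overrun (hypothetical)

# Binomial probability of at least two overruns = 1 - P(0) - P(1)
p_zero_or_one = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in (0, 1))
p_two_or_more = 1 - p_zero_or_one

print(f"P(at least two projects overrun) = {p_two_or_more:.2f}")  # ~0.76
```

Under that assumption, two-or-more overruns is the likely case, not the edge case -- which is exactly why the aggregate statistic hides the question that matters.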
Thinking in terms of statistics, without any actual details attached, is one of the BIG failure modes I see from rationalists - and one that laypeople seem to avoid just fine, because to them the important thing is that Project X will make or break the company.
I'd suggest that this is a solvable problem - I've worked in multiple offices where meetings routinely ended early. Having everyone stand helps a lot. So does making them a quick and daily occurrence (it becomes routine to show up on time). So does having a meeting leader who keeps things on-topic, understands when an issue needs to be "taken offline" or researched and brought up the next day, etc.
If you have a project which will bankrupt the company if it fails, then it does not have a budget. It has costs. If you have multiple such projects, such that if any one of them fails, the company goes bankrupt, then they all have costs instead of budgets.
Note that I'm assigning such a large negative value to bankruptcy that it is only trivially worse to be bankrupt with a large amount of debt than to be bankrupt with a smaller amount of debt. If the sunk cost fallacy applies, then there is a fate significantly worse than cancelling the project due to cost overruns: funding the project more and having it fail anyway.
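The comparison above can be made concrete with a toy expected-value calculation. All the numbers here are hypothetical, chosen only to illustrate why "we've already spent so much" is not a reason to keep funding:

```python
# Hypothetical figures (not from the original discussion):
sunk = 100        # already spent, lost either way
extra = 50        # additional funding needed to continue
p_success = 0.2   # assumed chance the project succeeds if continued
payoff = 120      # value delivered if it succeeds

# Cancel now: the sunk cost is simply lost.
ev_cancel = -sunk

# Fund more: sunk cost plus extra spend, offset by the expected payoff.
ev_fund = -sunk - extra + p_success * payoff

print(ev_cancel, ev_fund)  # -100 vs -126: continuing is worse here
```

The sunk 100 appears identically in both branches, so it cancels out of the decision; only the extra spend and the expected payoff matter, which is the whole point of avoiding the sunk cost fallacy.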
Tricks for avoiding long meetings are different from techniques for figuring out how long a meeting will last.