This site often speaks of rationality and intelligence as though they were the same thing, as though by becoming more rational a person becomes, for practical purposes, more intelligent.
Certainly it seems to me that this must be true to some extent, but what is the exchange rate? Suppose a person has an IQ of 100, and then they spend a year on LessWrong, reading all the Sequences and taking the advice to heart, training their skills and identifying their biases and all that. At the end of it, presumably their raw IQ score is still 100. But if we measure how they do on correlated indicators, their lifestyle, say, should we expect to see them living, in some way, the life of a smarter person? How much smarter?
How many points of IQ would you be willing to give up to retain what you have learned from this site?
Personally, I would answer "less than one". It seems like it SHOULD be useful, but it doesn't really feel like it is.
The key insight here is the applicability of the weak efficient markets hypothesis: if some useful information is publicly known, you can be pretty sure that other people are already using it to their advantage. If you have found some insight that will let you get ahead in practice, it's always a good idea to ask yourself what exactly makes you special enough to be privy to it. It may be that you are smarter than others, or lucky enough to have privileged access to the information, but it may also be that others are already familiar with it and using it, and you've simply been oblivious to that so far, or that the insight is in fact less useful than you think.
This is why the laboratory insights about biases from psychology, behavioral economics, etc. are typically not useful in practice. If such an insight really applied to what people do even when they have strong incentives to avoid bias, one would expect that there is already a huge industry targeted at profiting from the bias, and that avoiding falling prey to it is already part of well-known common sense. (This is indeed the case with e.g. gambling.) Otherwise, it may be that the bias is reproducible in the lab but disappears with enough incentive, just as lots of people would flunk a test of basic arithmetic, yet that doesn't mean you could get away with shortchanging them with real money.
In contrast, when it comes to issues that don't have instrumental implications in terms of money, status, etc., it's not that hard to learn about biases and make one's beliefs more accurate than average. Trouble is, it's easy precisely because people normally don't bother correcting their beliefs in such matters, lacking the incentive to do so. (Or even having contrary incentives if the biased beliefs have beneficial signaling and other implications.)