Reply to: Extreme Rationality: It's Not That Great
Belaboring of: Rational Me Or We?
Related to: A Sense That More Is Possible
The success of Yvain's post threw me off completely. My experience has been the opposite of what he describes: x-rationality, which I've been working on since the mid-to-late nineties, has been centrally important to the successes I've had in business and family life. Yet the LessWrong community, which I greatly respect, broadly endorsed Yvain's argument.
So that left me pondering what's different in my experience. I've been working on these things longer than most, and am more skilled than many, but that seemed unlikely to be the key.
The difference, I now think, is that I've been lucky enough to spend huge amounts of time in deeply rationalist organizations and groups--the companies I've worked at, my marriage, my circle of friends.
And rational groups kick ass.
An individual can unpack free will or figure out that the Copenhagen interpretation is nonsense. But I agree with Yvain that in a lonely rationalist's individual life, the extra oomph of x-rationality may well be drowned in the noise of all the other factors of success and failure.
But groups! Groups magnify the importance of rational thinking tremendously.
And we're not even talking about the extra power of x-rationality. Imagine a couple that truly understood Aumann, a company that grokked the Planning Fallacy, a polity that consistently tried Pulling the Rope Sideways.
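To make the Aumann point concrete, here is a toy sketch (my addition, not from the post) of the easy case: two agents with a common prior who honestly pool their evidence end up with identical posteriors. Aumann's actual theorem is stronger — common knowledge of the posteriors alone forces agreement, even without sharing the underlying evidence — but the pooled-evidence case shows the mechanism a "couple that truly understood Aumann" would exploit. The names and numbers are illustrative assumptions.

```python
def posterior_mean(heads, tails, prior_a=1, prior_b=1):
    """Posterior mean of a coin's heads-probability under a Beta(prior_a, prior_b) prior."""
    return (prior_a + heads) / (prior_a + prior_b + heads + tails)

# Two agents share a Beta(1, 1) prior but see different private flips.
alice = (7, 3)   # Alice privately observed 7 heads, 3 tails
bob = (2, 8)     # Bob privately observed 2 heads, 8 tails

# Before talking, their estimates disagree:
before = (posterior_mean(*alice), posterior_mean(*bob))

# After honestly pooling evidence, both condition on all 20 flips
# and necessarily arrive at the same posterior:
pooled = (alice[0] + bob[0], alice[1] + bob[1])
after = posterior_mean(*pooled)

print(before)  # roughly (0.667, 0.25)
print(after)   # roughly 0.455 -- shared, not split-the-difference
```

Note that the agreed-upon answer is driven by the total evidence, not by averaging the two opinions — which is why a genuinely Aumann-respecting couple does better than one that merely compromises.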
When it comes to groups--sized from two to a billion--Yvain couldn't be more wrong.
Update: Orthonormal points out that I don't provide many concrete examples; I only link to three above. I'll try to put more here as I think of them.