I intended Leveling Up in Rationality to communicate this:
Despite worries that extreme rationality isn't that great, I think there's reason to hope that it can be great if some other causal factors are flipped the right way (e.g. mastery over akrasia). Here are some detailed examples I can share because they're from my own life...
But some people seem to have read it and heard this instead:
I'm super-awesome. Don't you wish you were more like me? Yay rationality!
This failure (on my part) fits into a larger pattern of the Singularity Institute seeming too arrogant and (perhaps) being too arrogant. As one friend recently told me:
At least among Caltech undergrads and academic mathematicians, it's taboo to toot your own horn. In these worlds, one's achievements speak for themselves, so whether one is a Fields Medalist or a failure, one gains status purely passively, and must appear not to care about being smart or accomplished. I think because you and Eliezer don't have formal technical training, you don't instinctively grasp this taboo. Thus Eliezer's claim of world-class mathematical ability, in combination with his lack of technical publications, makes it hard for a mathematician to take him seriously, because his social stance doesn't pattern-match to anything good. Reading Eliezer's arrogance as evidence of technical cluelessness was one of the reasons I didn't donate until I met [someone at SI in person]. So for instance, your boast that at SI discussions "everyone at the table knows and applies an insane amount of all the major sciences" would make any Caltech undergrad roll their eyes; your standard of an "insane amount" seems to be relative to the general population, not relative to actual scientists. And posting a list of powers you've acquired doesn't make anyone any more impressed than they already were, and isn't a high-status move.
So, I have a few questions:
- What are the most egregious examples of SI's arrogance?
- On which subjects and in which ways is SI too arrogant? Are there subjects and ways in which SI isn't arrogant enough?
- What should SI do about this?
This leaves one wondering: how the hell could one be this concerned about AI risk but not study math properly? How the hell can one go on about Bayesian this and Bayesian that but not study? How can one trust one's intuitions about how much computational power is needed for AGI, and yet not want to improve those intuitions?
I've speculated elsewhere that he would likely be unable to implement belief propagation on a general Bayesian network, or even know what's involved (exact inference is NP-hard in general, and the accuracy of approximate solutions depends on heuristics. Yes, heuristics. Biased ones, too). That's very bad when it comes to understanding rationality, because you start repeating maxims like "update all your beliefs," which look outright stupid to, e.g., me (I assure you I can implement belief propagation on a Bayesian network), and which trigger my "it's another annoying person who talks about things he has no clue about" reflex.
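To make the belief propagation point concrete, here is a minimal sketch (not from the original post; all potentials are invented for illustration) of sum-product message passing on a three-node chain of binary variables. On trees like this, BP computes exact marginals; on graphs with cycles, the very same message updates become "loopy" BP, the heuristic the comment is alluding to, with no general exactness guarantee.

```python
# Sum-product belief propagation on a chain X1 - X2 - X3 of binary variables.
# Exact on trees; the same updates on loopy graphs are only a heuristic.
import itertools

# Unary potentials phi[i][x] for each variable (made-up numbers).
phi = [
    [0.7, 0.3],   # X1
    [0.5, 0.5],   # X2
    [0.2, 0.8],   # X3
]
# One pairwise potential psi[x][y] shared by both edges: favors agreement.
psi = [[1.0, 0.5],
       [0.5, 1.0]]

def normalize(m):
    s = sum(m)
    return [v / s for v in m]

# Messages into X2: from X1 (forward) and from X3 (backward).
msg_12 = normalize([sum(phi[0][x] * psi[x][y] for x in (0, 1)) for y in (0, 1)])
msg_32 = normalize([sum(phi[2][x] * psi[x][y] for x in (0, 1)) for y in (0, 1)])

# Belief at X2 = its unary potential times all incoming messages.
belief_2 = normalize([phi[1][y] * msg_12[y] * msg_32[y] for y in (0, 1)])

# Brute-force check: enumerate all 8 joint configurations.
joint = {}
for x1, x2, x3 in itertools.product((0, 1), repeat=3):
    joint[(x1, x2, x3)] = (phi[0][x1] * phi[1][x2] * phi[2][x3]
                           * psi[x1][x2] * psi[x2][x3])
z = sum(joint.values())
marg_2 = [sum(p for (a, b, c), p in joint.items() if b == y) / z
          for y in (0, 1)]

# On a tree, BP's belief equals the exact marginal.
assert all(abs(a - b) < 1e-12 for a, b in zip(belief_2, marg_2))
print(belief_2)
```

Even this toy version shows why the general case is hard: the brute-force check enumerates every joint configuration, which blows up exponentially, and on loopy graphs there is no exact message schedule to fall back on.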
If you're going to talk about Bayesian this and Bayesian that, you had better know the mathematics very well, because in practice all those equations get awfully hairy on general graphs (not just trees). If you don't know the relevant math very well and you call yourself a Bayesian, you are professing a belief in belief. And if you go on about Bayesian this and that without making any claim of extreme mathematical skill and knowledge, other people will have to assume extreme mathematical skill and knowledge out of politeness.
Yes.