I intended Leveling Up in Rationality to communicate this:
Despite worries that extreme rationality isn't that great, I think there's reason to hope that it can be great if some other causal factors are flipped the right way (e.g. mastery over akrasia). Here are some detailed examples I can share because they're from my own life...
But some people seem to have read it and heard this instead:
I'm super-awesome. Don't you wish you were more like me? Yay rationality!
This failure (on my part) fits into a larger pattern of the Singularity Institute seeming too arrogant and (perhaps) being too arrogant. As one friend recently told me:
At least among Caltech undergrads and academic mathematicians, it's taboo to toot your own horn. In these worlds, one's achievements speak for themselves, so whether one is a Fields Medalist or a failure, one gains status purely passively, and must appear not to care about being smart or accomplished. I think because you and Eliezer don't have formal technical training, you don't instinctively grasp this taboo. Thus Eliezer's claim of world-class mathematical ability, in combination with his lack of technical publications, makes it hard for a mathematician to take him seriously, because his social stance doesn't pattern-match to anything good. Taking Eliezer's arrogance as evidence of technical cluelessness was one of the reasons I didn't donate until I met [someone at SI in person]. So for instance, your boast that at SI discussions "everyone at the table knows and applies an insane amount of all the major sciences" would make any Caltech undergrad roll their eyes; your standard of an "insane amount" seems to be relative to the general population, not relative to actual scientists. And posting a list of powers you've acquired doesn't make anyone any more impressed than they already were, and isn't a high-status move.
So, I have a few questions:
- What are the most egregious examples of SI's arrogance?
- On which subjects and in which ways is SI too arrogant? Are there subjects and ways in which SI isn't arrogant enough?
- What should SI do about this?
(I was going to write a post on 'why I'm skeptical about SIAI', but I guess this thread is a good place to put it. This was written in a bit of a rush - if it sounds like I am dissing you guys, that isn't my intention.)
I think the issue isn't so much 'arrogance' per se - I don't think many of your audience would care about accurate boasts - but rather that your arrogance isn't backed up by any substantial achievement:
You say you're right on the bleeding edge in very hard bits of technical mathematics ("we have 30-40 papers which could be published on decision theory" in one of lukeprog's Q&As, wasn't it?), yet as far as I can see none of you have published anything in any field of science. The problem is (as far as I can tell) that you've been making the same boasts about all these advances you're making for years, and they've never been substantiated.
You say you've solved all these important philosophical questions (Newcomb, quantum mechanics, free will, physicalism, etc.), yet your answers are never published, and never particularly impress those who are actual domain experts in these things - indeed, a complaint I've commonly heard is that Lesswrong simply misunderstands the basics. An example: I'm pretty good at philosophy of religion, and the sort of arguments Lesswrong seems to take as slam-dunks for atheism ("biases!" "Kolmogorov complexity!") just aren't impressive, or even close to the level of discussion seen in academia. This itself is no big deal (ditto the MWI, phil of mind), but it makes for an impression of being intellectual dilettantes spouting off on matters you aren't that competent in. (I'm pretty sure most analytic philosophers roll their eyes at all the 'tabooing' and 'dissolving problems' - they were trying to solve philosophy that way 80 years ago!) Worse, my (admittedly anecdotal) survey suggests a pretty mixed reception from domain experts in stuff that really matters to your project, like probability theory, decision theory, etc.
You also generally talk about how awesome you all are via the powers of rationalism, yet none of you have done anything particularly awesome by standard measures of achievement. Writing a forest of blog posts widely reputed to be pretty good doesn't count. Nor does writing lots of summaries of modern cogsci and stuff.
It is not all bad: there are lots of people who are awesome by conventional metrics and do awesome things who take you guys seriously, and meeting these people has raised my confidence that you are doing something interesting. But reflected esteem can only take you so far.
So my feeling is basically 'put up or shut up'. You guys need to build a record of tangible, 'real world' achievements: writing some breakthrough papers on decision theory (or any papers on anything) which get published and taken seriously in mainstream science, writing a really popular book on 'everyday rationality', going off and using rationality to make zillions on the stock market, or whatever. I gather you folks are trying to do some of these: great! Until then, though, your 'arrogance problem' is simply that you promise lots and do little.
No, that wasn't it. I said 30-40 papers of research. Most of that is strategic research, like Carl Shulman's papers, not decision theory work.
Otherwise, I almost entirely agree with your comments.