I intended Leveling Up in Rationality to communicate this:
Despite worries that extreme rationality isn't that great, I think there's reason to hope that it can be great if some other causal factors are flipped the right way (e.g. mastery over akrasia). Here are some detailed examples I can share because they're from my own life...
But some people seem to have read it and heard this instead:
I'm super-awesome. Don't you wish you were more like me? Yay rationality!
This failure (on my part) fits into a larger pattern of the Singularity Institute seeming too arrogant and (perhaps) being too arrogant. As one friend recently told me:
At least among Caltech undergrads and academic mathematicians, it's taboo to toot your own horn. In these worlds, one's achievements speak for themselves, so whether one is a Fields Medalist or a failure, one gains status purely passively, and must appear not to care about being smart or accomplished. I think that because you and Eliezer don't have formal technical training, you don't instinctively grasp this taboo. Thus Eliezer's claim of world-class mathematical ability, combined with his lack of technical publications, makes it hard for a mathematician to take him seriously, because his social stance doesn't pattern-match to anything good. Reading Eliezer's arrogance as evidence of technical cluelessness was one of the reasons I didn't donate until I met [someone at SI in person]. So, for instance, your boast that at SI discussions "everyone at the table knows and applies an insane amount of all the major sciences" would make any Caltech undergrad roll their eyes; your standard for an "insane amount" seems to be relative to the general population, not to actual scientists. And posting a list of powers you've acquired doesn't make anyone more impressed than they already were, and isn't a high-status move.
So, I have a few questions:
- What are the most egregious examples of SI's arrogance?
- On which subjects and in which ways is SI too arrogant? Are there subjects and ways in which SI isn't arrogant enough?
- What should SI do about this?
I admire the phrase "what an algorithm feels like from the inside". It is certainly one of Yudkowsky's better ideas, if it is indeed his; one can see its roots in G.E.B. Still, it may well count as something novel.
Nonetheless, Yudkowsky is not the first compatibilist.
One could define the term in such a way. I tend to take an instrumentalist view of intelligence. However, "the ability to optimize things" may well be a thing. You may as well call it intelligence, if you are so inclined.
This, nonetheless, may not be an answer to the question "what is intelligence?" It seems as though most competent naturalists have moved past the question.
I apologize, but that does not look like a solution to the Gettier Problem. Could you elaborate?
I have absolutely no knowledge of the history of Newcomb's problem. I apologize.
Further apologies for the following terse statements:
I don't think Fun Theory is known to academia. Also, it looks like, at best, a contemporary version of eudaimonia.
The concept of CEV is neat. However, I think that if one were to create an ethical analogue of the pragmatic definition of truth, "the good is the end of inquiry" would essentially encapsulate CEV, insofar as one can encapsulate a complex theory in a brief statement.
TDT is awesome. It was anticipated by Hofstadter's superrationality, but so what?
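As a tangent, here is a minimal sketch of the superrationality point, since it is easy to state concretely. This is my own illustration in Python, not Hofstadter's or Yudkowsky's formalism, and the payoff numbers are the usual textbook ones, chosen only for the example: if two players are known to run the same deterministic decision procedure, only the diagonal outcomes of a prisoner's dilemma are reachable, and mutual cooperation pays more than mutual defection.

```python
# Prisoner's dilemma payoffs as (row player, column player).
# "C" = cooperate, "D" = defect. Illustrative numbers only.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def superrational_choice() -> str:
    """Choose an action knowing the opponent runs this same function.

    Since both runs of a deterministic procedure return the same
    action, the only reachable outcomes are (C, C) and (D, D);
    pick the action whose diagonal payoff is higher.
    """
    return max(("C", "D"), key=lambda action: PAYOFFS[(action, action)][0])

a, b = superrational_choice(), superrational_choice()
print(a, b, PAYOFFS[(a, b)])  # -> C C (3, 3)
```

Note that a classical game theorist, holding the opponent's move fixed, would still defect; the whole trick is conditioning on the opponent's choice being the same computation as one's own.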
I don't mean to discount the intelligence of Yudkowsky. Further, it is extremely unkind of me to be so critical of him, considering how much he has influenced my own thoughts and beliefs. However, he has never written a "Two Dogmas of Empiricism" or a Naming and Necessity. Philosophical influence is something that probably can only be seen, if at all, in retrospect.
Of course, none of this really matters. He's not trying to be a good philosopher. He's trying to save the world.
Okay, the Gettier problem. I can explain it, but this is just my explanation, not Eliezer's.
The Gettier problem points out flaws in the definition of knowledge as justified true belief (JTB). JTB is an attempt at defining knowledge, but it falls into the classic philosophical trap of misusing intuitions, and it has a variety of other issues. The standard counterexamples are cases where a belief is justified and true only by luck: you glance at a stopped clock that happens to show the correct time, so your belief about the time is justified and true, yet it hardly counts as knowledge. Lukeprog discusses the weakness of conceptual analysis at length elsewhere.