To my mind, the worst thing about the EA movement is its delusions of grandeur. Both individually and collectively, the EA people I have met display a staggering and quite sickening sense of their own self-importance. They think they are going to change the world, and yet they have almost nothing to show for their efforts except self-congratulatory rhetoric. It would be funny if it weren't so revolting.
There are people in this world who will never understand, say, the P vs. NP problem, no matter how much work they put into it. So to deny the above, you'd have to say (along with Greg Egan) that there is some threshold of intelligence, akin to "Turing completeness," that only some of humanity has reached, but that once you reach it, nothing is in principle beyond your comprehension. That doesn't seem impossible, but it's far from obvious.
I think this is in fact highly likely.