Author, ladies and gentlemen of the comments: http://lesswrong.com/lw/kg/expecting_short_inferential_distances/
Hmm, that's a good point. I can see how this might seem like a romantic/Marxist/anti-elite sentiment.
When I read it, I was thinking almost exclusively in terms of existential risk, the connection being that the end of the world (by, for instance, Unfriendly AI) won't be brought about by a cruel mad scientist, but more likely by normal people trying to make economic and scientific advances without concern for the potential consequences.
Sorry if the quote doesn't communicate that very clearly.
“I have thought for a long time now that if, some day, the increasing efficiency of the technique of destruction finally causes our species to disappear from the earth, it will not be cruelty that will be responsible for our extinction and still less, of course, the indignation that cruelty awakens and the reprisals and vengeance that it brings upon itself…but the docility, the lack of responsibility of the modern man, his base subservient acceptance of every common decree. The horrors that we have seen, the still greater horrors we shall presently see, are not signs that rebels, insubordinate, untamable men are increasing in number throughout the world, but rather that there is a constant increase in the number of obedient, docile men.”
—Georges Bernanos
In addition to the ~$15,000 I've donated so far this drive, I'm matching the next 5 donations of (exactly) $1001 to this fundraiser. It's unlikely I'd donate this money anytime soon without the matching, so I'm hoping my decision is counterfactual enough for the donation-matching skeptics out there :)
To the stars!