eli_sennesh comments on Harper's Magazine article on LW/MIRI/CFAR and Ethereum - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
An interesting problem. There are a few things that can be said about this.
1) Neoreaction is not the only tendency to confront the combination of universalism and elitism -- indeed, it consistently rejects universalism, which makes it one way of resolving the tension you're perceiving. Another way is to embrace both: this could be done by belief in a heritable factor of general intelligence (which strikes me as the rationalist thing to do, and which necessarily entails some degree of elitism), but that's merely the most visible option. An alternative is to say that some cultures are superior to others (the North to the South, for a common political example; aspiring-rationalist culture to the culture at large, for a local one), which also necessarily entails elitism: at the very least, the inferiors must be uplifted.
2) The coexistence of universalism and elitism (and technocratic progressivism) is reminiscent of the later days of the British Empire. They believed that they could figure out a universal morality -- and beyond that, a universally proper culture -- but, of course, only the more developed and rational even among their own people could play a part in that. I suspect that LW draws disproportionately from communities ideologically descended from the British Empire, and that its surrounding baggage uncritically reflects that descent -- in fact, this is provably true for one aspect of LW-rationality unless utilitarianism was independently developed somewhere else. (The last line from the last point sounds familiar.)
3) Neoreaction is probably partially an exercise in constructing new narratives and value-systems that are at least as plausible as the ones that are currently dominant. This isn't incompatible with the generation of true insights -- in fact, the point can't be made with false ones. (Obviously false ones, at least, but if the epistemic sanity waterline isn't high enough around here to make that almost as difficult a constraint, rationalism has probably failed.) There's also some shock-jock countersignaling stuff, especially with Moldbug.
4) The comparative study of civilizations leads to two insights (at least when taken in conjunction with the point that technological progress and political progress cannot be assumed to be the same thing, or even driven by the same factors, except insofar as technology can make possible beneficial things that would not have been possible otherwise -- though it can do the same for harmful things, like clickbait or the nuclear bomb): first, that civilizations keep collapsing, and second, that they tend to think they're right. No two fundamentally disagreeing civilizations can be right at the same time -- so either value-systems cannot be compared (a position that is both easily dismissed and likely to contain a grain of truth, for the simple reason that, if any of our basic moral drives come neither from culture nor from facts about the outside world, what else could they be but innate? Even the higher animals show signs of a sense of morality in lab tests, I've heard.) or one of them is wrong. It's the same argument as the atheist one against religion, just fully generalized. (I don't think the argument works for atheism, since, if you grant that the God or gods of divine-containing religions want humans to follow them, Christianity and the various paganisms can't be seriously compared -- but I digress.) Hence the utility of generating alternative narratives for the cause of seeking truth.
5) People concerned about civilizational risk would do well to take the possibility of collapse seriously, as per the fourth point. People who want to hurry up and solve morality and build it into a friendly AI, even more so. Those who believe that every civilization would converge on the same moral solution should want there to be as many people as possible who are likely to support this goal and do good and useful work toward it, before a government or a business beats them to it. That seems to imply that they should either want there to be as many not-unfriendly and likely-to-be-useful civilizations as possible, or at least want Western civilization (i.e. the USA, Canada, and some amount of Europe, depending on who you talk to) not to collapse, since it has generated by far the highest proportion of people who take that task seriously. (IIRC the last part is close to the reasoning Anissimov went through, but I could be misremembering.)
(There's likely to be at least one part of this that's completely wrong, especially since it's two in the morning and I'm rushing through this so I can sleep. A slice of virtual Stollen to anyone who finds one -- it's that time of the year.)
I think this is the completely wrong part, in that it assumes that any living individual ever considers everything about their civilization to be Good and Right. By and large, even the ruling classes don't get everything they want (for example, they wanted a Hayekian utopia along Peter Thiel's lines, but what they got was the messiness of actually existing neoliberalism). And in fact, one of the chief causes of the repeated collapses is that institutional structures usually can't take being pushed and pulled in too many contradictory directions at once without ceasing to act coherently for anything at all (they become "unagenty", in our language).
The US Congress is a fairly good present-day example: it's supposed to act for the people as a whole, for the districts, and for the "several States"; for the right of the majority to govern as it will and for the right of small ideological minorities to obstruct whatever they please; for the fair representation of the voters and for the institutionalization of the political parties. When these goals become contradictory instead of complementary, the institution stops functioning (i.e., it passes no legislation at all, not even routine matters, rather than merely passing legislation I disagree with), and society has to replace it or face decline.
I'm not talking about practice, but rather about ideals, value systems, that sort of thing. Tumblrites haven't gotten what they want either -- but they still want what they want, and what they want is determined by something, and whatever that something is, it varies.