Comment author: RichardKennaway 19 November 2014 01:44:27PM 3 points [-]

By "pre-Enlightenment" I understand the social arrangements in Europe of the centuries immediately preceding the Enlightenment, which neo-reactionaries see the Enlightenment as a catastrophic falling away from, and which they desire to return to. This is unambiguously what advancedatheist is talking about upthread, and what, for example, Moldbug unfavourably contrasts our present arrangements with. This is a very specific thing, not the huge space that you interchangeably referred to as "non-Enlightenment".

"Pre-Enlightenment" bears the same relationship to "non-Enlightenment" as kangaroos do to non-elephants.

Comment author: MichaelAnissimov 19 November 2014 02:10:08PM 5 points [-]

Viewing reactionaries as wishing to return to a time in the linear past, which evolved organically based on local conditions and which may not be appropriate to present technological conditions, is mistaken. The goal is not simply to revive a past arrangement but to apply certain traditional principles and spirit to a newer expression of organic principles suited to its context. So when you say "go back to", it's not that simple, which is why "pre-Enlightenment" seems like an oversimplifying label to me.

In fact, you could call it post-Enlightenment, since it would be the emergence of structure from an Enlightenment society that may retain some Enlightenment principles while discarding others. Calling any system based on principles aside from Enlightenment ones "pre-Enlightenment" seems like assuming a kind of a priori obsolescence, in effect dismissing it before it's even considered.

In any case, "pre-Enlightenment" does not refer to any specific structure (like kangaroos), but a wide variety of arrangements. Therefore, I see it as more similar to "non-elephant" than "kangaroo".

Comment author: RichardKennaway 19 November 2014 12:36:08PM 4 points [-]

Are you saying that you think that a vast majority of the possible transhuman futures rest entirely on Enlightenment principles?

No. Are you saying that pre-Enlightenment and Enlightenment principles are the only possibilities? Why should either of these be part of a transhuman future?

Comment author: MichaelAnissimov 19 November 2014 12:49:02PM *  2 points [-]

Exhaustively speaking, societal organizational principles in the abstract tend to be Enlightenment-oriented or not. So, yes, any given transhuman future will have principles of some kind, which will be inspired by the Enlightenment or not. Non-Enlightenment principles (used here to describe every possible set of societal principles besides those based around the Enlightenment) are a rather huge space of possibilities, which cover not only many societies which have already existed, but many millions which may have yet to come to pass. Many "pre-Enlightenment" situations were organic hierarchies, similar to the way nature itself has operated for literally billions of years. "Pre-Enlightenment" does not refer to a specific thing, but a huge space of configurations which do not closely adhere to Enlightenment principles.

Comment author: RichardKennaway 19 November 2014 11:57:43AM *  3 points [-]

Why couldn't post-democratic outcomes exist even if human nature is deliberately reengineered?

Why would they resemble the pre-democratic outcomes that advancedatheist says "wouldn't surprise me"? What should even draw "premodern, pre-Enlightenment societies" to anyone's attention, out of the vast and unknown possibilities of a transhuman estate that removes the reasons that those societies evolved in those ways?

Comment author: MichaelAnissimov 19 November 2014 12:13:12PM 2 points [-]

Why would they resemble the pre-democratic outcomes that advancedatheist says "wouldn't surprise me"?

Because some of those, like hierarchy, are game theoretic equilibria that are likely to emerge across a wide range of possible configurations, especially where there are great asymmetries between agents.

What should even draw "premodern, pre-Enlightenment societies" to anyone's attention, out of the vast and unknown possibilities of a transhuman estate that removes the reasons that those societies evolved in those ways?

Are you saying that you think that a vast majority of the possible transhuman futures rest entirely on Enlightenment principles?

Comment author: Risto_Saarelma 19 November 2014 06:30:36AM 2 points [-]

I feel sorry for the feminist women in cryonics who don't see this as a distinct possibility of the kind of Future World which would revive them. They might find themselves in a conservative, patriarchal society which won't have much tolerance for their assumptions about women's freedoms.

I haven't really seen much discussion on the intersection of neoreaction and transhumanism. Neoreactionary theories of long-range probable societal trends, like dysgenics or a return to a generally pre-Enlightenment social order, also tend to assume that humans stay mostly as they are and only get selected by natural evolution. Meanwhile, getting to the point of being able to revive cryonically stored people successfully would probably include a bunch of human-condition game-changer technologies, like an ability to make the whole notion of fixed gender optional on any level (genetics, cognitive architecture, body plan) you'd care to name.

Comment author: MichaelAnissimov 19 November 2014 11:18:48AM 3 points [-]

Why couldn't post-democratic outcomes exist even if human nature is deliberately reengineered?

Comment author: skeptical_lurker 19 November 2014 10:11:22AM 7 points [-]

From what I heard I thought you were calling for people not to associate with any gays/transsexuals, or with people who themselves associate with gays/transsexuals. I thought you thought that the threat posed was one of demographic decline.

I apologise if I have misrepresented your position, but that was how I interpreted the situation from what second-hand sources said. Incidentally, in what respect is Justine Tunney insane?

Comment author: MichaelAnissimov 19 November 2014 10:23:55AM 3 points [-]

Apology accepted. Your second-hand sources were wrong, tell them that. It's so difficult to have legitimate discussions about NRx when 90% of the opinion the Less Wrong community has about us is based on stuff that is completely made up.

Comment author: [deleted] 19 November 2014 08:32:24AM -1 points [-]

Well, as I said in this same thread, things like egalitarianism, female rights, minority rights, etc. have been found to be normatively binding due to the falsification of the normativity of certain social structures, usually patriarchy, royalty, and religious rule. Upon finding that those things are unjustified, we revert to the default that everyone is equal simply because there needs to be a reason to ascribe difference!

Comment author: MichaelAnissimov 19 November 2014 10:11:10AM 2 points [-]

This is one of the funnier things I've read this year.

Comment author: Toggle 18 November 2014 04:42:06AM 14 points [-]

It's curious to see the frequency of posts that start with "I am not a neoreactionary, but...". (This includes my own). If I'm not mistaken, they seem to outnumber the actual neoreactionary posts by a fair margin.

I think a call for patriarchal racially-stratified monarchy is catnip around here. Independently of its native virtues, I mean. It's a debate that couldn't even happen in most communities, so it's reinforcing our sense of LW's peculiar set of community mores. It's a radical but also unexpected vision of a technological future, so it has new ideas to wrestle with, and enough in the way of historical roots to reward study and give all participants the chance to learn. And it is political without being ossified into tired and nationally televised debates, with new insights available to a clever thinker and plenty of room to pull sideways.

For that reason, I'm a little worried that it will receive disproportionate attention. I know my System 1 loves to read the stuff. But System 2... Enthusiastic engagement with political monarchy, pro or con, is not something I would like to see become a major feature of Less Wrong, so I think I'm going to publicly commit to posting no more than one NRx comment per month, pending major changes in community dynamics.

Comment author: MichaelAnissimov 19 November 2014 09:35:19AM 7 points [-]

Straightforwardly equating NRx with monarchy is a very surface-level (mis)understanding.

Comment author: skeptical_lurker 18 November 2014 03:47:41PM *  14 points [-]

IANANR,IFIDSIWAPLATMDTTTOMC (I am not a neoreactionary, in fact I don't strongly identify with any political labels at the moment due to the threat of motivated cognition)

But,

I think I have grasped the link between LW and NRx. It's a mixture of having something to protect and extrapolating trends. Whereas singularitarians look at exponential trends in computing, extrapolate, and see a future where some form of superintelligence will surely come to dominate, worrying that human values could be destroyed, the NRx look at the trends of memes and genes, extrapolate the exponential growth, and see a future where their ingroup and values are massively outnumbered, which can be a death sentence in a democracy.

If your terminal values are running against the tide of change, then progressivism is an existential risk. Imagine you believe in God if you do not, and then imagine Christianity going the same way as Norse paganism. Imagine everything you believe gives meaning to life being discarded to the dustbin of history. Or imagine that the positive correlation between religion and fertility reverses the secularisation of society in the long run, and we end up in a totalitarian theocracy. If somehow neither of these futures scares you, keep going until you imagine a future that does.

To put it another way, most people think "this group I disagree with is only 2% of the population. They're not a threat." NRx thinks "This group is only 2% and doubling every x years. Assuming the trend stays constant, how long do I have until they have a democratic majority?".
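
The extrapolation being described is simple exponential arithmetic, and it's easy to make concrete. A quick sketch of the calculation, where the 2% share is the comment's hypothetical and the 20-year doubling time is a made-up number purely for illustration:

```python
import math

def years_to_majority(share, doubling_years):
    """Years until an exponentially growing group passes 50% of the
    population, assuming the doubling trend holds indefinitely and the
    rest of the population stays constant."""
    doublings_needed = math.log2(0.5 / share)
    return doublings_needed * doubling_years

# The comment's hypothetical: a group at 2% of the population.
# With an (invented) doubling time of 20 years:
print(years_to_majority(0.02, 20))  # ~92.9 years
```

Of course, the whole question is whether "assuming the trend stays constant" is warranted; exponential extrapolations of demographic trends usually saturate long before a naive projection suggests.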

That sounded more positive about NRx than I intended. Conversely, while exit is not threatening, NRx taking over society is of course a big threat to anyone with progressive values.

Among the ways NRx differs, I think strategic prioritisation is one of the big points. Even if you believe that homosexuality is a big threat to civilisation (which I emphatically don't), well, there are a lot of homophobes. What is going to be the marginal benefit of one more homophobe? By comparison, one more cryonicist or one more FAI researcher has very large marginal benefit due to the small size of these groups. I find it really strange that Anissimov used to talk about the threat of nanotech/AI/bioterrorism and now talks about the threat of gays and transsexuals. [Edit: I retract this last sentence - apparently I have been misinformed about Anissimov]

Comment author: MichaelAnissimov 19 November 2014 09:32:19AM -2 points [-]

Where have I talked about the threat of gays and transsexuals? I merely asserted that one especially insane transsexual (Justine Tunney) not be associated with a reactionary movement. That makes sense, right?

Comment author: Azathoth123 17 September 2014 03:43:33AM 3 points [-]

I get my news from instapundit.

Comment author: MichaelAnissimov 17 September 2014 12:01:51PM 6 points [-]

Instapundit is a highly ideological libertarian site, so you should balance it out with a reactionary news source like Theden.tv or Steve Sailer.

Comment author: [deleted] 16 September 2014 08:27:47PM *  2 points [-]

I was just pointing out that the links you provided weren't supporting your extraordinary claim that he actively decries the practice of people joining his movement.

What you've written here is not what I claimed three comments ago.

In [Anissimov's] view, NRx is weakened as its popular support increases.

I normally wouldn't care if random person X on the internet thinks I'm wrong about Anissimov, but I'm really tired of people gaslighting me on this. So here is your "extraordinary" evidence that Anissimov believes his movement is weakened by popular support.

To recap, I relayed two separate essays of his in which he holds this value. Emphasis added everywhere by me.

1) "Boundaries":

As neoreaction grows, it is causing people to change the way they think about progressivism, democracy, and modern governance. To maintain this property, it needs to contain a certain concentration of people who understand its principles and can communicate them.

Because neoreaction is quickly growing, it is at risk of becoming diluted by hostile groups. Today, the most prominent adjacent hostile group are libertarians. This puts us in a difficult position because we also gain many new recruits from libertarianism.

[snip...]

The key is to strike a balance; allow room for disagreement, while clarifying that certain minimum standards must be met for someone to qualify as a “neoreactionary”. If any libertarian can call themselves a “neoreactionary” and get away with it, the integrity of the group will be fatally compromised through dilution.

We can quibble about what these "minimum standards" are, but evidently an upper bound on the amount of disagreement possible is given by the whole Justine Tunney incident.

2) "The Kind of People Who Should Be Nowhere Near Neoreaction"

There is definitely a place for being welcoming to curious learners, people who have not accepted right-wing principles yet, etc. I am not objecting to that. But there is a threshold of insanity that should simply be rejected outright. We have a clear example of that here.

Remember my claim earlier was:

In his view, NRx is weakened as its popular support increases.

You've claimed that he's only concerned about NRx's public reputation. To the contrary, he says quite clearly:

The credibility and viability of neoreaction—no matter what its role ultimately may be—is at stake.

3) "Social Conservatism and Drawing a Line in the Sand"

Here we have a more specific version of "NRx'ers must believe at least this much, or else they cannot be called NRx'ers":

There is a certain basic amount of social conservatism which must be met among all those who label themselves “neoreactionary”.

[snip...]

Why would I not feel comfortable saying that the average Democrat is surely too liberal to be called a “neoreactionary”? Shouldn’t that be obvious?

[snip...]

The point is that not everyone is a member of the group. Some people are members and others are not. This seems like it should be obvious, but I actually have to state it here, because of the hyper-inclusive social bias of geeks. [Emphasis in original.]

I feel this should satisfy any reasonable evidential standards to conclude the claim I actually made. Feel free to disagree with me substantially after actually reading Anissimov for yourself.

Comment author: MichaelAnissimov 17 September 2014 06:38:19AM *  6 points [-]

Just because I'm setting boundary conditions does not mean I am generally discouraging people from involvement; that doesn't follow. However, it's true that there's an optimal recruitment rate which is substantially less than the maximum possible recruitment rate. Recruitment rates can be deliberately throttled back by introducing noise, posting more rarely, and through other methods.

NRx would be maximally strengthened if it could recruit as many people as possible while maintaining quality, but realistically we can only add a given number of people per week without compromising quality.

I'm fantastic at engaging others by design; I openly offer to publicly debate people, though only Noah Smith has taken me up on it so far.

Re: ebola, I've never joked about using it as a biological weapon, I'm just responding to the funny meming that's going on on 4chan about ebola.
