Comment author: David_Gerard 19 November 2014 12:04:37AM *  -27 points [-]

The fact that this is "catnip" for LW-ers is a bad thing. We ought to be giving neoreaction about as much credence as we give Creationism: it's founded on bad ethics, false facts, and bad reasoning, and should be dismissed, not discussed to death.

I note (and others have noted) that SSC, although hosting the definitive NRx takedown, still puts NRx ideas in the sphere of things to be discussed calmly with steelmanning; whereas it reacts with actual disgust and lack of philosophical charity to feminism, social justice, Tumblr, etc. And that Yvain was literally surprised to find himself becoming more right-wing after hanging around neoreactionaries, i.e. that he was picking up his ideas from his friends.

Comment author: HBDfan 21 November 2014 12:56:15AM 2 points [-]

These things are disgusting. Slate Star is becoming more reasonable.

Comment author: MichaelAnissimov 20 November 2014 08:54:29PM 7 points [-]

In reference to your first comment, basically yes.

1) The only reason I joined this thread in the first place is that someone attacked me. I don't particularly advocate neoreaction among LW groups, because I understand the community is hyper-liberalized to the point of absurdity.

2) Yes, my estimates of when the Singularity will occur moved from 2030-2040 to 2070-2080 over the last five years. This change is partially what has caused the neoreaction thing. I think there is a real risk that Western civilization will fall apart before we get there.

Comment author: HBDfan 21 November 2014 12:52:55AM 0 points [-]

The LW tone has improved this year and this post is refreshing.

Comment author: Nornagest 20 November 2014 10:06:17PM *  8 points [-]

I'm not sure where you're getting that idea. Many schools of political thought harbor totalizing sects: market capitalism has Randianism, socialism has hardline communism. That doesn't make any of them wrong, and it definitely doesn't make them incommensurate in practical terms; it just means that they're attractive, under the right circumstances, to a personality type with a certain set of preoccupations. It also means that they are, potentially, epistemically dangerous, but that's a separate issue and still doesn't mean they're wrong per se.

My original thought was simply that, the personalities and cultural backgrounds of LW contributors being what they are, a strongly politicized LW would produce more SJWs than opposition thereunto. Am I wrong?

Comment author: HBDfan 21 November 2014 12:37:30AM 3 points [-]

The tone of LessWrong is against indulging social justice. It has improved this year in this regard, leading to this post.

Comment author: bramflakes 19 November 2014 11:44:02PM 20 points [-]

I read about HBD first and then NRx second. I couldn't have a sensible conversation about it with anybody I knew due to the prevailing progressive memeplex - for example, my History teacher once claimed that war was nonexistent in pre-agriculture societies due to it being economically unsustainable (I just about managed to avoid giving myself a concussion from slamming my head on the table). I knew cracks were appearing in the Narrative after I read the Blank Slate, and I knew I had to jettison it entirely once I finished The Bell Curve.

But what to replace it with? Mainstream conservatism was as clueless as progressivism, and while individual libertarians might have had the right mindset to discuss the issue if you framed it the right way, their answers were unsatisfying. Then one day, someone on LW linked to Moldbug - and here suddenly was a whole other narrative that made a lot more sense. It wasn't about HBD as such, but an account of the Progressive idea machine that explained why it was so taboo. I toyed with some of the weirder aspects for a while (Patchwork and Corporate Governance) but eventually gave them up for similar reasons to libertarianism (in a word: too spergy).

I wouldn't call myself a Neoreactionary. My beliefs are somewhere in between paleocon and the Traditionalist branch of NRx. In an entirely separate part of my brain there's also an active transhumanist who is annoyed that this contrarian upstart is getting all the cognitive attention, and Anissimov's early post about transhumanist/NRx synthesis hasn't properly bridged the gap. I don't know what I'll believe in a year or two.

Comment author: HBDfan 20 November 2014 10:02:03PM *  3 points [-]

Libertarianism is insufficient because most people are easily led; they will not take freedom. Freedom is hard work, and freedom is frightening. Neoreaction follows from libertarianism with a more secure possible future: technology provides wealth, and being with your group provides security in society.

Comment author: Nornagest 20 November 2014 06:35:42PM *  9 points [-]

I think many of the narratives that come under the general heading of "neoreaction" are totalizing. I don't think neoreaction as a whole forms a coherent narrative in this sense, although I haven't read enough different neoreactionaries to be totally confident that I'm not just overfitting on a couple of outliers.

I wasn't talking primarily about NRx, though.

Comment author: HBDfan 20 November 2014 09:59:19PM *  3 points [-]

This would rule out all speech about politics, though, by anyone.

Comment author: Nornagest 20 November 2014 06:27:12PM *  12 points [-]

If the population of LW were to turn its eye to politics (more, that is, than it already has), I'm not at all sure that you'd get less social justice activism afterwards -- and I say this as someone that's no great fan of the social justice movement in its modern form.

When geeks like us get political, we usually start from cultural background noise and round off to the nearest coherent narrative: that is, the nearest one that can explain all the observations, whether through direct modeling or through some form of self-delusion narrative. That might sound like a good thing, but it's really not; "coherent" in politics usually means "totalizing".

Comment author: HBDfan 20 November 2014 06:33:03PM *  6 points [-]

Do you mean that neoreaction is totalizing, or that individual neoreactionaries at least are totalizing?

Comment author: HBDfan 20 November 2014 06:00:58PM -3 points [-]

It has robust answers that hold true. LessWrong needs to use rationality to speak out against the social justice warriors more. We need more rationalists to explain Gamergate and other initiatives. SSC and Ozy come out in favor of Gamergate and Eron Gjoni, for example. Politics need not be the mind-killer if sufficient working is shown.

Comment author: [deleted] 20 November 2014 04:36:22AM 10 points [-]

That being said, there's ample discussion already on Slate Star Codex, and I wouldn't want to see it crowding out other topics here.

I keep hearing people say this. This is a rationalist site; why hasn't anyone gone out and generated some statistics?

Comment author: HBDfan 20 November 2014 01:07:09PM *  -1 points [-]

I'm pleased to see more neoreaction here. This post gives me confidence to come back.

LessWrong needs to use rationality to speak out against the social justice warriors more. We need more rationalists to explain Gamergate and other initiatives. SSC and Ozy come out in favor of Gamergate and Eron Gjoni, for example. Politics need not be the mind-killer if sufficient working is shown.

Comment author: HBDfan 30 April 2013 08:59:05AM 15 points [-]

Please taboo "creepy".

Comment author: Chrysophylax 29 January 2013 07:21:43PM -1 points [-]

Large corporations are not really very like AIs at all. An Artificial Intelligence is an intelligence with a single utility function, whereas a company is a group of intelligences with many complex utility functions. I remain unconvinced that aggregating intelligences and applying the same terms is valid - it is, roughly speaking, like trying to apply chromodynamics to atoms and molecules. Maximising shareholder value is also not a simple problem to solve (if it were, the stock market would be a lot simpler!), especially since "shareholder value" is a very vague concept. In reality, large corporations almost never seek to maximise shareholder value (that is, in theory one might, but I can't actually imagine such a firm). The relevant terms to look up are "satisficing" and "principal-agent problem".

This rather spoils the idea of firms being intelligent - the term does not appear applicable (which is, I think, Eliezer's point).
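The contrast between maximizing and satisficing mentioned above can be made concrete. A minimal sketch (the option names and payoffs are illustrative assumptions): a maximizer searches for the single best option, while a satisficer accepts the first option that clears a "good enough" threshold — which is closer to how large organizations actually behave.

```python
# Maximiser vs satisficer over the same set of options.
# The options and payoffs here are invented for illustration.
options = {"plan_a": 7, "plan_b": 9, "plan_c": 8}

def maximise(opts):
    """Pick the option with the highest payoff (requires scoring everything)."""
    return max(opts, key=opts.get)

def satisfice(opts, threshold):
    """Pick the first option meeting the threshold, in the order encountered."""
    for name, payoff in opts.items():
        if payoff >= threshold:
            return name
    return None

best = maximise(options)                        # scans all options
good_enough = satisfice(options, threshold=7)   # stops at the first acceptable one
```

Note that the satisficer settles on `plan_a` even though `plan_b` is strictly better — it never looks that far, which is the point.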

Comment author: HBDfan 28 April 2013 03:58:01PM 1 point [-]

Corporations do not have a utility function, or at least they do not have a single one; they have many utility functions. You might "money pump" the corporation.
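The "money pump" is the standard decision-theoretic argument: an agent with cyclic (intransitive) preferences will pay a small fee for each step around the preference cycle, returning to where it started while losing money on every lap. A minimal sketch, with an assumed three-item cycle and per-trade fee chosen purely for illustration:

```python
# Money-pump sketch: an agent with cyclic preferences (A < B, B < C, C < A)
# pays a small fee for each "upgrade" and can be cycled indefinitely.

# prefers[x] is the item the agent will pay to swap x for.
prefers = {"A": "B", "B": "C", "C": "A"}

def run_pump(start_item, fee, laps):
    """Charge `fee` per swap; return (final item, total extracted) after `laps` cycles."""
    item, extracted = start_item, 0.0
    for _ in range(laps * len(prefers)):  # each lap around the cycle is 3 swaps
        item = prefers[item]              # agent accepts the trade...
        extracted += fee                  # ...and pays the fee
    return item, extracted

item, total = run_pump("A", fee=1.0, laps=10)
# After 10 laps the agent holds the same item it started with,
# having paid 30 units for nothing.
```

A group of agents with conflicting preferences (like a corporation) can exhibit exactly this kind of cyclic collective preference even when every individual member is coherent, which is the force of the comment above.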
