Eugine_Nier comments on [LINK] Why I'm not on the Rationalist Masterlist - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Most of these are not dominant on LW, but come up often enough to make me twitchy. I am not interested in debating or discussing the merits of these points here because that's a one-way track to a flamewar this thread doesn't need.
The stronger forms of evolutionary psychology and human-diversity stuff. High confidence that most/all demographic disparities are down to genes. The belief that LessWrong being dominated by white male technophiles is more indicative of the superior rationality of white male technophiles than any shortcomings of the LW community or society-at-large.
Any and all neoreactionary stuff.
High-confidence predictions about the medium-to-far future (especially ones that suggest sending money).
Throwing the term "eugenics" around cavalierly and assuming that everyone knows you're talking about benevolent genetic engineering and not forcibly-sterilizing-people-who-don't-look-like-me.
There should be a place to discuss these things, but it probably shouldn't be on a message board dedicated to spreading and refining the art of human rationality. LessWrong could easily be three communities:
a rationality forum (based on the sequences and similar, focused on technique and practice rather than applying to particular issues)
a transhumanist forum (for existential risk, cryonics, FAI and similar)
an object-level discussion/debate forum (for specific topics like feminism, genetic engineering, neoreactionism, etc).
I'm not sure that would work. After all, Bayes's rule has fairly obvious unPC consequences when applied to race or gender, and thinking seriously about transhumanism will require dealing with eugenics-like issues.
“rather than applying to particular issues”
That would simply result in people treating Bayesianism as if it's a separate magisterium from everyday life.
Think of it as the no-politics rule turned up to 11. The point is not that these things can't be reasoned about, but that the strong (negative/positive) affect attached to certain things makes them ill-suited to rationalist pedagogy.
Lowering the barrier to entry doesn't mean you can't have other things further up the incline, though.
Datapoint: I find that I spend more time reading the politically-charged threads and subthreads than other content, but get much less out of them. They're like junk food; interesting but not useful. On the other hand, just about anywhere other than LW, they're not even interesting.
(On running a memory-check, I find that observation applies mostly to comment threads. There have been a couple of top-level political articles that I genuinely learned something from.)