Your analysis has implications not only for individuals exposed to unpopular ideas, but also for movements promoting such ideas. These movements (e.g., effective altruism) should be particularly worried about their ideas being represented inadequately by their most radical, disagreeable, or crazy members, and should spend their resources accordingly (e.g., by prioritizing outreach activities, favoring more mainstream leaders, and handling media requests strategically).
Reminds me of my youth, when I was a big fan of Esperanto. There was one mentally not-completely-balanced man in our country who was totally obsessed about this great idea, and kept sending letters to all media, over and over again. Of course he achieved nothing; his letters didn't even make sense. The only real effect was that when we tried to promote something we did in the media, most people after hearing the word immediately remembered this guy and refused to even talk with us.
So yeah, a stupid ally is sometimes worse than an enemy.
The converse claim, that "popular ideas attract disproportionately good advocates," also seems worth attention. People accept sloppy thinking far more readily when they agree with the conclusion. This can be used as a dark art: present a sloppy argument for an obvious truth or an uplifting conclusion, then use the same technique to support the payload. The target is less likely to successfully deploy resistance.
Also, a result that was produced in a rigorous way is quite often rederived in a sloppy way by those who are merely told about the result.
My experience is the opposite. The worst advocates tend toward the popular ideas.
After all, they became the worst advocates by a complete inability to think straight. So they tend to pick their ideas to champion by popularity.
Also, the majority is the audience for everyone, just because it is big. People defending the mainstream grandstand to the mainstream, while people with rare views try to recruit from the mainstream. People with rare views know what the counterarguments are going to be; they pass the ideological Turing test.
I am not the original poster, but most people advocating for the idea that races have genetic differences in IQ are racists, because non-racists don't dare say so in most contexts.
This may also apply in Europe to people opposing immigration--only racists would dare say so because they're so marginalized that they don't take additional hits from it.
most people advocating for the idea that races have genetic differences in IQ are racists, because non-racists don't dare say so in most contexts.
Well, actually most people advocating for the idea that races have genetic differences in IQ are racists because that idea falls under the standard definition of racism.
Yeah, this is definitely false. If reality is "racist," I refuse to fall under such a negative category. Most people in the sensible camp, who can respect individual genetic differences, would also seek to abolish those differences and give everyone a fair chance if we can somehow engineer a superior outcome.
This in-group/out group stuff gets very tiresome when it is used in a sloppy fashion. It does not correct for the percentage of people who "don't give a fuck". There's a difference between LessWrong language use in a decent form and mere abuse.
Most people in the sensical camp of individuals who can respect individual genetic differences would also seek to abolish it
What do you mean by "abolish it"? Do you mean replace all people with identical clones so that no one is smarter (or stronger or has more willpower) than anyone else? Or are individual differences only a problem as long as they correlate with race?
What are you talking about? I meant that once you accept it, we can do something about it. There's no reason to be destructive just because we can recognize reality. Please stop linking to articles; everyone has developed this poor habit. I already accept most of the conclusions you believe.
I was trying to get across that you can be sanguine while acknowledging the reality that exists and looking for ways around it, such as countering dysgenics, etc.
Calling people who recognize racial correlations with intelligence racist is an incorrect appropriation of the term; stretching "racist" to mean something it shouldn't is a weird, trivial sort of technical correctness that is mostly irrelevant. I also think Jiro makes a wildly incorrect claim that most people "wouldn't dare say so in some contexts." Everyone is abusing this in-group/out-group idea; it's a defective tool in this example. There's no reason to have a huge discussion about "unpopular ideas attract poor advocates." The original post stands on largely nothing, and there's no reason for everyone to accept it on a whim and apply it everywhere.
It's like there's some sophisticated markov generator that makes you speak less-wrongesque that aims to maximize insular language while being devoid of content.
Calling people who recognize racial correlations with intelligence racist is an incorrect appropriation of the term
I think a lot of people will disagree.
it's a weird trivial sort of technical correctness that is mostly irrelevant.
So, try declaring in a mainstream public forum that races have significantly different gene-based IQ (I recommend a disposable nym for that). Listen to the names you will be called, see how many commenters will be inclined to exhibit the "weird trivial sort of technical correctness"...
I have done this. People are unskilled at execution. It's not simple, and it takes a bit of care: you have to display empathy, showing that you are uncomfortable with the conclusions, that it isn't something you are happy about or want to believe, and that if anyone is ever going to provide a solution that gives everyone a better chance, we will not get there by making it a crime to think this and organize around it. They just want assurance that you're not the person they read about in the history books.
They just want assurance that you're not the person they read about in the history books.
Seems to me they would want much more, starting with your head on a stick.
They just want assurance that you're not the person they read about in the history books.
The problem is that many of the people "they read about in the history books" did indeed have accurate views on race. Which means the only way to reassure them that you're not that person is to either lie to them about your beliefs or have inaccurate beliefs.
I was mostly reflecting on a pattern in the people I've met, so most of my examples won't be persuasive.
Musing on some less personal examples:
Religious missionaries are selected for atypical faith and a resistance to "leave me alone" social cues. For many people, talking to a more moderate believer would lead to a greater shift in opinion. (Not that the goals of missionary activity are to convince the average person).
People who explicitly advocate for utilitarianism tend to care enough about the system to "bite the bullet" on certain issues, scaring away newcomers. (Peter Singer on bestiality and infanticide, Eliezer on dust specks and torture). People who are vaguely utilitarian but not too concerned about consistency can almost certainly do a better job convincing a non-utilitarian that they should be a bit more utilitarian.
Kant had some actually useful ethical insights, but said some downright stupid things in the application of his ideas (like: You shouldn't lie to a murderer who comes knocking at your door looking for a victim, but you should "be silent" or something).
If you're a progressive with a progressive social circle and want to learn about critiques of progressivism or about conservative thought in general, neo-reaction is about the worst starting place ever. It's like a conservative in the deep south trying to learn about the political left by reading Marx and browsing Tumblr.
The "highest quality" non-commercial strength training materials (meaning they looked really shiny and had the most investment sunk into them) are often extreme and "purity" minded, and were produced by people who trigger the "crank" and "how did THIS become a central feature of your identity?" red flags.
Health innovations in general (some of which I've adopted, albeit in a less extreme form) tend to spread fastest via apocalyptic messengers ("fructose is literally poison"), new-age people, or the self-experiment crowd (who don't upset me, but would strike many people as cranks).
You seem to be conflating two things: people who give logically bad arguments for their positions, and people who say things that trigger listeners' absurdity heuristics.
The more radical positions tend to be more logically coherent, hence easier to logically defend. On the other hand they're also more likely to trigger people's absurdity heuristics.
More moderate positions are harder to defend since you wind up jumping through hoops to explain why your logic doesn't apply to certain cases. This means that in practice the more moderate position functions as a Trojan Horse for the more radical position.
Part of the issue is that what people perceive as "crank" is heavily influenced by what's popular.
Kant had some actually useful ethical insights, but said some downright stupid things in the application of his ideas (like: You shouldn't lie to a murderer who comes knocking at your door looking for a victim, but you should "be silent" or something).
When I was first learning Kant, I also thought this was stupid. But now, after thinking about it a lot more, I can see how this makes sense from a game-theoretic point of view. If you model the murderer as a rational agent with a different utility function, lying isn't a Nash equilibrium; being silent is.
People who explicitly advocate for utilitarianism tend to care enough about the system to "bite the bullet" on certain issues, scaring away newcomers. (Peter Singer on bestiality and infanticide,
Are you one of those awful bestialityphobes or something? :)
NRx got Mike Anissimov, Nick Land, and so on. Even Moldbug wasn't a terribly polished writer.
I know Moldbug is a notorious windbag, but from the little I've read of Anissimov's writings he seems clear and engaging.
The category was "poor advocates of NRx" not "poor writers of NRx."
I imagine Anissimov is engaging when you already subscribe to his memeplex, but he's terrible at engaging with others -- by design! In his view, NRx is weakened as its popular support increases.
Side note: I did read the linked piece, and thought it was quite good, even though I think that neoreaction is fundamentally misguided and potentially disastrous if it ever becomes dominant.
Point taken about the Writing Abilities vs. PR Abilities distinction, which I was steamrolling over.
However, while I don't have much information about Anissimov's public relations abilities (actually, I'm just seeing him pop up on twitter - whatever he's doing with those screenshots of 4chan and jokes(?) about using Ebola as a biological weapon [edit: my comprehension fail, see below], it's probably not a brilliant PR campaign. But I digress), the two links you provided definitely seem to be about projecting a unified public front for the movement and disassociating it from people who confirm generally inaccurate negative stereotypes about it. Suggesting that he's opposed to more people supporting neoreaction in general on that basis seems disingenuous. Nobody wants an eternal September or to look like morons in the mainstream press, but that doesn't mean that they're actively sabotaging their public image or trying not to recruit.
Suggesting that he's opposed to more people supporting neoreaction in general on that basis seems disingenuous.
Examples? Of Anissimov recruiting?
Sorry, just to be clear, I wasn't necessarily disputing your original point, as I don't really know that much about Anissimov. I was just pointing out that the links you provided weren't supporting your extraordinary claim that he actively decries the practice of people joining his movement.
I was just pointing out that the links you provided weren't supporting your extraordinary claim that he actively decries the practice of people joining his movement.
What you've written here is not what I claimed three comments ago.
In [Anissimov's] view, NRx is weakened as its popular support increases.
I normally wouldn't care if random person X on the internet thinks I'm wrong about Anissimov, but I'm really tired of people gaslighting me on this. So here is your "extraordinary" evidence that Anissimov believes his movement is weakened by popular support.
To recap, I relayed two separate essays of his in which he holds this value. Emphasis added everywhere by me.
1) "Boundaries":
As neoreaction grows, it is causing people to change the way they think about progressivism, democracy, and modern governance. To maintain this property, it needs to contain a certain concentration of people who understand its principles and can communicate them.
Because neoreaction is quickly growing, it is at risk of becoming diluted by hostile groups. Today, the most prominent adjacent hostile group are libertarians. This puts us in a difficult position because we also gain many new recruits from libertarianism.
[snip...]
The key is to strike a balance; allow room for disagreement, while clarifying that certain minimum standards must be met for someone to qualify as a “neoreactionary”. If any libertarian can call themselves a “neoreactionary” and get away with it, the integrity of the group will be fatally compromised through dilution.
We can quibble about what these "minimum standards" are, but evidently an upper bound on the amount of disagreement possible is given by the whole Justine Tunney incident.
2) "The Kind of People Who Should Be Nowhere Near Neoreaction"
There is definitely a place for being welcoming to curious learners, people who have not accepted right-wing principles yet, etc. I am not objecting to that. But there is a threshold of insanity that should simply be rejected outright. We have a clear example of that here.
Remember my claim earlier was:
In his view, NRx is weakened as its popular support increases.
You've claimed that he's only concerned about NRx's public reputation. To the contrary, he says quite clearly:
The credibility and viability of neoreaction—no matter what its role ultimately may be—is at stake.
3) "Social Conservatism and Drawing a Line in the Sand"
Here we have a more specific version of "NRx'ers must believe at least this much, or else they cannot be called NRx'ers":
There is a certain basic amount of social conservatism which must be met among all those who label themselves “neoreactionary”.
[snip...]
Why would I not feel comfortable saying that the average Democrat is surely too liberal to be called a “neoreactionary”? Shouldn’t that be obvious?
[snip...]
The point is that not everyone is a member of the group. Some people are members and others are not. This seems like it should be obvious, but I actually have to state it here, because of the hyper-inclusive social bias of geeks. [Emphasis is original.]
I feel this should satisfy any reasonable evidential standards to conclude the claim I actually made. Feel free to disagree with me substantially after actually reading Anissimov for yourself.
Just because I'm setting boundary conditions does not mean I am generally discouraging people from involvement, that doesn't follow. However, it's true that there's an optimal recruitment rate which is substantially less than the maximum possible recruitment rate. Recruitment rates can be deliberately throttled back by introducing noise, posting more rarely, and through other methods.
NRx would be maximally strengthened if it could recruit as many people as possible while maintaining quality, but realistically we can only add a given number of people per week without compromising quality.
I'm fantastic at engaging others by design, I openly offer to publicly debate people, only Noah Smith has taken me up on it so far.
Re: ebola, I've never joked about using it as a biological weapon, I'm just responding to the funny meming that's going on on 4chan about ebola.
You should have heard of religious or other extremists.
People with unpleasant characteristics are probably the most common, but then you mostly encounter them holding less extreme positions.
Cranks can be found via http://www.crank.net/science.html
Trotskyism is represented in the US by the Spartacist League. The members do not seem well adjusted socially.
Interesting post!
I have a feeling like there is a deep connection between this and the evaporative cooling effect (more moderate members of a group are more likely to leave when a group's opinion gets too extreme, thereby increasing the ratio of extremists and making the group even more extreme). Like there ought to be a social theory that explains both effects. I can't quite put my finger on it, though. Any ideas?
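One way to see the evaporative-cooling half of this is a toy simulation (all parameters here are made up for illustration, not a calibrated model): members hold opinions on a 0-1 "extremity" scale, and each round the members who sit too far below the group mean get uncomfortable and leave, which drags the mean further toward the extreme.

```python
import random

random.seed(0)

def simulate(n=1000, tolerance=0.25, rounds=10):
    # Initial opinions drawn uniformly from [0, 1].
    members = [random.random() for _ in range(n)]
    means = [sum(members) / len(members)]
    for _ in range(rounds):
        mean = sum(members) / len(members)
        # Moderates (more than `tolerance` below the mean) exit;
        # extremists stay, so the mean can only ratchet upward.
        members = [m for m in members if mean - m <= tolerance]
        means.append(sum(members) / len(members))
    return means

means = simulate()
print(f"mean extremity drifts from {means[0]:.2f} to {means[-1]:.2f}")
```

Under these assumptions the mean rises monotonically and settles near a fixed point; a unified theory would presumably add the advocate-selection effect as a second filter on who speaks, not just on who stays.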
To address the "meta" of this...
Is the position of being charitable to unpopular ideas itself an unpopular idea? I'm not even sure how you would measure the "popularity" of an idea objectively in a world where OCR scripts and bots can hack SEO and push social media memes without a single human interaction...
And using Western psychological models regarding the analysis of such things is certainly bound to be rife with the feedback of cultural bias...
And is my response an unpopular idea?
Because of these issues, I find the reasoning demonstrated here to be circular. This premise requires a more rigorous definition of the term "popularity" which, as far as I can tell, cannot be done objectively since the concept is extremely context sensitive.
I think what the idea in the post does is that it gets at the curvature of the space, so to speak.
On one hand, unpopular ideas are disproportionately likely to be advocated by disagreeable people. On the other hand, those who hold unpopular positions often have to defend their views and be familiar with the opinions of their opponents, while the proponents of popular views may not be familiar with the arguments for unpopular views. For example, outspoken atheists are likely to be more disagreeable, but they're also more likely to be familiar with religious people's arguments than the typical religious person is familiar with arguments for atheism.
Interesting post!
Another reason to be charitable: these "poor advocates", by virtue of being marginalized/unpopular/cranks may have fewer disincentives to say "the emperor has no clothes", because their standing is already low. Once they put an idea out there, it may gain traction with a greater chunk of the population. Unfortunately, this dynamic leads to "autism is caused by vaccines" movements too.
If you're interested in the topic I highly recommend this BloggingHeads episode: http://bloggingheads.tv/videos/30467 specifically the "emperor has no clothes" and "tokenism" sections (there are links to those segments under the video).
Great. I added this to my list of life-lessons.
Also: This is related to Dangers of steelmanning / principle of charity
Interesting and useful post!
But on your last bullet, you seem to be conflating 'leadership' with 'people presenting the idea'. I'm not sure they are always the same thing: the 'leaders' of any group are quite often going to be there because they're good at forging consensus and/or because they have general social/personal skills that stop them appearing like cranks.
Take a fringe political party: I would guess that people promoting that party down the pub or in online comments on newspaper websites or whatever are more likely to be the sort of advocate you describe. But in all but the smallest fringe parties, you'd expect the actual leadership to have rather more political skill.
Unfamiliar or unpopular ideas will tend to reach you via proponents who:
...hold extreme interpretations of these ideas.
...have unpleasant social characteristics.
...generally come across as cranks.
That depends very much on the company you keep. If you are friends with a bunch of people who like to argue contrarian positions, then it's not true.
What is up with main?
The recent posts to main right now are: panhandling, awkward paeans to rationality, and impressive technical posts that I don't understand. (And a meet-up post).
It's been that way for a while. The standards are so high that there's rarely any content there that wasn't grandfathered in.
Unfamiliar or unpopular ideas will tend to reach you via proponents who:
The basic idea: It's unpleasant to promote ideas that result in social sanction, and frustrating when your ideas are met with indifference. Both situations are more likely when talking to an ideological out-group. Given a range of positions on an in-group belief, who will decide to promote the belief to outsiders? On average, it will be those who believe the benefits of the idea are large relative to in-group opinion (extremists), those who view the social costs as small (disagreeable people), and those who are dispositionally drawn to promoting weird ideas (cranks).
I don't want to push this pattern too far. This isn't a refutation of any particular idea. There are reasonable people in the world, and some of them even express their opinions in public (in spite of being reasonable). And sometimes the truth will be unavoidably unfamiliar and unpopular, etc. But there are also...
Some benefits that stem from recognizing these selection effects:
I think the first benefit listed is the most useful.
To sum up: an unpopular idea will tend to get poor representation for social reasons, which will make it seem like a worse idea than it really is, even granting that many unpopular ideas are unpopular for good reason. So when you encounter an idea that seems unpopular, you're probably hearing about it from a sub-optimal source, and you should try to be charitable toward the idea before dismissing it.
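The selection mechanism the thread keeps circling (extremists, disagreeable people, and cranks disproportionately self-select into public advocacy) can be illustrated with a toy model; the decision rule and all parameters here are hypothetical, chosen only to make the skew visible.

```python
import random

random.seed(1)

# Each believer has an opinion extremity (how strong their version of
# the idea is) and an agreeableness (how much social sanction hurts).
def sample_population(n=10_000):
    return [(random.random(), random.random()) for _ in range(n)]

def advocates(pop, sanction=1.0):
    # Hypothetical rule: speak up iff the perceived benefit
    # (proportional to extremity) exceeds the social cost
    # (the ambient sanction, scaled by agreeableness).
    return [(e, a) for e, a in pop if e > sanction * a]

pop = sample_population()
adv = advocates(pop)
mean = lambda xs: sum(xs) / len(xs)
print(f"mean extremity: population {mean([e for e, _ in pop]):.2f}, "
      f"advocates {mean([e for e, _ in adv]):.2f}")
```

Under this rule the advocates you actually hear from average noticeably more extreme and less agreeable than the believers as a whole, which is the summary's point: the messenger pool is a filtered, unrepresentative sample of the idea's supporters.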