Lumifer comments on How could one (and should one) convert someone from pseudoscience? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
That's meaningless hand-waving. Do you have evidence?
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
I don't think it's fair to say that it is meaningless. Surely it must convey some, arguably a lot, of meaning. For example, it includes the advice of making people trust authorities more, and a critique of certain traditional rationalist ideas.
In terms of evidence... well, I don't have scientific evidence, but I do have anecdotes and some theory behind my belief. I can share the anecdotes if you think their details would be relevant, but for now I'll skip them, since they're just anecdotes.
The theory behind my claim can roughly be summed up in a few sentences:
Inside view was what got them into this mess to begin with.
This seems to be something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
Essentially, don't try to teach people 'hard mode' until they can at least survive 'easy mode'.
'Extremely dangerous' could be considered hyperbole; what I meant is that if you push them down into the hole of having ridiculous ideas and knowing everything about biases, you might not ever be able to get them out again.
I don't think the Sequences are that dangerous, because they spend a lot of time trying to get people to see problems in their own thinking (that's the entire point of the Sequences, isn't it?). The problem is that actually doing that is tricky. Eliezer had a lot of community input in writing them, so he has an advantage that the OP doesn't have. Also, he didn't just focus on bias, but also on a lot of other (IMO necessary) epistemological stuff. I think they're hard to use for dismissing any opposing argument.
You want it to, but that doesn't mean it actually happens :-/
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
Avoid things that sound cultish.
etc.
I'm actually saying that everybody, friend or foe, who engages in tribal politics, should be taught to... not engage in tribal politics. And that this should be done before we teach them the most effective arguments, because otherwise they are going to engage in tribal politics.
And why is tribal politics bad? Because it doesn't lead to truth or a better world, but instead to constant disagreement.
Sure. But most of the time, they seem to disagree on facts too.
I think it will backfire less.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.
Remember: I'm not against doing stuff in Inside View, but I think it will be hard to 'fix' completely broken belief systems in that context. You're going to have trouble even agreeing what constitutes a valid argument; having a discussion where people don't just end up more polarized is going to be impossible.
I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it's very hard to get people to change their minds.
So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi?
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
And that's besides the rather obvious power/control issues.
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Once we have the framework, the question is the cause of the dumb masses. Personally, I think it's tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can't really say "the mainstream is silencing my tribe!" without having some important conclusions to draw about your tribe).
It's probably not a good approach for young children or similarly open minds, but we're not working with a blank slate here. Also, it's not like the policies I propose are Dark Side Epistemology; avoiding object level is perfectly sensible if you are not an expert.
Epistemologically, the final arbiter is reality. Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
From the context it seems you associate "mainstream" with "dumb masses", but the popular views are often remarkably uninformed and are also actively shaped by a variety of interests (both political and commercial). I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)
I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
Which you want to wipe down to the indifferent/accepting/passive moron level before starting to do anything useful?
Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.
Perhaps I focused too much on 'mainstream' when I really meant 'outside view'. Obviously, outside view can take both of these into account to different degrees, but essentially, the point is that I think teaching the person to use outside view is better, and outside view is heavily biased (arguably justifiably so) in favor of the mainstream.
But that's my point: a lot of different contrarian groups have what the OP calls "a web of lies that sound quite logical and true". Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Instead, I think you need to get them unstuck using outside view, and then you can teach them how to identify truth correctly.
Yes. The masses try to justify their ingroup, they don't try to seek truth.
The way I see it is this: if I got into a debate with a conspiracy theorist, I'm sure they would have much better object-level arguments than I do; I bet they would be able to consistently win when debating me. The reason for this is that I'm not an expert on their specific conspiracy, while they know every single shred of evidence in favor of their theory. This means that I need to rely on meta-level indicators, like the fact that nobody respects holocaust deniers, to determine the truth of their theories, unless I want to spend huge amounts of time researching them.
Sure, there are cases where I think I can do better than most people (computer science, math, physics, philosophy, gender, generally whatever I decide is interesting and start learning a lot about), and in those cases I'm willing to look at the object level. But otherwise I really don't trust my own ability to figure out the truth - and I shouldn't, because it's necessary to know a lot of the facts before you can even start formulating sensible ideas on your own.
If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that, I really would start out by teaching them how to deal with stuff when you don't understand it in detail, not how to deal with it when you do.
Also my limited experience from LW meetups suggests that people who come there only for the feeling of contrarianism usually avoid reading the Sequences.
Probably for the same reason they also avoid reading a serious textbook on the subjects they have strong opinions about. (I am not saying that the Sequences are a serious textbook, but rather that the dislike towards textbooks also translates to dislike towards the Sequences and probably anything other than sensational online videos).
Thus, ironically despite various accusations against Eliezer and his education, the Sequences can act as a filter against crackpots. (Not a perfect filter, but still.)