Lumifer comments on How could one (and should one) convert someone from pseudoscience? - Less Wrong Discussion
LOL. Word inflation strikes again with a force of a million atomic bombs! X-)
Are you really arguing for keeping ideologically incorrect people barefoot and pregnant, lest they harm themselves with any tools they might acquire?
No, but attempting to go from irrational contrarian to rational contrarian (thinking about arguments, for instance by considering fallacies, is contrarian-ish) without passing through majoritarian seems like something that could really easily backfire.
That's meaningless hand-waving. Do you have evidence?
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
I don't think it's fair to say that it is meaningless. Surely it must convey some, arguably a lot, of meaning. For example, it includes the advice of making people trust authorities more, and a critique of certain traditional rationalist ideas.
In terms of evidence... well, I don't have scientific evidence, but I do have anecdotes and some theory behind my belief. I can write up the anecdotes if you think the details would be relevant, but for now I'll skip them, since they're just anecdotes.
The theory behind my claim can roughly be summed up in a few sentences:
Inside view was what got them into this mess to begin with.
This seems to be something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
Essentially, don't try to teach people 'hard mode' until they can at least survive 'easy mode'.
'Extremely dangerous' could be considered hyperbole; what I meant is that if you push them down into the hole of having ridiculous ideas and knowing everything about biases, you might never be able to get them out again.
I don't think the Sequences are that dangerous, because they spend a lot of time trying to get people to see problems in their own thinking (that's the entire point of the Sequences, isn't it?). The problem is that actually doing that is tricky. Eliezer had a lot of community input in writing them, so he has an advantage that the OP doesn't have. Also, he didn't focus just on bias, but also on a lot of other (IMO necessary) epistemological stuff. I think they're hard to use for dismissing any opposing argument.
You want it to, but that doesn't mean it actually happens :-/
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
Avoid things that sound cultish.
etc.
I'm actually saying that everybody, friend or foe, who engages in tribal politics, should be taught to... not engage in tribal politics. And that this should be done before we teach them the most effective arguments, because otherwise they are going to engage in tribal politics.
And why is tribal politics bad? Cuz it doesn't lead to truth/a better world, but instead to constant disagreement.
Sure. But most of the time, they seem to disagree on facts too.
I think it will backfire less.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.
Remember: I'm not against doing stuff in Inside View, but I think it will be hard to 'fix' completely broken belief systems in that context. You're going to have trouble even agreeing what constitutes a valid argument; having a discussion where people don't just end up more polarized is going to be impossible.
I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it's very hard to get people to change their minds.
So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi? ("What is permitted to Jove is not permitted to an ox.")
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
And that's besides the rather obvious power/control issues.
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Once we have the framework, the question is the cause of the dumb masses. Personally, I think it's tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can't really say "the mainstream is silencing my tribe!" without having some important conclusions to draw about your tribe).
It's probably not a good approach for young children or similarly open minds, but we're not working with a blank slate here. Also, it's not like the policies I propose are Dark Side Epistemology; avoiding object level is perfectly sensible if you are not an expert.
Also, my limited experience from LW meetups suggests that people who come there only for the feeling of contrarianism usually avoid reading the Sequences.
Probably for the same reason they also avoid reading a serious textbook on the subjects they have strong opinions about. (I am not saying that the Sequences are a serious textbook, but rather that the dislike towards textbooks also translates to dislike towards the Sequences and probably anything other than sensational online videos).
Thus, ironically despite various accusations against Eliezer and his education, the Sequences can act as a filter against crackpots. (Not a perfect filter, but still.)
Hey! Hey. Heh. Careful there, apropos word inflation. It strikes with a force of no more than one thousand atom bombs.
Sounds as good a reason as any!
I'm not sure how much it counts, but I bet Chief Ramsay would've shut it down long ago. Betting is good, I've learned.
Extremely dangerous stuff, that...
But if betting is good, pre-commitment and co-operation are the best! X-)
Knowing About Biases Can Hurt People has already been linked in this thread here. It seems to be the steelman of tailcalled's position and I suggest you argue against it instead of trying to score cheap points by pointing out how tailcalled uses "wrong" words to express himself.
I am not much concerned about "wrong" words other than that it might generate misunderstanding and confusion, but it does seem to me that I and tailcalled have real (not definitional) differences and disagreements.
I argue with live, present people. If you want to point out the many ways in which I'm wrong, jump in :-) But I am not going to argue with texts -- among other things, they don't answer back.