Lumifer comments on How could one (and should one) convert someone from pseudoscience? - Less Wrong Discussion

10 Post author: Vilx- 05 October 2015 11:53AM

Comment author: Lumifer 05 October 2015 05:56:03PM *  -1 points [-]

Surely it must convey some, arguably a lot, of meaning.

You want it to, but that doesn't mean it actually happens :-/

Inside view was what got them into this mess to begin with.

Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.

This seems to be something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.

Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".

People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.

That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.

Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?

Comment author: tailcalled 05 October 2015 06:18:09PM *  0 points [-]

Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.

By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:

  • Assume people who claim to be better than the mainstream are wrong.

  • Pay greater attention to authority than arguments.

  • Avoid things that sound cultish.

  • etc.

Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".

I'm actually saying that everybody, friend or foe, who engages in tribal politics, should be taught to... not engage in tribal politics. And that this should be done before we teach them the most effective arguments, because otherwise they are going to engage in tribal politics.

And why is tribal politics bad? Cuz it doesn't lead to truth/a better world, but instead to constant disagreement.

That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.

Sure. But most of the time, they seem to disagree on facts too.

Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?

I think it will backfire less.

Comment author: Lumifer 05 October 2015 06:40:03PM 3 points [-]

By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:

  • Assume people who claim to be better than the mainstream are wrong.

  • Pay greater attention to authority than arguments.

In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.

because otherwise they are going to engage in tribal politics.

Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.

Comment author: tailcalled 05 October 2015 07:13:18PM 0 points [-]

In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.

Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.

Remember: I'm not against doing stuff in Inside View, but I think it will be hard to 'fix' completely broken belief systems in that context. You're going to have trouble even agreeing on what constitutes a valid argument; having a discussion where people don't just end up more polarized is going to be impossible.

Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.

I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it's very hard to get people to change their minds.

Comment author: Lumifer 05 October 2015 07:29:49PM *  3 points [-]

Well, that makes sense for people who ... Less so for people who have trouble with rationality.

So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi?

Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.

And that's besides the rather obvious power/control issues.

Comment author: tailcalled 05 October 2015 07:51:29PM 0 points [-]

So, is this an elites vs dumb masses framework?

For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.

Once we have the framework, the question is the cause of the dumb masses. Personally, I think it's tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can't really say "the mainstream is silencing my tribe!" without having some important conclusions to make about your tribe).

Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.

It's probably not a good approach for young children or similarly open minds, but we're not working with a blank slate here. Also, it's not like the policies I propose are Dark Side Epistemology; avoiding object level is perfectly sensible if you are not an expert.

Comment author: Lumifer 05 October 2015 08:17:40PM *  1 point [-]

For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.

Epistemologically, the final arbiter is reality. Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.

From the context it seems you associate "mainstream" with "dumb masses", but the popular views are often remarkably uninformed and are also actively shaped by a variety of interests (both political and commercial). I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)

the question is the cause of the dumb masses. Personally, I think it's tribal stuff

I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??

but we're not working with a blank slate here

Which you want to wipe down to the indifferent/accepting/passive moron level before starting to do anything useful?

avoiding object level is perfectly sensible if you are not an expert.

Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.

Comment author: tailcalled 05 October 2015 08:49:07PM 0 points [-]

Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.

Perhaps I focused too much on 'mainstream' when I really meant 'outside view'. Obviously, outside view can take both of these into account to different degrees, but essentially, the point is that I think teaching the person to use outside view is better, and outside view is heavily biased (arguably justifiably so) in favor of the mainstream.

I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)

But that's my point: a lot of different contrarian groups have what the OP calls "a web of lies that sound quite logical and true". Do you really think you can teach them how to identify such a web of lies while they are stuck in one?

Instead, I think you need to get them unstuck using outside view, and then you can teach them how to identify truth correctly.

I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??

Yes. The masses try to justify their ingroup, they don't try to seek truth.

Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.

The way I see it is this: if I got into a debate with a conspiracy theorist, I'm sure they would have much better object-level arguments than I do; I bet they would be able to consistently win when debating me. The reason for this is that I'm not an expert on their specific conspiracy, while they know every single shred of evidence in favor of their theory. This means that I need to rely on meta-level indicators, like nobody respecting Holocaust deniers, in order to determine the truth of their theories, unless I want to spend huge amounts of time researching them.

Sure, there are cases where I think I can do better than most people (computer science, math, physics, philosophy, gender, generally whatever I decide is interesting and start learning a lot about), and in those cases I'm willing to look at the object level. But otherwise I really don't trust my own ability to figure out the truth - and I shouldn't, because it's necessary to know a lot of the facts before you can even start formulating sensible ideas on your own.

If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that, I really would start out by teaching them how to deal with stuff when you don't understand it in detail, not how to deal with it when you do.

Comment author: Lumifer 06 October 2015 06:01:47PM *  1 point [-]

when I really meant 'outside view'

Let's sort out the terminology. I think we mean different things by "outside view".

As far as I understand you, for you the "outside view" means not trying to come to any conclusions on your own, but rather accept what the authorities (mainstream, experts, etc.) tell you. Essentially, when you recommend "outside view" to people you tell them not to think for themselves but rather accept what others are telling them (see e.g. here).

I understand "outside view" a bit more traditionally (see e.g. here) and treat it as a forecasting technique. Basically, when you want to forecast something using the inside view, you treat that something as 'self-propelled', in a way, you look at its internal workings and mechanisms to figure out what will happen to it. If you take the outside view, on the other hand, you treat that something as a black box that is moved primarily by external forces and so to forecast you look at these external forces and not at the internals.

Given this, I read your recommendation "teaching the person to use outside view is better" as "teach the person to NOT think for himself, but accept whatever most people around think".

I disagree with this recommendation rather strongly.

Do you really think you can teach them how to identify such a web of lies while they are stuck in one?

Why, yes, I do. In fact, I think it's the normal process of extracting oneself from "a web of lies" -- you start by realizing you're stuck in one. Of course, no one said it would be easy.

An example -- religious deconversion. How do you think it will work in your system?

Yes. The masses try to justify their ingroup, they don't try to seek truth.

Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy "smartness") and the strength of tribal affiliation. Do we observe it? The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?

if I got into a debate with a conspiracy theorist ...I bet they would be able to consistently win when debating me.

I don't know about that. You think of winning a debate in high-school debate club terms, or maybe in a TV debate terms -- the one who scores the most points with the judges wins. That's not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.

otherwise I really don't trust my own ability to figure out the truth

That's probably the core difference that leads to our disagreements. I do trust my own ability (the fact that I'm arrogant should not be a surprise to anyone). Specifically, I trust my own ability more than I trust the mainstream opinion.

Of course, my opinion and the mainstream opinion coincide on a great deal of mundane things. But when they don't, I am not terribly respectful of the mainstream opinion and do not by default yield to it.

In fact, I don't see how your approach is compatible with being on LW. Let's take Alice who is a LessWrongian and is concerned about FAI risks. And let's take Bob who subscribes to your approach of deferring to the mainstream.

Alice goes: "I'm concerned about the risk of FAI."

Bob: "That's silly. You found yourself a cult with ridiculous ideas. Do you have a Ph.D. in Comp Sci or something similar? If not, you should not try to have your own opinion about things you do not understand. Is the mainstream concerned about FAI? It is not. So you should not be, either."

What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.

If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that

I don't think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.

Comment author: tailcalled 06 October 2015 07:09:46PM 1 point [-]

I understand "outside view" a bit more traditionally and treat it as a forecasting technique.

The thing is, you can apply it more widely than just forecasting. Forecasting is just trying to figure out the future, and there's no reason you should limit yourself to the future.

Anyway, the way I see it, in inside view, both when forecasting and when trying to figure out truth, you focus on the specific problem you are working on, try to figure out its internals, etc. In outside view, you look at things outside the problem, like the track record of similar things (which I, in my list, called "looks like cultishness"; arguably I could have named that better), others' expectations of your success (hey bank, I would like to borrow money to start a company! what, you don't believe I will succeed?), etc. Perhaps 'outside view' isn't a good term either (which kinda justifies me calling it majoritarianism to begin with...), but whatever. Let's make up some new terms, how about calling them the helpless and the independent views?

Why, yes, I do. In fact, I think it's the normal process of extracting oneself from "a web of lies" -- you start by realizing you're stuck in one. Of course, no one said it would be easy.

Well, how often does it happen?

An example -- religious deconversion. How do you think it will work in your system?

How much detail do you want it in and how general do you want it to be? What is the starting point of the person who needs to be deconverted? Actually, to skip all these kinds of questions, could you give an example of how you would write how deconversion would work in your system?

Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy "smartness") and the strength of tribal affiliation. Do we observe it?

IQ != rationality. I don't know if there is a correlation, and if there is one, I don't know in which direction. Eliezer has made a good argument that higher IQ gives a wider possible range of rationality, but I don't have the evidence to support that.

Anyway, I at least notice that the times where people are wrong, it's often because they try to signal loyalty to their tribe (of course, there often is an opposing tribe that is correct on the question where the first one was wrong...). This is anecdotal, though, so YMMV. What do you observe? That people who have made certain answers to certain questions part of their identity are more likely to be correct?

The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?

...probably? Not so much with military conflicts, because you are not doing as much politics as you are doing fighting, but I generally see that if a discussion becomes political, everybody starts saying stupid stuff.

I don't know about that. You think of winning a debate in high-school debate club terms, or maybe in a TV debate terms -- the one who scores the most points with the judges wins. That's not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.

But the only reason I don't get convinced is because of the helpless view (and, of course, things like tribalism, but let's pretend I'm a bounded rationalist for simplicity). In the independent view, I see lots of reasons for believing him, and I have no good counterarguments. I mean, I know that I can find counterarguments, but I'm not going to do that after the debate.

In fact, I don't see how your approach is compatible with being on LW.

Again, I believe in an asymmetry between people who have internalized various lessons on tribalism and other people. I agree that if I did not believe in that asymmetry, I would not have good epistemic reasons for being on LW (though I might have other good reasons, such as entertainment).

What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.

"Smart people like Bill Gates, Stephen Hawking and Elon Musk are worried about AI along with a lot of experts on AI."

This should also be a significant factor in her belief in AI risk; if smart people or experts weren't worried, she should not be either.

I don't think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.

I've been in a high-IQ club and not all of them are rational. Take selection effects into account and we might very well end up with a lot of irrational high-IQ people.