That seems extremely dangerous.
Everything is dangerous.
If it works, it can be misapplied.
If it doesn't work, it displaces effort from things that do work.
That seems extremely dangerous.
Everything is dangerous.
If it works, it can be misapplied.
If it doesn't work, it displaces effort from things that do work.
Sure, but inside view/contrarianism/knowledge of most biases seem like things that ideally should be reserved for when you know what you're doing, which the person described in the OP probably doesn't.
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Epistemologically, the final arbiter is reality. Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
From the context it seems you associate "mainstream" with "dumb masses", but the popular views are often remarkably uninformed and are also actively shaped by a variety of interests (both political and commercial). I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)
the question is the cause of the dumb masses. Personally, I think it's tribal stuff
I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
but we're not working with a blank slate here
Which you want to wipe down to the indifferent/accepting/passive moron level before starting to do anything useful?
avoiding object level is perfectly sensible if you are not an expert.
Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.
Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
Perhaps I focused too much on 'mainstream' when I really meant 'outside view'. Obviously, outside view can take both of these into account to different degrees, but essentially, the point is that I think teaching the person to use outside view is better, and outside view is heavily biased (arguably justifiably so) in favor of the mainstream.
I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)
But that's my point: a lot of different contrarian groups have what the OP calls "a web of lies that sound quite logical and true". Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Instead, I think you need to get them unstuck using outside view, and then you can teach them how to identify truth correctly.
I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
Yes. The masses try to justify their ingroup, they don't try to seek truth.
Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.
The way I see it is this: if I got into a debate with a conspiracy theorist, I'm sure they would have much better object-level arguments than I do; I bet they would be able to consistently win when debating me. The reason for this is that I'm not an expert on their specific conspiracy, while they know every single shred of evidence in favor of their theory. This means that I need to rely on meta-level indicators, like nobody respecting Holocaust deniers, in order to determine the truth of their theories, unless I want to spend huge amounts of time researching them.
Sure, there are cases where I think I can do better than most people (computer science, math, physics, philosophy, gender, generally whatever I decide is interesting and start learning a lot about), and in those cases I'm willing to look at the object level, but otherwise I really don't trust my own ability to figure out the truth -- and I shouldn't, because it's necessary to know a lot of the facts before you can even start formulating sensible ideas on your own.
If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that, I really would start out by teaching them how to deal with stuff when you don't understand it in detail, not how to deal with it when you do.
Well, that makes sense for people who ... Less so for people who have trouble with rationality.
So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi? (What is permitted to Jupiter is not permitted to the ox?)
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
And that's besides the rather obvious power/control issues.
So, is this an elites vs dumb masses framework?
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Once we have the framework, the question is the cause of the dumb masses. Personally, I think it's tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can't really say "the mainstream is silencing my tribe!" without having some important conclusions to draw about your tribe).
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
It's probably not a good approach for young children or similarly open minds, but we're not working with a blank slate here. Also, it's not like the policies I propose are Dark Side Epistemology; avoiding object level is perfectly sensible if you are not an expert.
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
because otherwise they are going to engage in tribal politics.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.
Remember: I'm not against doing stuff in Inside View, but I think it will be hard to 'fix' completely broken belief systems in that context. You're going to have trouble even agreeing on what constitutes a valid argument; having a discussion where people don't just end up more polarized is going to be impossible.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it's very hard to get people to change their minds.
Surely it must convey some, arguably a lot, of meaning.
You want it to, but that doesn't mean it actually happens :-/
Inside view was what got them into this mess to begin with.
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
This seems to be something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
Avoid things that sound cultish.
etc.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
I'm actually saying that everybody, friend or foe, who engages in tribal politics, should be taught to... not engage in tribal politics. And that this should be done before we teach them the most effective arguments, because otherwise they are going to engage in tribal politics.
And why is tribal politics bad? Cuz it doesn't lead to truth/a better world, but instead to constant disagreement.
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Sure. But most of the time, they seem to disagree on facts too.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
I think it will backfire less.
without passing through majoritarian seems like something that could really easily backfire.
That's meaningless hand-waving. Do you have evidence?
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
That's meaningless hand-waving. Do you have evidence?
I don't think it's fair to say that it is meaningless. Surely it must convey some, arguably a lot, of meaning. For example, it includes the advice of making people trust authorities more, and a critique of certain traditional rationalist ideas.
In terms of evidence... well, I don't have scientific evidence, but obviously I have anecdotes and some theory behind my belief. I can write up the anecdotes if you think their details would be relevant, but for now I'll just skip them, since they're just anecdotes.
The theory behind my claim can roughly be summed up in a few sentences:
Inside view was what got them into this mess to begin with.
This seems to be something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
Essentially, don't try to teach people 'hard mode' until they can at least survive 'easy mode'.
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
'Extremely dangerous' could be considered hyperbole; what I meant is that if you push them down into the hole of having ridiculous ideas and knowing everything about biases, you might never be able to get them out again.
I don't think the Sequences are that dangerous, because they spend a lot of time trying to get people to see problems in their own thinking (that's the entire point of the Sequences, isn't it?). The problem is that actually doing that is tricky. Eliezer has had a lot of community input in writing them, so he has an advantage that the OP doesn't have. Also, he didn't just focus on bias, but also on a lot of other (IMO necessary) epistemological stuff. I think they're hard to use for dismissing any opposing argument.
That seems extremely dangerous.
LOL. Word inflation strikes again with the force of a million atomic bombs! X-)
Are you really arguing for keeping ideologically incorrect people barefoot and pregnant, lest they harm themselves with any tools they might acquire?
No, but attempting to go from irrational contrarian to rational contrarian (thinking about arguments, for instance by considering fallacies, is contrarian-ish) without passing through majoritarian seems like something that could really easily backfire.
The one thing I've come up with is to somehow introduce them to classical logical fallacies.
That seems extremely dangerous. Most of the time, this will just make people better at rationalization, and many things that are usually considered fallacies are actually heuristics.
I personally subscribe to the Many Worlds Interpretation of quantum mechanics, so I effectively "believe" in the multiverse. That means it is possible that somewhere in the universal wavefunction, there is an Everett Branch in which magic is real.
Nope. The laws of physics are the same in all branches.
Or at least every time someone chants an incantation, by total coincidence, the desired effect occurs.
Those branches would be extremely rare.
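To put a rough number on "extremely rare": the measure of such a branch shrinks geometrically with every further coincidence. A minimal sketch, where both figures are made-up assumptions for illustration only:

```python
# Toy calculation of branch rarity. Both numbers are assumptions.
p = 1e-6  # assumed Born measure of one "incantation works by coincidence"
N = 50    # assumed number of observed successful incantations
print(f"measure of a branch with {N} coincidences in a row: {p**N:.1e}")
# -> 1.0e-300
```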
Alan Turing pondered a related problem known as the halting problem, which asks whether a general algorithm can distinguish between an algorithm that will finish and one that will run forever.
I don't find it very obvious how this is related.
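For reference, the undecidability of the halting problem comes from a diagonalization argument. A minimal sketch in Python, where `halts` is a hypothetical oracle (exactly the function the argument proves cannot exist):

```python
# Sketch of Turing's diagonalization argument. `halts` is assumed to
# be a perfect decider; `paradox` shows why no such total, correct
# function can exist.

def halts(program, arg):
    """Hypothetical oracle: True iff program(arg) eventually halts."""
    raise NotImplementedError("provably impossible in general")

def paradox(program):
    # Do the opposite of whatever `halts` predicts about running
    # `program` on its own source.
    if halts(program, program):
        while True:  # `halts` said "halts": loop forever instead
            pass
    return           # `halts` said "loops": halt immediately

# Feeding `paradox` to itself makes `halts` wrong either way:
# paradox(paradox) halts if and only if it doesn't.
```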
So how would a person distinguish between pseudo-magic that will inevitably fail, and real magic that is the true laws of physics?
The pseudo-magic will with large probability fail the next time you test it.
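A toy Bayesian version of that point; the prior and likelihoods below are illustrative assumptions, not measured values:

```python
# Toy Bayesian update: how quickly testing separates "real magic"
# from "pseudo-magic that succeeds only by coincidence".

prior_real = 1e-12        # assumed prior that the magic is real
p_success_if_real = 0.99  # real magic almost always works
p_success_if_fake = 1e-6  # bare coincidence almost never does

posterior = prior_real
for trial in range(1, 5):
    # Bayes' rule after observing one more successful incantation
    num = posterior * p_success_if_real
    posterior = num / (num + (1 - posterior) * p_success_if_fake)
    print(f"after {trial} successes: P(real) ~ {posterior:.3g}")
```

A few repeated successes overwhelm even an astronomically small prior; conversely, pseudo-magic is expected to fail the very next well-controlled test.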
And finally, what if our entire understanding of reality, including logic, is mere deception by happenstance, and everything we think we know is false?
Then you would find out very soon, unless you postulate something to keep the system stable.
It's not surprising that one particular parsimony principle can be used to overturn one particular form of theism. After all, most theists disagree with most theisms... and most believers in a Weird Science hypothesis (MUH, Matrix, etc.) don't believe in the others.
The question is: where is the slam dunk against theism -- the one that works against all forms of theism, that works only against theism and not against similar scientific ideas like Matrix Lords, that works against the strongest arguments for theism rather than just biblically literalist creationist Protestant Christianity, and that doesn't rest on cherry-picking particular parsimony principles?
There are multiple principles of parsimony, multiple Occam's razors.
Some focus on ontology, on the multiplication of entities, as in the original razor; others on epistemology, on the multiplication of assumptions. The Kolmogorov complexity measure is more aligned with the latter.
Smaller universes are favoured by the ontological razor, but disfavoured by the epistemological razor, because they are more arbitrary. Maximally large universes can have low epistemic complexity (because you have to add information specifying what has been left out in order to arrive at smaller universes) and low Kolmogorov complexity (because short programs can generate infinite bitstrings, e.g. an expansion of pi).
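To make that parenthetical concrete: the infinite decimal expansion of pi can be emitted by a program of a few lines, so the entire infinite string has Kolmogorov complexity bounded by roughly the program's length. A sketch using Gibbons' unbounded spigot algorithm:

```python
# Short program, infinite output: streams the decimal digits of pi
# forever, illustrating that an infinite, perfectly structured string
# can have very low Kolmogorov complexity.

def pi_digits():
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n  # the next digit of pi is now pinned down
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

gen = pi_digits()
print([next(gen) for _ in range(10)])  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```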
Morality as we know it evolved from physics plus starting conditions. When you say that physics is soluble but morality isn't, I suppose you mean that the starting conditions are absent.
You need to know not just the starting conditions, but also the position where morality evolves. That position can theoretically have huge complexity.