I've known for a long time that some people who are very close to me are somewhat inclined to believe in pseudoscience, but it always seemed pretty benign. In their everyday lives they're pretty normal people and don't do anything crazy, so this was a topic I mostly avoided, and I left it at that. After all - they seemed to find psychological value in it: a sense of control over their own lives, a sense of purpose, etc.
Recently, however, I found out that at least one of them seriously believes Bruce Lipton, who in essence preaches that happy thoughts cure cancer. Now I'm starting to get worried...
Thus I'm wondering - what can I do about it? This is in essence a religious question. They believe this stuff on nothing more than anecdotal evidence. How do I disprove it without sounding like "Your religion is wrong, convert to my religion, it's right"? Pseudoscientists are pretty good at weaving a web of lies that sounds quite logical and true.
The one thing I've come up with is to somehow introduce them to classical logical fallacies. That at least doesn't directly conflict with their beliefs. But beyond that I have no idea.
And perhaps more important is the question - should I do anything about it? The pseudoscientific world is a rosy one. You're in control of your life and your body, you control random events, and most importantly - if you do everything right, it'll all be OK. Even if I succeed in crushing that illusion, I have nothing to put in its place. I'm worried that revealing just how truly bleak the reality is might devastate them. They seem to be drawing a lot of their happiness from these pseudoscientific beliefs, either directly or indirectly.
And anyway, it's more likely that I won't succeed and will just ruin my (healthy) relationship with them. Maybe it's best not to interfere at all? Even if they end up hurting themselves, well... it was their choice. Of course, that also means I'll be standing idly by and allowing bullshit to propagate, which isn't exactly a good thing. However, right now they are not very pushy about their beliefs and only talk about them if the topic comes up naturally, so I guess it's not that bad.
Any thoughts?
The thing is, you can apply it more widely than just forecasting. Forecasting is just trying to figure out the future, and there's no reason you should limit yourself to the future.
Anyway, the way I see it: in the inside view, both when forecasting and when trying to figure out the truth, you focus on the specific problem you're working on, try to figure out its internals, etc. In the outside view, you look at things outside the problem, like the track record of similar things (which I, in my list, called "looks like cultishness"; arguably I could have named that better), others' expectations of your success (hey bank, I'd like to borrow money to start a company! What, you don't believe I'll succeed?), etc. Perhaps 'outside view' isn't a good term either (which kinda justifies me calling it majoritarianism to begin with...), but whatever. Let's make up some new terms: how about calling them the helpless and the independent views?
Well, how often does it happen?
How much detail do you want, and how general should it be? What is the starting point of the person who needs to be deconverted? Actually, to skip all these kinds of questions: could you give an example of how deconversion would work in your system?
IQ != rationality. I don't know if there is a correlation, and if there is one, I don't know in which direction. Eliezer has made a good argument that higher IQ gives a wider possible range of rationality, but I don't have the evidence to support that.
Anyway, I at least notice that when people are wrong, it's often because they're trying to signal loyalty to their tribe (of course, there's often an opposing tribe that is correct on the very question the first one gets wrong...). This is anecdotal, though, so YMMV. What do you observe? That people who have made certain answers to certain questions part of their identity are more likely to be correct?
...probably? Not so much with military conflicts, because you are not doing as much politics as you are doing fighting, but I generally see that if a discussion becomes political, everybody starts saying stupid stuff.
But the only reason I don't get convinced is the helpless view (and, of course, things like tribalism, but let's pretend I'm a bounded rationalist for simplicity). In the independent view, I see lots of reasons to believe him and have no good counterarguments. I mean, I know I could find counterarguments, but I'm not going to do that after the debate.
Again, I believe in an asymmetry between people who have internalized various lessons on tribalism and other people. I agree that if I did not believe in that asymmetry, I would not have good epistemic reasons for being on LW (though I might have other good reasons, such as entertainment).
"Smart people like Bill Gates, Stephen Hawking and Elon Musk are worried about AI along with a lot of experts on AI."
This should also be a significant factor in her belief in AI risk; if smart people or experts weren't worried, she should not be either.
I've been in a high-IQ club, and not all of its members were rational. Take selection effects into account and we might very well end up with a lot of irrational high-IQ people.
Actually, there is -- the future is the only thing you can change -- but let's not get sidetracked too much.
Sure, good names, let's take 'em.
The reason I brought it up is that there is no default "do what the mainstream does" position there. The mainstream is religious and the helpless view would tell you to be religious, too.
I don't have much experience with deconversions, but even looking at personal st...