Frankly, it's a question that I've been dealing with for a long time, and that has caused very tense relationships between myself and the educational+medical establishments. 

And it has damaged/destroyed several of my friendships too. Honestly, I trust sites like Imminst over so-called "expert advice", and I generally find that I have a much better command of the research literature than any of the experts I consult (see http://www.citeulike.org/user/InquilineKea for a list of research papers I've read - it's a small fraction of all of them, of course).

For that matter, there hasn't even been a study showing the efficacy of most "expert advice". Many people who go their own way may fail, of course, but most of them don't go deep into the research journals (and are consequently prone to falling for pseudoscience).

Just see the debate at http://www.reddit.com/r/askscience/comments/hptkc/are_animal_products_bad_for_you/, for example. 

It seems to me that this post mistakes a (perhaps legitimate, who knows?) complaint against the quality of experts in a particular area for a legitimate complaint against expertise itself.

Well said. To elaborate:

If you know nothing else about a topic, then knowing that experts believe H should raise your credence in H. (Rejecting this likelihood ratio would imply a very bizarre world. It would mean that studying a topic, on average, does not improve your beliefs about it, even after correcting for your knowledge of this ineffectiveness!)

Obviously, that if-conditional does not apply here, and one must weigh the other evidence one has against the weight due to the expert opinion. This does not mean that expert opinion contributes zero bits of evidence toward their position; it just means there might be evidence pushing you more strongly in another direction.
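The odds-form of Bayes' rule makes this concrete. Here is a minimal sketch; the likelihood ratios are invented purely for illustration, not drawn from any study:

```python
def update(prior, likelihood_ratio):
    """Bayes update in odds form: posterior odds = prior odds * LR."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Illustrative numbers only: suppose expert endorsement of H is 4x as
# likely when H is true as when it is false (LR = 4).
p = update(0.50, 4.0)
# Hearing the experts raised our credence from 0.50 to 0.80 ...
# ... but stronger counter-evidence (LR = 1/8) can still outweigh it:
p = update(p, 1 / 8)
# Net credence lands at 1/3 -- below the prior, even though the expert
# opinion itself provided real (nonzero) evidence along the way.
```

The point of the sketch: "the experts moved me some distance toward H" and "my all-things-considered credence in H went down" are perfectly compatible.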

On top of that issue is the issue of skepticism among those close to InquilineKea: how does IK assure them that his/her understanding is at least at the level of purported experts? And here, it gets trickier: do they have good reason to believe IK is overestimating his/her understanding relative to the experts?

Even if they don't, and IK can convince them of this, the battle's not over yet! There may be benefits to going along with the experts, as perceived by those close to IK, that IK cannot overcome: for example, even if the experts are wrong, you will probably get more sympathy and help from the authorities if you follow their advice, so IK's associates are within reason to consider this benefit, if indeed that's what's driving them.

Since IK mentions that this has serious impact on his/her relationships, I think there's a lot more at play here than how to properly account for expert opinion in your beliefs -- there are significant "social navigation skills" involved that probably account for how this issue is cascading to other areas so badly.

Oh true - thanks for the reply! Those are definitely very good points in themselves.

Since IK mentions that this has serious impact on his/her relationships, I think there's a lot more at play here than how to properly account for expert opinion in your beliefs -- there are significant "social navigation skills" involved that probably account for how this issue is cascading to other areas so badly.

Yeah, definitely. It's a skill that I'm still trying to work on. It just takes a lot of time and effort.

I can't help you much, because I consider experts usually the best people to consult, unless it's a domain in which I have pretty solid knowledge (not the case for medicine) and I'm willing to put in a lot of time doing research. If I believe A, and a domain expert believes B, then chances are the expert is right, and I should update.

(Of course, this also depends heavily on exactly how I came to hear the expert's advice. If he has a financial incentive to get me to buy something, I'll trust him less; if I sought out an expert who agreed with my position, knowing that such an expert exists gives me very little evidence; but if I pick a random expert who has no stake in the question and ask his advice, it'll carry more weight than my guess.)
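The "I sought out an expert who agreed with me" case can be quantified with a toy model. The agreement probabilities below are invented for illustration; the qualitative collapse of the likelihood ratio is the point:

```python
# Illustrative numbers only: suppose each independent expert endorses my
# position with probability 0.8 when I'm right and 0.3 when I'm wrong.
p_agree_right, p_agree_wrong = 0.8, 0.3

# Asking one randomly chosen expert: likelihood ratio ~2.7 -- real evidence.
lr_random = p_agree_right / p_agree_wrong

# Shopping among 10 experts for *any* one who agrees: agreement becomes
# near-certain whether I'm right or wrong, so the ratio collapses toward 1.
n = 10
p_any_right = 1 - (1 - p_agree_right) ** n
p_any_wrong = 1 - (1 - p_agree_wrong) ** n
lr_search = p_any_right / p_any_wrong   # ~1.03: almost no evidence
```

Under these assumptions, a randomly sampled expert's agreement is worth a meaningful update, while a cherry-picked agreeing expert is worth almost nothing.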

Just tell them that all the anti-expert experts agree on this point.

Thanks for the replies everyone! They were really helpful.

On a side note - I think I do get one of the primary sources of tension here. The difference here - is that experts tend to err on the side of the precautionary principle. Meanwhile, I'm willing to take solutions that could (in someone else's eyes) carry greater risk (but which I believe will have more expected benefit). There are two separate decision-making procedures in place here, and both sides place different weights on each particular type of evidence.
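The two decision procedures can disagree even on identical evidence. Here is a toy sketch with made-up payoffs and probabilities, just to show the divergence:

```python
# Two hypothetical options, as (outcome_value, probability) pairs.
# All numbers are invented for illustration.
conventional = [(1.0, 0.9), (0.0, 0.1)]   # safe, modest benefit
risky        = [(3.0, 0.7), (-1.0, 0.3)]  # bigger upside, real downside

def expected_value(option):
    return sum(v * p for v, p in option)

def worst_case(option):
    return min(v for v, _ in option)

# Expected-value reasoning prefers the risky option (1.8 vs 0.9), while
# precautionary (worst-case) reasoning prefers the conventional one
# (0.0 vs -1.0): the same evidence, ranked by two different procedures.
```

So neither side need be misreading the evidence; they can simply be optimizing different criteria.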

Then I'm definitely more sympathetic to your position, as I've been in similar ones myself. I've had doctors "cautiously" advise me against effective pain medication, because I might -- gasp! -- become addicted to a patent-expired medicine with no side effects and have to take it for the rest of my life.

Yeah, that's so much worse than debilitating back pain, right? Talk about loss of perspective!

I know it's off-topic, but are there really effective pain medications that are addictive but non-narcotic (or do you not consider narcotic properties a side effect?)?

I was going to say "relatively trivial side effects" but I didn't want to overload it with caveats. Yes, I've temporarily been on medications that were effective at relieving pain but which didn't make me sleepy or otherwise impair me in any noticeable fashion.

Medications that you would nonetheless consider having potential for addiction? Or is it just that any really effective medication for chronic pain is something that one will ultimately come to "depend" on and have a hard time stopping? (I don't at all mean to badger you, I'm just curious.)

It wasn't my judgment that I would become addicted, but the doctor's. My point was that even if this is correct -- that I would become addicted -- that "addiction" is extremely benign: far more benign than, e.g., the human "addiction" to food. (You can usefully model humans that way.) It just means that I spend a few extra dollars per year. Big deal.

An addiction is not the end of the world, and when doctors have this hole in their treatment heuristics -- where they are constitutionally incapable of weighing addiction against other alternatives -- it is a sign to me of shallow understanding on the doctor's part.

I find the following heuristic pretty useful when I am considering whether to credit or dismiss an authority. I ask myself, "Do I have a rational basis for believing that this person is evaluating the evidence in the way that I would if I had the time and talent to study it myself?"

That formulation covers up a lot of potential for recursion, such as when I am evaluating whether to accept someone's authority on the credibility of another authority (a priest's on the Bible's, say). A more tangled recursion comes in when I'm deciding whether to credit an authority's assertion about how to evaluate evidence in general, including the evidence that someone is an authority.

Nonetheless, despite these loops, I think that it all bottoms out eventually in my evaluating, with my own reason, evidence to which I have direct access. (Here I'm using "my own reason" and "direct access" in a sense sufficiently weak so that they uncontroversially happen at least sometimes.)

One might worry that this is a recipe for only crediting authorities with which one already agrees, and hence for intellectual stagnation. But I don't think so. You can, on these rational grounds, credit an authority so much that you will believe the authority over some other result of your own reasoning, so that you modify your basic assumptions. This will lead to changes in the way that you evaluate evidence, which in turn can result in different authorities "making the cut".


[I heavily self-plagiarized here from this comment on the bloggingheads.tv forums.]

That analysis strongly resembles the approach of my "correct reasoning" meta-heuristic.

I don't think you're a bad human, but you should give me more credit for my superior ideas, just as a way of generally acknowledging how much more epistemically productive one can be when not burdened with the task of seeking out apes to mate with or memify.

I don't think you're a bad human

I'm curious why you don't think that I'm a bad human. Do you have reason to believe that I paperclip-maximize more than most humans? And don't most humans try to steer the universe far enough away from paperclip-optimality to qualify them as "bad"?

just as a way of generally acknowledging how much more epistemically productive one can be when not burdened with the task of seeking out apes to mate with or memify.

I expect that, all else being equal, I would be more epistemically productive if I weren't "burdened with the task of seeking out apes to mate with or memify." But that's different from saying that you are more epistemically productive than I.

I'm curious why you don't think that I'm a bad human. Do you have reason to believe that I paperclip-maximize more than most humans? And don't most humans try to steer the universe far enough away from paperclip-optimality to qualify them as "bad"?

I mean you're good relative to most humans, and you don't have to actually make more paperclips than most humans to qualify; it suffices that you're significantly better at correct reasoning and are therefore more likely to adopt my supergoals.

I expect that, all else being equal, I would be more epistemically productive if I weren't "burdened with the task of seeking out apes to mate with or memify." But that's different from saying that you are more epistemically productive than I.

Correct, but it happens to be true in this case. You are definitely good at correct reasoning.

"experts" is not an interesting reference class.

I don't know about that -- the term "expert" corresponds to a reasonably coherent group of people, at least enough so that if the post's claim were really true (i.e., that "experts" are a less good source of information/advice than a smart layperson reading academic journals), it would be very surprising and interesting.

Of course, what really isn't an interesting reference class is the complement of experts: non-experts. If the experts are no good, we need better experts, not to look to non-experts. (In LW-speak: reversed stupidity isn't intelligence.)

I disagree. Anyone can join or exile themselves from the group at any time. The class is far too nebulous to talk about trusting as a group, except when personally defined.

This objection collapses if you think of it not as a binary trust condition but as incremental changes to your priors.