Psychohistorian comments on Harnessing Your Biases - Less Wrong

10 Post author: swestrup 02 July 2009 08:45PM


Comment author: Psychohistorian 03 July 2009 12:05:04AM *  0 points

I'm not sure this really counts as a bias. It seems quite rational, unless you will actually suffer immediate and significant consequences if you are wrong about string theory.

The cost of privately held beliefs (especially about abstract truths) is quite low. If I believe the earth is a flat disk on the back of an infinite stack of tortoises, and if I'm, say, a car mechanic, I will not suffer at all for this belief. Unless, of course, I mention it to my friends, because they will judge me for it; unless, of course, they all believe the same thing, in which case I'll probably proclaim this belief loudly and often and possibly meet up with them to discuss it on an appointed day of the week. I may suffer because I fail at epistemology, but it isn't clear how trusting the wrong source on one marginal occasion will corrupt my epistemology (doubly so if I'm refined enough to have a concept of my own epistemology). Taking epistemology as exogenous, there's really no cost to a marginal false belief (one that does not affect me directly).

Having a false belief about some fact that has no direct bearing on your life is way, way, way cheaper than publicly expressing belief in such a fact and being refuted. There seems to be nothing irrational about investing more energy fact-checking in the latter scenario.

Edit: Two things.

First, the turtles example was poorly chosen, as it blurs the line between belief and epistemology too much. Better examples would include, say, wrongly believing celebrity gossip, or holding incorrect beliefs about impractical science due to a lack of information or a misunderstanding of the alternatives. If the car mechanic believed Newton was totally right (because he hadn't seen evidence to the contrary), this would be a very, very low-cost false belief. Interestingly, "Barack Obama is a Muslim" probably falls under this category, though it blurs the epistemological line a bit more.

Second, it's also quite possible that you care more about misleading others than you do about being wrong. It's easy enough to change your own mind if you see contradictory evidence or reread the article and understand it better. It's rather harder to change the minds of other people who have been convinced, and you'll feel like you've let them down as an authority, since they trusted you.

Comment author: Marcello 03 July 2009 12:56:54AM 4 points

I'm not sure the cost of privately held false beliefs is as low as you think it is. The universe is heavily Causally Entangled. Even if, in your example, the shape of the earth isn't causally entangled with anything our mechanic cares about, that doesn't get you off the hook. A false belief can shoot you in the foot in at least two ways. First, you might explicitly use it to reason about the value of some other variable in your causal graph. Second, your intuition might draw on it as an analogy when you are reasoning about something else.

If our car mechanic thinks his planet is a disc supported atop an infinite pile of turtles, when this is in fact not the case, then isn't he more likely to conclude that other things which he may actually come into more interaction with (such as a complex device embedded inside a car which could be understood by our mechanic, if he took it apart and then took the pieces apart about five times) might also be "turtles all the way down"? If I actually lived on a disc on top of infinitely many turtles, then I would be nowhere near as reluctant to conclude that I had a genuine fractal device on my hands. If I actually lived in a world which was turtles all the way down, I would also be much more disturbed by paradoxes involving backward supertasks.

To sum up: False beliefs don't contaminate your belief pool via the real links in the causal network in reality; they contaminate your belief pool via the associations in your mind.

Comment author: Psychohistorian 03 July 2009 02:16:31AM *  1 point

This is what I meant by epistemology. It's not the bad beliefs causing bad epistemology (with certain exceptions, like some instances of religion, in which people may mess up their epistemology to retain their beliefs), but the bad epistemology causing the beliefs. I picked a bit too extreme an example to illustrate my point, and made note of alternative examples in the original.

If I told the car mechanic, "Actually, the Earth revolves around the Sun, which is one star among billions in one galaxy among billions, and you should believe me because God told me so," and he changes his beliefs accordingly, he's not really any better off than he was. The problem is not his belief, it's his system for validating beliefs.

By contrast, if I actually explained why that statement was true and he said, "Well, duh, of course I was wrong! I really should have looked into that!" then I'd say he never had much of a problem to begin with, other than a lack of curiosity.

Comment author: MineCanary 03 July 2009 03:51:16AM 0 points

I'm not sure what the relationship between metaphors propagating in someone's thinking and the causal entanglement of the universe is.

I'd argue that people profit from having different ways to look at the world--even though the world shares a common structure, that structure isn't always locally noticeable or important, and things can certainly look different at different scales. I'm equally unsure that it matters whether, when you see an object that is fractal at the scales relevant to you, you assume it is truly fractal or merely a repeating pattern across a few scales.

I agree with Psychohistorian that it's more important that the mechanic be willing to abandon his belief with greater knowledge of the physics of the universe. But even then, facility with fractal thinking may still offer benefits.

That is: The associations in your mind are put to a constant test when you encounter the real world. Certainly long-term, serious misconceptions--like seeing God in everything and missing insights into natural truth--can be quite a thing to overcome and can stifle certain important lines of thought. But for any beliefs you get from reading inadequately informed science journalism, the ways of thinking your mind is going to be contaminated with are those already prevalent in our culture, so you probably encounter them anyway. They're also things that seem plausible to you given your background, so again, you probably already think in these terms; or else the interconnectedness with all the other observations of your life is too small to distinguish between the two alternate explanations--the false one you've just read and the real truth, which is still "out there." And if scientific results were really so obvious from what we already know about the universe, research would be a lot less important; rather, it is because scientific findings can offer counter-intuitive results, ways of thinking that we DON'T find useful or essential in everyday life, that we find them so intriguing.