Less Wrong is a community blog devoted to refining the art of human rationality.

Modularity, signaling, and belief in belief

19 Kaj_Sotala 13 November 2011 11:54AM

This is the fourth part in a mini-sequence presenting material from Robert Kurzban's excellent book Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind.

In the previous post, Strategic ignorance and plausible deniability, we discussed some ways by which people might have modules designed to keep them away from certain kinds of information. These arguments were relatively straightforward.

The next step up is the hypothesis that our "press secretary module" might be designed to contain information that is useful for certain purposes, even if other modules have information that not only conflicts with this information, but is also more likely to be accurate. That is, some modules are designed to acquire systematically biased - i.e. false - information, including information that other modules "know" is wrong.


St. Petersburg Mugging Implies You Have Bounded Utility

11 TimFreeman 07 June 2011 03:06PM

This post describes an infinite gamble that, under some reasonable assumptions, will motivate people who act to maximize an unbounded utility function to send me all their money. In other words, if you understand this post and it doesn't motivate you to send me all your money, then you have a bounded utility function, or perhaps even upon reflection you are not choosing your actions to maximize expected utility, or perhaps you found a flaw in this post.


Belief in Belief vs. Internalization

33 Desrtopa 29 November 2010 03:12AM

Related to: Belief in Belief

Suppose that a neighbor comes to you one day and tells you “There’s a dragon in my garage!” Since all of us have been through this before at some point or another, you may be inclined to save time and ask “Is the dragon by any chance invisible, inaudible, intangible, and does it convert oxygen to carbon dioxide when it breathes?”

The neighbor, however, is a scientific minded fellow and responds “Yes, yes, no, and maybe, I haven’t checked. This is an idea with testable consequences. If I try to touch the dragon it gets out of the way, but it leaves footprints in flour when I sprinkle it on the garage floor, and whenever it gets hungry, it comes out of my garage and eats a nearby animal. It always chooses something weighing over thirty pounds, and you can see the animals get snatched up and mangled to a pulp in its invisible jaws. It’s actually pretty horrible. You may have noticed that there have been fewer dogs around the neighborhood lately.”

This triggers a tremendous number of your skepticism filters, and so the only thing you can think of to say is “I think I’m going to need to see this.”

“Of course,” replies the neighbor, and he sets off across the street, opens the garage door, and is promptly eaten by the invisible dragon.


Epistemic Luck

74 Alicorn 08 February 2010 12:02AM

Who we learn from and with can profoundly influence our beliefs. There's no obvious way to compensate.  Is it time to panic?

During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.

What a peculiar thing for an epistemologist to admit!

Of course, on the one hand, he's almost certainly right.  Schools have their cultures, their traditional views, their favorite literature providers, their set of available teachers.  These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's".  And everybody knows this.  And people still show a distinct trend of agreeing with their teachers' views, even the most controversial - not an unbroken trend, but still an obvious one.  So it's not at all unlikely that, yes, had the professor gone to a different graduate school, he'd believe something else about his subject, and he's not making a mistake in so acknowledging...

But on the other hand... but... but...

But how can he say that, and look so undubiously at the views he picked up this way?  Surely the truth about knowledge and justification isn't correlated with which school you went to - even a little bit!  Surely he knows that!


Sorting Out Sticky Brains

50 Alicorn 18 January 2010 04:18AM

tl;dr: Just because it doesn't seem like we should be able to have beliefs we acknowledge to be irrational, doesn't mean we don't have them.  If this happens to you, here's a tool to help conceptualize and work around that phenomenon.

There's a general feeling that by the time you've acknowledged that some belief you hold is not based on rational evidence, it has already evaporated.  The very act of realizing it's not something you should believe makes it go away.  If that's your experience, I applaud your well-organized mind!  It's serving you well.  This is exactly as it should be.

If only we were all so lucky.

Brains are sticky things.  They will hang onto comfortable beliefs that don't make sense anymore, view the world through familiar filters that should have been discarded long ago, see significances and patterns and illusions even if they're known by the rest of the brain to be irrelevant.  Beliefs should be formed on the basis of sound evidence.  But that's not the only mechanism we have in our skulls to form them.  We're equipped to come by them in other ways, too.  It's been observed [1] that believing contradictions is only bad because it entails believing falsehoods.  If you can't get rid of one belief in a contradiction, and that's the false one, then believing a contradiction is the best you can do, because then at least you have the true belief too.

The mechanism I use to deal with this is to label my beliefs "official" and "unofficial".  My official beliefs have a second-order stamp of approval.  I believe them, and I believe that I should believe them.  Meanwhile, the "unofficial" beliefs are those I can't get rid of, or am not motivated to try really hard to get rid of because they aren't problematic enough to be worth the trouble.  They might or might not outright contradict an official belief, but regardless, I try not to act on them.


You don't need Kant

21 Andrew 01 April 2009 06:09PM

Related to: Comments on Degrees of Radical Honesty, OB: Belief in Belief, Cached Thoughts.

"Nothing worse could happen to these labours than that anyone should make the unexpected discovery that there neither is, nor can be, any a priori knowledge at all.... This would be the same thing as if one sought to prove by reason that there is no reason" (Critique of Practical Reason, Introduction).

You don't need Kant to demonstrate the value of honesty. In fact, summoning his revenant can be a dangerous thing to do. You end up in the somewhat undesirable situation of having almost the right conclusion, but having it for the wrong reasons. Reasons you weren't even aware of, because they were all collapsed into the belief, "I believe in person X".

One of the annoying things about philosophy is that the dead simply don't die. Once a philosopher or philosophical doctrine gains some celebrity in the community, it's very difficult to convince anyone afterward that said philosopher or doctrine was flawed. In other words, the philosophical community tends to have problems with relinquishment. Therefore, there are still many philosophers that spend their careers studying, for example, Plato, apparently not with the intent to determine what parts of what Plato wrote are correct or still applicable, but rather with the intent to defend Plato from criticism. To prove Plato was right.

Since the community doesn't value relinquishment, the cost of writing a flawed criticism is very low. Therefore, journals are glutted with so-called "negative results": "Kant was wrong", "Hegel was wrong", etc. No one seriously believes otherwise, but writing positive philosophical results is hard, and not writing at all isn't a viable career option for a professional philosopher.

To its credit, MBlume refrains from bringing up Kant in his article on radical honesty, where he cites other, more feasible variants of radical honesty. However, in the comments, Kant rears his ugly head.


The Mystery of the Haunted Rationalist

69 Yvain 08 March 2009 08:39PM

Followup to: Simultaneously Right and Wrong

    "The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents."

          - H.P. Lovecraft, The Call of Cthulhu

There is an old yarn about two skeptics who stayed overnight in a supposedly haunted mansion, just to prove they weren't superstitious. At first, they laughed and joked with each other in the well-lit master bedroom. But around eleven, there was a thunderstorm - hardly a rare event in those parts - and all the lights went off. As it got later and later, the skeptics grew more and more nervous, until finally around midnight, the stairs leading up to their room started to creak. The two of them shot out of there and didn't stop running until they were in their car and driving away.

So the skeptics' emotions overwhelmed their rationality. That happens all the time. Is there any reason to think this story proves anything more interesting than that some skeptics are cowards?


Moore's Paradox

47 Eliezer_Yudkowsky 08 March 2009 02:27AM

Followup to: Belief in Self-Deception

Moore's Paradox is the standard term for saying "It's raining outside but I don't believe that it is."  HT to painquale on MetaFilter.

I think I understand Moore's Paradox a bit better now, after reading some of the comments on Less Wrong.  Jimrandomh suggests:

Many people cannot distinguish between levels of indirection. To them, "I believe X" and "X" are the same thing, and therefore, reasons why it is beneficial to believe X are also reasons why X is true.

I don't think this is correct—relatively young children can understand the concept of having a false belief, which requires separate mental buckets for the map and the territory.  But it points in the direction of a similar idea:

Many people may not consciously distinguish between believing something and endorsing it.

After all—"I believe in democracy" means, colloquially, that you endorse the concept of democracy, not that you believe democracy exists.  The word "belief", then, has more than one meaning.  We could be looking at a confused word that causes confused thinking (or maybe it just reflects pre-existing confusion).

So: in the original example, "I believe people are nicer than they are", she came up with some reasons why it would be good to believe people are nice—health benefits and such—and since she now had some warm affect on "believing people are nice", she introspected on this warm affect and concluded, "I believe people are nice".  That is, she mistook the positive affect attached to the quoted belief, as signaling her belief in the proposition.  At the same time, the world itself seemed like people weren't so nice.  So she said, "I believe people are nicer than they are."


Simultaneously Right and Wrong

88 Yvain 07 March 2009 10:55PM

Related to: Belief in Belief, Convenient Overconfidence

     "You've no idea of what a poor opinion I have of myself, and how little I deserve it."

      -- W.S. Gilbert

In 1978, Steven Berglas and Edward Jones performed a study on voluntary use of performance inhibiting drugs. They asked subjects to solve certain problems. The control group received simple problems, the experimental group impossible problems. The researchers then told all subjects they'd solved the problems successfully, leaving the controls confident in their own abilities and the experimental group privately aware they'd just made a very lucky guess.

Then they offered the subjects a choice of two drugs to test. One drug supposedly enhanced performance, the other supposedly handicapped it.

There's a cut here in case you want to predict what happened.


Belief in Self-Deception

51 Eliezer_Yudkowsky 05 March 2009 03:20PM

Continuation of: No, Really, I've Deceived Myself
Followup to: Dark Side Epistemology

I spoke yesterday of my conversation with a nominally Orthodox Jewish woman who vigorously defended the assertion that she believed in God, while seeming not to actually believe in God at all.

While I was questioning her about the benefits that she thought came from believing in God, I introduced the Litany of Tarski—which is actually an infinite family of litanies, a specific example being:

  If the sky is blue
      I desire to believe "the sky is blue"
  If the sky is not blue
      I desire to believe "the sky is not blue".

"This is not my philosophy," she said to me.

"I didn't think it was," I replied to her.  "I'm just asking—assuming that God does not exist, and this is known, then should you still believe in God?"

She hesitated.  She seemed to really be trying to think about it, which surprised me.

"So it's a counterfactual question..." she said slowly.

I thought at the time that she was having difficulty allowing herself to visualize the world where God does not exist, because of her attachment to a God-containing world.

Now, however, I suspect she was having difficulty visualizing a contrast between the way the world would look if God existed or did not exist, because all her thoughts were about her belief in God, but her causal network modelling the world did not contain God as a node.  So she could easily answer "How would the world look different if I didn't believe in God?", but not "How would the world look different if there was no God?"

She didn't answer that question, at the time.  But she did produce a counterexample to the Litany of Tarski:

She said, "I believe that people are nicer than they really are."

