
The Truth and Instrumental Rationality

11 the-citizen 01 November 2014 11:05AM

One of the central focuses of LW is instrumental rationality. It's been suggested, rather famously, that this isn't about having true beliefs, but rather it's about "winning". Systematized winning. True beliefs are often useful to this goal, but an obsession with "truthiness" is seen as counter-productive. The brilliant scientist or philosopher may know the truth, yet be ineffective. This is seen as unacceptable by many who regard instrumental rationality as the critical path to achieving one's goals. Should we all discard our philosophical obsession with the truth and become "winners"?


The River Instrumentus

You are leading a group of five people away from a deadly threat which is slowly advancing behind you. You come to a river. It looks too dangerous to wade through, but through the spray of the water you see a number of stones. They are dotted across the river in a way that might allow you to cross. However, the five people you are helping are extremely nervous and in order to convince them to cross, you will not only have to show them it's possible to cross, you will also need to look calm enough after doing it to convince them that it's safe. All five of them must cross, as they insist on living or dying together.

Just as you are about to step out onto the first stone it splutters and moves in the mist of the spraying water. It looks a little different from the others, now you think about it. After a moment you realise it's actually a person, struggling to keep their head above water. Your best guess is that this person would probably drown if they got stepped on by five more people. You think for a moment, and decide that, being a consequentialist concerned primarily with the preservation of life, it is ultimately better that this person dies so the others waiting to cross might live. After all, what is one life compared with five?

However, given your need for calm and the horror of their imminent death at your hands (or feet), you decide it is better not to think of them as a person, and so you instead imagine them being simply a stone. You know you'll have to be really convincingly calm about this, so you look at the top of the head for a full hour until you utterly convince yourself that the shape you see before you is factually indicative not of a person, but of a stone. In your mind, tops of heads aren't people - now they're stones. This is instrumentally rational - when you weigh things up, the self-deception ultimately increases the number of people who will likely live, and there is no specific harm you can identify as a result.

After you have finished convincing yourself you step out onto the per... stone... and start crossing. However, as you step out onto the subsequent stones, you notice they all shift a little under your feet. You look down and see the stones spluttering and struggling. You think to yourself "lucky those stones are stones and not people, otherwise I'd be really upset". You lead the five very grateful people over the stones and across the river. Twenty dead stones drift silently downstream.

When we weigh situations on pure instrumentality, small self-deception makes sense. The only problem is, in an ambiguous and complex world, self-deceptions have a notorious way of compounding each other, leaving a gaping hole for cognitive bias to work its magic. Many false but deeply-held beliefs throughout human history have been quite justifiable on these grounds. Yet when we forget the value of truth, we can be instrumental, but we are not instrumentally rational. Rationality implies, or ought to imply, a value of the truth.


Winning and survival

In the jungle of our evolutionary childhood, humanity formed groups to survive. In these groups there was a hierarchy of importance, status and power. Predators, starvation, rival groups and disease all took the weak on a regular basis, but the groups afforded a partial protection. However, a violent or unpleasant death still remained a constant threat, particularly to the lowest and weakest members of the group. Sometimes these individuals were weak because they were physically weak. However, over time, groups that allowed and rewarded things other than physical strength became more successful. In these groups, discussion played a much greater role in power and status. The truly strong individuals, the winners in this new arena, were the ones who could direct conversation in their favour - conversations about who would do what, about who got what, and about who would be punished for what. Debates were fought with words, but they could end in death all the same.

In this environment, one's social status was intertwined with one's ability to win. In a debate, it was not so much a matter of what was true, but of what facts and beliefs achieved one's goals. Supporting the factual position that suited one's own goals was most important. Even where the stakes were low or irrelevant, it paid to prevail socially, because one's reputation guided others' limited cognition about who was best to listen to. Winning didn't mean knowing the most; it meant social victory. So when competition bubbled to the surface, it paid to ignore what one's opponent said and instead focus on appearing superior in any way possible. Sure, truth sometimes helped, but for the charismatic it was strictly optional. Politics was born.

Yet as groups got larger, and as technology began to advance for the first time, a new phenomenon appeared. Where a group's power dynamics meant that it systematically held false beliefs, it became more likely to fail. The group that believed fire spirits guided a fire's advancement fared poorly compared with those who checked the wind and planned their means of escape accordingly. The truth finally came into its own. Yet truth, as opposed to simple belief by politics, could not be so easily manipulated for personal gain. The truth had no master. In this way it was both dangerous and liberating. And so, slowly but surely, the capacity for complex truth-pursuit became evolutionarily impressed upon the human blueprint.

However, in evolutionary terms there was little time for this new mental capacity to develop fully. Some people had it more than others. It also required the right circumstances to rise to the forefront of human thought, and other conditions could easily suppress it. For example, should a person's thoughts be primed with an environment of competition, the old ways came bubbling up to the surface: in a highly competitive environment, the mind reverts to its primitive state. Learning and updating of views become increasingly difficult, because to the more primitive aspects of a person's social brain, updating one's views is a social defeat.

When we focus an organisation's culture on winning, there can be many benefits. It can create an air of achievement, to a degree. Hard work and the challenging of norms can be increased. However, we also prime the brain for social conflict. We create an environment where complexity and subtlety in conversation, and consequently in thought, is greatly reduced. In organisations where the goals and means are largely intellectual, a competitive environment creates useless conversations, meaningless debates, pointless tribalism, and little meaningful learning. There are many great examples, but I think you'd be best served watching our elected representatives at work to gain a real insight.


Rationality and truth

Rationality ought to contain an implication of truthfulness. Without it, our little self-deceptions start to gather and compound one another. Slowly but surely, they start to reinforce, join, and form an unbreakable, unchallengeable yet utterly false belief system. I need not point out the more obvious examples, for in human society there are many. To avoid this on LW and elsewhere, truthfulness of belief ought to inform all our rational decisions, methods and goals. Of course, true beliefs do not guarantee influence or power or achievement, or anything really. In a world of half-evolved truth-seeking equipment, why would we expect that? What we can expect is that, if our goals have anything to do with the modern world in all its complexity, the truth isn't sufficient, but it is necessary.

Instrumental rationality is about achieving one's goals, but in our complex world goals manifest in many ways - and we can never really predict how a false belief will distort our actions to utterly destroy our actual achievements. In the end, without truth, we never really see the stones floating down the river for what they are.

The Useful Idea of Truth

77 Eliezer_Yudkowsky 02 October 2012 06:16PM

(This is the first post of a new Sequence, Highly Advanced Epistemology 101 for Beginners, setting up the Sequence Open Problems in Friendly AI.  For experienced readers, this first post may seem somewhat elementary; but it serves as a basis for what follows.  And though it may be conventional in standard philosophy, the world at large does not know it, and it is useful to know a compact explanation.  Kudos to Alex Altair for helping in the production and editing of this post and Sequence!)


I remember this paper I wrote on existentialism. My teacher gave it back with an F. She’d underlined true and truth wherever it appeared in the essay, probably about twenty times, with a question mark beside each. She wanted to know what I meant by truth.
-- Danielle Egan

I understand what it means for a hypothesis to be elegant, or falsifiable, or compatible with the evidence. It sounds to me like calling a belief ‘true’ or ‘real’ or ‘actual’ is merely the difference between saying you believe something, and saying you really really believe something.
-- Dale Carrico

What then is truth? A movable host of metaphors, metonymies, and anthropomorphisms: in short, a sum of human relations which have been poetically and rhetorically intensified, transferred, and embellished, and which, after long usage, seem to a people to be fixed, canonical, and binding.
-- Friedrich Nietzsche


The Sally-Anne False-Belief task is an experiment used to tell whether a child understands the difference between belief and reality. It goes as follows:

  1. The child sees Sally hide a marble inside a covered basket, as Anne looks on.

  2. Sally leaves the room, and Anne takes the marble out of the basket and hides it inside a lidded box.

  3. Anne leaves the room, and Sally returns.

  4. The experimenter asks the child where Sally will look for her marble.

Children under the age of four say that Sally will look for her marble inside the box. Children over the age of four say that Sally will look for her marble inside the basket.


The two meanings of mathematical terms

-2 JamesCole 15 June 2009 02:30PM


In the latest Rationality Quotes thread, CronoDAS quoted Paul Graham:

It would not be a bad definition of math to call it the study of terms that have precise meanings.

Sort of. I started writing this as a reply to that comment, but it grew into a post.
We've all heard the story of epicycles and how, before Copernicus came along, the movement of the stars and planets was explained by the idea of them being attached to rotating epicycles, some of which were embedded within other larger, rotating epicycles (I'm simplifying the terminology a little here).
As we now know, the Epicycles theory was completely wrong.  The stars and planets were not at the distances from earth posited by the theory, or of the size presumed by it, nor were they moving about on some giant clockwork structure of rings.  
In the theory of epicycles the terms had precise mathematical meanings. The problem was that what the terms were meant to represent in reality was wrong. The theory involved applied mathematical statements, and in any such statements the terms don't just have their mathematical meaning -- what the equations say about them -- they also have an 'external' meaning concerning what they're supposed to represent in or about reality.
Let's consider these two types of meaning. The mathematical, or 'internal', meaning of a statement like '1 + 1 = 2' is very precise. '1 + 1' is defined as '2', so '1 + 1 = 2' is pretty much the pre-eminent fact or truth. This is why mathematical truth is usually given such an exalted place. So far so good with saying that mathematics is the study of terms with precise meanings.
But what if '1 + 1 = 2' happens to be used to describe something in reality? Each of the terms will then take on a second meaning, concerning what it is meant to represent in reality. This meaning lies outside the mathematical theory, and there is no guarantee that it is accurate.
The problem with saying that mathematics is the study of terms with precise meanings is that it's all too easy to take this as trivially true, because the terms obviously have a precise mathematical sense. It's easy to overlook the other type of meaning, to think there is just the meaning of the term, and just the question of the precision of that meaning. This is why we get people saying "numbers don't lie".
'Precise' is a synonym for "accurate" and "exact", characterized by "perfect conformity to fact or truth" (according to WordNet). So when someone says that mathematics is the study of terms with precise meanings, we have a tendency to take it as meaning it's the study of things that are accurate and true. The problem with that is that mathematical precision clearly does not guarantee the precision -- the accuracy or truth -- of applied mathematical statements, which need to conform with reality.
There are quite subtle ways of falling into this trap of confusing the two meanings. A believer in epicycles would likely have thought the theory must be correct because it gave mathematically correct answers. And it actually did. Epicycles did precisely calculate the positions of the stars and planets (not absolutely perfectly, but in principle the theory could have been adjusted to give perfectly precise results). If the mathematics was right, how could it be wrong?
But what the theory was actually calculating was not the movement of galactic clockwork machinery with stars and planets embedded within it, but the movement of points of light (corresponding to the real stars and planets) as those points of light moved across the sky. The positions were right, but the conceptualisation was all wrong.
This raises the question: does it really matter if the conceptualisation is wrong, as long as the numbers are right? Isn't instrumental correctness all that really matters? We might think so, but this is not true. How would Pluto's existence have been predicted under an epicycles conceptualisation? How would we have thought about space travel under such a conceptualisation?
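The internal/external split above is easy to exhibit in a few lines. A deferent-and-epicycle model is just trigonometry: the planet rides a small circle whose centre rides a larger circle around the Earth. The arithmetic below is internally precise and computes exactly what it is defined to compute, even though its external interpretation (celestial clockwork) is wrong. This is only a minimal sketch; the radii and angular speeds are made-up illustrative parameters, not historical values.

```python
import math

def epicycle_position(t, R=10.0, r=2.0, w_def=1.0, w_epi=5.0):
    """Sky position at time t under a toy deferent-and-epicycle model.

    The epicycle's centre moves on the deferent (radius R, angular
    speed w_def) centred on the Earth; the planet moves on the
    epicycle (radius r, angular speed w_epi) around that centre.
    All parameters are illustrative, not historical.
    """
    cx = R * math.cos(w_def * t)  # centre of the epicycle on the deferent
    cy = R * math.sin(w_def * t)
    x = cx + r * math.cos(w_epi * t)  # planet's offset along the epicycle
    y = cy + r * math.sin(w_epi * t)
    return x, y
```

With the epicycle turning faster than the deferent, the traced path loops back on itself, reproducing apparent retrograde motion: the numbers come out right, which is exactly why the wrong conceptualisation survived so long.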
The moral is, when we're looking at mathematical statements, numbers are representations, and representations can lie.



If you're interested in knowing more about epicycles and how that theory was overthrown by the Copernican one, Thomas Kuhn's quite readable  The Copernican Revolution  is an excellent resource.  

 

Fake Norms, or "Truth" vs. Truth

16 Eliezer_Yudkowsky 22 July 2008 10:23AM

Followup to: Applause Lights

When you say the word "truth", people know that "truth" is a good thing, and that they're supposed to applaud.  So it might seem like there is a social norm in favor of "truth".  But when it comes to some particular truth, like whether God exists, or how likely their startup is to thrive, people will say:  "I just want to believe" or "you've got to be optimistic to succeed".

So Robin and I were talking about this, and Robin asked me how it is that people prevent themselves from noticing the conflict.

I replied that I don't think active prevention is required.  First, as I quoted Michael Vassar:

"It seems to me that much of the frustration in my life prior to a few years ago has been due to thinking that all other human minds necessarily and consistently implement modus ponens."

But more importantly, I don't think there does exist any social norm in favor of truth.  There's a social norm in favor of "truth".  There's a difference.


Guardians of the Truth

31 Eliezer_Yudkowsky 15 December 2007 06:44PM

Followup to: Tsuyoku Naritai, Reversed Stupidity is not Intelligence

The criticism is sometimes leveled against rationalists:  "The Inquisition thought they had the truth!  Clearly this 'truth' business is dangerous."

There are many obvious responses, such as "If you think that possessing the truth would license you to torture and kill, you're making a mistake that has nothing to do with epistemology."  Or, "So that historical statement you just made about the Inquisition—is it true?"

Reversed stupidity is not intelligence:  "If your current computer stops working, you can't conclude that everything about the current system is wrong and that you need a new system without an AMD processor, an ATI video card... even though your current system has all these things and it doesn't work.  Maybe you just need a new power cord."  To arrive at a poor conclusion requires only one wrong step, not every step wrong.  The Inquisitors believed that 2 + 2 = 4, but that wasn't the source of their madness.  Maybe epistemological realism wasn't the problem either?

It does seem plausible that if the Inquisition had been made up of relativists, professing that nothing was true and nothing mattered, they would have mustered less enthusiasm for their torture. They would also have been less enthusiastic if lobotomized. I think that's a fair analogy.

And yet... I think the Inquisition's attitude toward truth played a role.  The Inquisition believed that there was such a thing as truth, and that it was important; well, likewise Richard Feynman.  But the Inquisitors were not Truth-Seekers.  They were Truth-Guardians.


The Meditation on Curiosity

36 Eliezer_Yudkowsky 06 October 2007 12:26AM

"The first virtue is curiosity."
        —The Twelve Virtues of Rationality

As rationalists, we are obligated to criticize ourselves and question our beliefs... are we not?

Consider what happens to you, on a psychological level, if you begin by saying:  "It is my duty to criticize my own beliefs."  Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write".  Mark Twain said:  "A classic is something that everyone wants to have read and no one wants to read."  Criticizing yourself from a sense of duty leaves you wanting to have investigated, so that you'll be able to say afterward that your faith is not blind.  This is not the same as wanting to investigate.

This can lead to motivated stopping of your investigation.  You consider an objection, then a counterargument to that objection, then you stop there.  You repeat this with several objections, until you feel that you have done your duty to investigate, and then you stop there. You have achieved your underlying psychological objective: to get rid of the cognitive dissonance that would result from thinking of yourself as a rationalist, and yet knowing that you had not tried to criticize your belief.  You might call it purchase of rationalist satisfaction—trying to create a "warm glow" of discharged duty.


Feeling Rational

76 Eliezer_Yudkowsky 26 April 2007 04:48AM

A popular belief about "rationality" is that rationality opposes all emotion—that all our sadness and all our joy are automatically anti-logical by virtue of being feelings.  Yet strangely enough, I can't find any theorem of probability theory which proves that I should appear ice-cold and expressionless.

So is rationality orthogonal to feeling?  No; our emotions arise from our models of reality.  If I believe that my dead brother has been discovered alive, I will be happy; if I wake up and realize it was a dream, I will be sad.  P. C. Hodgell said:  "That which can be destroyed by the truth should be."  My dreaming self's happiness was opposed by truth.  My sadness on waking is rational; there is no truth which destroys it.


Why truth? And...

47 Eliezer_Yudkowsky 27 November 2006 01:49AM

Some of the comments in this blog have touched on the question of why we ought to seek truth.  (Thankfully not many have questioned what truth is.)  Our shaping motivation for configuring our thoughts to rationality, which determines whether a given configuration is "good" or "bad", comes from whyever we wanted to find truth in the first place.

It is written:  "The first virtue is curiosity."  Curiosity is one reason to seek truth, and it may not be the only one, but it has a special and admirable purity.  If your motive is curiosity, you will assign priority to questions according to how the questions, themselves, tickle your personal aesthetic sense.  A trickier challenge, with a greater probability of failure, may be worth more effort than a simpler one, just because it is more fun.
