
Your Rationality is My Business

Post author: Eliezer_Yudkowsky 15 April 2007 07:31AM

Some responses to Lotteries: A Waste of Hope chided me for daring to criticize others' decisions; if someone else chooses to buy lottery tickets, who am I to disagree?  This is a special case of a more general question:  What business is it of mine, if someone else chooses to believe what is pleasant rather than what is true?  Can't we each choose for ourselves whether to care about the truth?

An obvious snappy comeback is:  "Why do you care whether I care whether someone else cares about the truth?"  It is somewhat inconsistent for your utility function to contain a negative term for anyone else's utility function having a term for someone else's utility function.  But that is only a snappy comeback, not an answer.

So here then is my answer:  I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future.  One of those interests is the human pursuit of truth, which has strengthened slowly over the generations (for there was not always Science).  I wish to strengthen that pursuit further, in this generation. That is a wish of mine, for the Future.  For we are all of us players upon that vast gameboard, whether we accept the responsibility or not.

And that makes your rationality my business.

Is this a dangerous idea?  Yes, and not just pleasantly edgy "dangerous".  People have been burned to death because some priest decided that they didn't think the way they should.  Deciding to burn people to death because they "don't think properly" - that's a revolting kind of reasoning, isn't it?  You wouldn't want people to think that way, why, it's disgusting. People who think like that, well, we'll have to do something about them...

I agree!  Here's my proposal:  Let's argue against bad ideas but not set their bearers on fire.

The syllogism we desire to avoid runs:  "I think Susie said a bad thing, therefore, Susie should be set on fire."  Some try to avoid the syllogism by labeling it improper to think that Susie said a bad thing.  No one should judge anyone, ever; anyone who judges is committing a terrible sin, and should be publicly pilloried for it.

As for myself, I deny the therefore.  My syllogism runs, "I think Susie said something wrong, therefore, I will argue against what she said, but I will not set her on fire, or try to stop her from talking by violence or regulation..."

We are all of us players upon that vast gameboard; and one of my interests for the Future is to make the game fair.  The counterintuitive idea underlying science is that factual disagreements should be fought out with experiments and mathematics, not violence and edicts.  This incredible notion can be extended beyond science, to a fair fight for the whole Future.  You should have to win by convincing people, and should not be allowed to burn them. This is one of the principles of Rationality, to which I have pledged my allegiance.

People who advocate relativism or selfishness do not appear to me to be truly relativistic or selfish.  If they were really relativistic, they would not judge. If they were really selfish, they would get on with making money instead of arguing passionately with others. Rather, they have chosen the side of Relativism, whose goal upon that vast gameboard is to prevent the players - all the players - from making certain kinds of judgments.  Or they have chosen the side of Selfishness, whose goal is to make all players selfish.  And then they play the game, fairly or unfairly according to their wisdom.

If there are any true Relativists or Selfishes, we do not hear them - they remain silent, non-players.

I cannot help but care how you think, because - as I cannot help but see the universe - each time a human being turns away from the truth, the unfolding story of humankind becomes a little darker.  In many cases, it is a small darkness only.  (Someone doesn't always end up getting hurt.)  Lying to yourself, in the privacy of your own thoughts, does not shadow humanity's history so much as telling public lies or setting people on fire.  Yet there is a part of me which cannot help but mourn.  And so long as I don't try to set you on fire - only argue with your ideas - I believe that it is right and proper for me, as a human, that I care about my fellow humans.  That, also, is a position I defend into the Future.

Comments (26)

Comment author: Richard_Hollerith 15 April 2007 12:04:26PM 2 points

I agree wholeheartedly with this post or blog entry. As one of my favorite authors once said, we are all pawns and players in the Game of Life.

If you (the reader) meet me, I will try to determine whether you are good or evil, that is, whether your expected impact on the future is positive or negative --and if you care for nothing but pleasure, I'm probably going to decide that you are at least a little evil-- though the worst thing I will do to you is ignore you and refuse to cooperate with you. Moreover, I know how complex human morality and rationality are, so I know that my judgement about people can always be in error.

Comment author: pdf23ds 15 April 2007 03:18:22PM 9 points

Kevembuangga,

If you take a Bayesian view of the scientific process as opposed to a Popperian one, then theories are never disproved either, just shown to be very unlikely.

But though science can never prove anything conclusively, it doesn't follow that science is not a pursuit of truth. There are no processes that produce certain truths. The ones that claim to are mainly fundamentalist religions. But something doesn't have to be certain to be a truth, if you're a Bayesian, and not a fundamentalist.
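The point about likelihood rather than proof can be made concrete with a toy calculation (my illustration, not part of the original comment): each confirming experiment pushes a theory's posterior probability toward 1 without ever reaching it.

```python
# Illustrative sketch (assumed numbers, not from the comment): repeated
# Bayesian updating makes a theory "very likely" but never certain.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) via Bayes' theorem."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

posterior = 0.5  # start agnostic about the theory
for _ in range(10):  # ten successful experimental predictions
    # Assume the theory predicts each result with certainty, while a
    # false theory would match it only 20% of the time by chance.
    posterior = bayes_update(posterior, 1.0, 0.2)

print(posterior)        # extremely close to 1...
print(posterior < 1.0)  # ...yet still short of certainty
```

Each update multiplies the odds in the theory's favor by a factor of 5 here, so the posterior climbs steeply but remains strictly below 1 after any finite number of experiments.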

Comment author: potato 22 November 2011 09:24:25AM 0 points

Nitpick: something does need to be certain to be true, but it only needs to have a high probability to be rationally strongly believed.

Comment author: anonymous259 22 November 2011 09:40:02AM 1 point

The person you are replying to is unfortunately no longer with us. :-(

Comment author: potato 22 November 2011 08:07:04PM 0 points

Jeeze, I don't really know what to say.

Comment author: TGGP5 15 April 2007 05:05:39PM 2 points

Some minor quibbles: Would a truly selfish person only be concerned with money? That would indicate that in a society without money, nobody could be selfish. Children behave in a selfish manner without money, so it does not seem plausible to me. A person selfish for attention might very well be prone to arguing with Eliezer.

It is true that absolute relativism cannot pass judgment on others passing judgment. So what reasons might someone have to use relativism to respond to someone passing judgment? It has been brought up on this blog that persistent disagreement is evidence of irrationality, so the relativist might be explaining why they themselves decline to pass judgment in a similar manner on the issue being discussed. They might also personally disapprove of just what the judger is decrying, but also feel compelled to remind everyone that their opinions are just that: opinions.

Comment author: KateGladstone 05 December 2011 03:40:05PM 2 points

Further: A selfish person is (by definition) one who does what s/he believes is in his or her self-interest. One can believe "It's in my self-interest to argue passionately against the irrational" — perhaps for the very reasons that Yudkowsky (correctly) gives. Therefore, it is possible for a selfish person to argue passionately against the irrational — in fact, if the selfish person believes (like Yudkowsky and me) that defeating unreason makes this a better universe for him/herself to live in, then such a selfish person will join the fray and argue passionately against unreason.

Comment author: pdf23ds 15 April 2007 05:15:05PM 3 points

I suppose, to put it more clearly, one would say that truth does actually exist, but is only approximated with increasing degrees of accuracy by science. The output of science might not be truth, but it's the closest approximation anyone could ever have to truth.

Comment author: Bob_Unwin 15 April 2007 08:16:48PM 2 points

Yudkowsky: You seem to be responding to a bias that studying economics tends to inculcate in people. The bias is to assume that if people follow their preferences then their action cannot be dismissed as stupid or as ultimately having negative effects. Philosophers often accuse economists of this same bias. (A while ago, a philosopher on this blog accused Robin Hanson of not considering deontological reasons for implementing certain health policies. Yudkowsky rebuked Hanson for failing to distinguish between actions beneficial relative to the self-interest of individuals and beneficial relative to humanity during the discussion of Kahneman's article about biases towards Hawkishness).

This suggests a possible general discussion: which biases do various academic disciplines promote? Knowing this would be useful for individuals trying to de-bias (e.g. if I'm an economist, then I should read some ethics from analytic philosophy). I don't know many academic disciplines well, but I'll make some uninformed speculations to show the sort of biases I mean:

Economics: assuming people are more rational than they are (though economists are getting better at this), assuming that markets can be efficient in practice.

Physics: assuming that everything that is not math or physics is (a) soft and easy and (b) not hard or precise enough to have serious epistemological value. Taking the philosophical view that there is nothing real apart from what is studied by physics (e.g. ethics must be BS because it doesn't involve physics, sociology can't be real because it can't be reduced to physics).

Analytical philosophy: Assuming that science will barely move forward in the future. For example, various analytical philosophers have said (following Chomsky) that understanding consciousness scientifically may be beyond our cognitive abilities (the same is also said of understanding ethics scientifically). Yet the same could have been said about 'life' (in the sense that animals and bacteria are alive) or about the origin of life and species. These problems seemed so mysterious as to be beyond the capability of science to explain (e.g. how does self-replication avoid infinite regress?).

Comment author: Matthew_Pianalto 15 April 2007 08:38:17PM 4 points

W.K. Clifford gave a short, sweet argument for your position in his "The Ethics of Belief": 1. Our beliefs inform how we act. 2. Our actions affect other people. 3. Therefore, our beliefs affect other people. (And to the extent that how you treat me IS something I have a right to make judgments about, it IS my business what you believe or prefer...)

Comment author: Daniel_Greco 15 April 2007 11:32:28PM 1 point

Bob,

Daniel Dennett, an analytic philosopher, makes a very similar point to one that you do in defending physicalist approaches to the philosophy of mind. He thinks that the idea that there's a special, hard problem associated with explaining consciousness is similar to the pre-20th century idea that there's a special, hard problem with explaining life, and that philosophers who posit irreducible mental substances or properties are no better than vitalists, who believed that appeal to irreducible vital forces was necessary to explain life.

Dennett is far from unusual among analytic philosophers in his physicalism. Some form of physicalism about the mental is almost certainly a plurality position among analytic philosophers, if not a majority one. While I'm sure there are other biases that analytic philosophers suffer from, I think the one you've suggested isn't a plausible candidate for a general problem with the profession.

Comment author: TGGP5 16 April 2007 10:56:54PM 0 points

Bob, how would we know if ethics or sociology were B.S. or not? If physics were B.S. we wouldn't have been able to put men on the moon. The fields disrespected by physicists tend to be less falsifiable.

Comment author: _Gi 17 April 2007 06:00:30PM -2 points

You say you don't want to burn your opponent, but only argue against her view. Argument is a competitive pursuit. Public arguments about important subjects will likely get political. War and violence are but an extension of politics. If you have an intractable argument about a very important topic, you will have to deal with the possibility of violence.

Comment author: Mass_Driver 02 September 2010 05:25:13PM 1 point

My syllogism runs, "I think Susie said something wrong, therefore, I will argue against what she said, but I will not set her on fire, or try to stop her from talking by violence or regulation..."

Exactly how far does your modesty run, Eliezer? Do you disapprove of economic sanctions (e.g. fines, lost business opportunities) for Susie? Do you disapprove of social sanctions (e.g. shunning Susie not because she is useless or dangerous but specifically in order to induce her to change her mind)? Would you avoid such sanctions even if Susie were publicly and successfully tricking large numbers of people into supporting unwise policies based on logically incoherent arguments?

Comment author: ndm25 08 October 2010 05:25:08PM 1 point

While I am, clearly, not Eliezer, I believe that his position as expressed would oppose such sanctions. He seems to want all players of the game to be rational, and the introduction of alternate forms of persuasion (social shunning / economic sanctions) would be an unfair advantage to his side of the argument.

A rationalist shouldn't want to win, they should want to be right. Forms of persuasion outside of pure rational argument contribute only to the former goal, not the latter.

(could be wrong, am new here)

Comment author: Mass_Driver 09 October 2010 05:24:57AM 1 point

Well, you're partially right. Eliezer says all the time that rationalists ought to win, and he even uses the italics. Of course, that's an argument from authority. On my own authority, I also think rationalists should win; it strikes me as irrational to be correct about a belief at the cost of not achieving your preferred state of the world. Surely there are more important things in life than what I personally have going on in my head in terms of abstract propositions.

And, yes, to a certain extent, persuading an annoying maverick to conform by shunning her at parties might interfere with my ability to reach correct beliefs. The question is, though, isn't that a sacrifice worth making? If I am still perfectly willing to debate this person in private and listen to her point of view, how much of her side of the truth am I really losing? If I attempt to restrain my social boycotts to those who appear to me to be clearly (a) factually incorrect and (b) making logically invalid arguments, how often am I likely to register a false positive and deprive the public of my opponent's true and valid points of view?

I'm curious what you think about all this.

By the way, welcome to Less Wrong!

Comment author: ndm25 13 October 2010 05:55:20PM 0 points

Oh, I see what you mean. You're saying that there's not really any disutility created by you shunning them, and there is disutility created by having to talk to them. (I think)

I think that one should avoid penalizing another for their beliefs when other methods of persuasion are available, but I did not take that to the next logical step and ask: "when rational methods (argument / debate / discussion) are not available, should I attempt to convert someone to my point of view anyway?"

I feel this is the question you are asking. If I am wrong, correct me. Anticipating that I am not, I will attempt to answer it thusly: "Yes, if it is truly important enough."

If, for instance, someone believes that the phenomenon of gravity is due to the flying spaghetti monster's invisible appendages holding them down, but is still willing to apply all the experimentally determined equations and does not change their life because of this belief (and especially, does not preach this belief), then the disutility this causes, aggregated over all time and all people, is probably less than the disutility provided by what I will call active coercion (economic sanctions and the like), but probably more than the disutility provided by what I will call passive coercion (avoidance).

If they believe that, say, the Earth is 6,000 years old and floats through space on the back of a turtle, and they preach this in a manner that may convince others to agree, the aggregate disutility is probably greater than the disutility of either active or passive sanctions. (cases will, of course, vary, but I think this is likely to be true)

Anyway, that's how I think about it. I don't know etiquette here very well, but if it's considered rude to raise old threads from the dead, I'd love to continue this by email. My username at case dot edu will reach me.

Comment author: TheOtherDave 23 October 2010 03:04:38PM 4 points

I sympathize with this conclusion. I may even agree with it.

That said, the argument rests heavily on the rhetorical equation of "daring to criticize others' decisions" with strengthening "the human pursuit of Truth," and ultimately seeks to justify the former by invoking the value of the latter. Along the way it tosses in "arguing against bad ideas" and "convincing people" and treats them all as roughly interchangeable.

I'm not sure that's justified. There are differences, and they matter.

To put it bluntly, I suspect you're rationalizing.

Criticizing people -- especially in public -- has a social cost; so does arguing against bad ideas. If the primary payoff is the chance of convincing them (or listeners) and thereby advancing the cause of Truth, it's worth looking for less expensive methods for buying that payoff.

I think most of us would do that more, except that argument in certain contexts is also a very good social capital investment strategy.

There's nothing intrinsically wrong with that, any more than there's anything intrinsically wrong with making money in the stock market. But to claim that one's primary reason for playing the markets is to advance the cause of Capitalism is disingenuous.

There are payoffs here besides Truth, and to pretend otherwise is itself a source of darkness.

Comment author: Tyrrell_McAllister 30 May 2011 02:38:21AM 7 points

The syllogism we desire to avoid runs: "I think Susie said a bad thing, therefore, Susie should be set on fire." [...]

My syllogism runs, "I think Susie said something wrong, therefore, I will argue against what she said, but I will not set her on fire, or try to stop her from talking by violence or regulation..."

Nitpick: These are enthymemes, not syllogisms.

Comment author: MarkusRamikin 28 September 2011 01:59:48PM 1 point

If they were really selfish, they would get on with making money instead of arguing passionately with others.

This I do not understand. People are built of many different modules, and I don't see why someone can't be selfish in their life decisions, and at the same time, like so many humans, be prone to arguing about their views with other humans in spare time.

I'm clearly missing something, because I also can't see what that whole paragraph is doing in this article, what it contributes to the overall point. If anyone can be bothered to clarify, that'd be appreciated.

Comment author: blacktrance 08 January 2014 06:38:22PM 1 point

If they enjoy arguing about their views, doing so can be a selfish decision. It's a mistake to conflate selfishness with wealth maximization.

Comment author: TraderJoe 12 April 2012 12:39:33PM 4 points

No one should judge anyone, ever; anyone who judges is committing a terrible sin, and should be publicly pilloried for it.

You mean, publicly judged for it ;)