The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources. 

Rationality certainly can have instrumental advantages.  There are plenty of situations where being more rational helps one achieve a wide range of goals.  In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational.  In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., after explaining that success via other observable factors.

But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us.  So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.

Yes, rationality might help you win that game or argument, get promoted, or win her heart.  Or more rationality for you might hinder those outcomes.  If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends.  In fact we often know it is not.

The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful.  These are actually pretty likely outcomes in many identifiable situations.  You may think you want to know the truth no matter what, but how sure can you really be of that?  Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for.

Be warned; the truth just is what it is.  If just knowing the truth is not reward enough, perhaps you'd be better off not knowing.  Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics?  Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?

81 comments

This parallels a discussion I've had numerous times in the field of computer games. I've had any number of artists / scripters / managers say that what a computer game needs is not a realistic physics engine, but a cinematic physics engine. They don't want it to be right, they want it to be pretty.

But, you'll find that "cinematic style" isn't consistent, and if you start from that basis, you won't be able to make boring, everyday events look realistic; you'll have to add special-case patch-upon-patch, and you'll never get it right in the end. The cinematic stuff will look right, but nothing else will.

If you start with a rigidly-correct physics engine (or at least, as correct as the current state of the art allows) you'll find it MUCH easier to layer cinematic effects on top when asked for. It's usually far simpler than the other way around.
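To make the layering idea concrete, here is a minimal sketch (my own illustration in Python; the constants and the squash-and-stretch rule are assumptions, not taken from any actual engine) of a physically-correct simulation step with a purely cosmetic "cinematic" layer applied only at render time:

    GRAVITY = -9.81  # m/s^2

    def physics_step(pos, vel, dt):
        """Semi-implicit Euler step for a bouncing point mass; ground at y = 0."""
        vel += GRAVITY * dt
        pos += vel * dt
        if pos < 0.0:                      # damped bounce off the ground
            pos, vel = 0.0, -0.8 * vel
        return pos, vel

    def cinematic_layer(pos, vel):
        """Rendering-only squash-and-stretch; never feeds back into the simulation."""
        stretch = 1.0 + min(abs(vel) * 0.02, 0.5)
        return {"render_y": pos, "scale_y": stretch, "scale_x": 1.0 / stretch}

    pos, vel = 5.0, 0.0
    for _ in range(200):
        pos, vel = physics_step(pos, vel, dt=1 / 60)
        frame = cinematic_layer(pos, vel)  # exaggeration sits on top of correct state

Going the other way - making the "cinematic" rule the source of truth - would force every ordinary interaction through the special-case layer, which is the patch-upon-patch problem described above.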

In an analogous way, I find that rationality makes it far easier for one to achieve one's goals, EVEN WHEN SAID GOALS ARE NON-RATIONAL. Now, that may mean that the rational thing to do in some cases is to lie to people about your beliefs, or to present yourself in an unnatural way. If you end up being uncomfortable with that, then you need to reassess what, exactly, your goals are, and what you are willing to do to achieve them. This may not be easy, but it's far simpler than going the route of ignorance and emotionally-driven actions and then trying to put your life back together when you don't end up where you thought you would.

7Vladimir_Nesov
You'll need to clarify what you mean by "non-rational goals".
2swestrup
Yes, I suppose I should. By a non-rational goal I meant a goal that was not necessarily to my benefit, or the benefit of the world, a goal with a negative net sum worth. Things like poisoning a reservoir or marrying someone who will make your life miserable.
2Vladimir_Nesov
You decided to try achieving that "non-rational" goal, so it must be to your benefit (at least, you must believe so). An example that I usually give at this point is as follows. Is it physically possible that in the next 30 seconds I'll open the window and jump out? Can I do it? Since I don't want to do it, I won't do it, and therefore it can not happen in reality. The concept of trying to do something you'll never want to do is not in reality either.
1swestrup
Yes, exactly. The fact that you think it's to your benefit, but it isn't, is the very essence of what I mean by a non-rational goal.
3Yosarian2
That might actually be the main cost of rationality. You may have goals that will hurt you if you actually achieve them, and by not being rational, you manage to not achieve those goals, making your life better. Perhaps, in fact, people avoid rationality because they don't really want to achieve those goals, they just think they want to. There's an Amanda Palmer song where the last line is "I don't want to be the person that I want to be." Of course, if you become rational enough, you may be able to untangle those confused goals and conflicting desires. There's a dangerous middle ground, though, where you may get just better at hurting yourself.
1Nick_Tarleton
"Not to my benefit" is ambiguous; I assume you mean working against other goals, like happiness or other people not dying. But since optimizing for one thing means not optimizing for others, every goal has this property relative to every other (for an ideal agent). Still, the concept seems very useful; any thoughts on how to formalize it?
1swestrup
I don't really have any ideas other than the "negative net sum" worth I mentioned above, but then that just begs the question of what metric one is using to measure worth.
4AnnaSalamon
This is a plausible claim, but do you have concrete details, proposed mechanisms, or examples from your own or others lives to back it up? "I find that rationality makes it far easier" is a promising-sounding claim, and it'd be nice to know the causes of your belief.
1swestrup
Hmm. This is a simple question that seems difficult to articulate an answer to. I think the heart of my argument is that it is very difficult to achieve any goal without planning, and planning (to be effective) relies upon a true and consistent set of beliefs and logical inferences from them. This is pretty much the definition of rationality. Now, it's not the case that the opposite is random activity which one hopes will bring about the correct outcome. To be driven by emotions, seat-of-the-pants decisions and gut instincts is to allow an evolutionarily-derived decision-making process to run your life. It's not a completely faulty process, but it did not evolve for the kinds of situations modern people find themselves in, so, in practice, it's not hard to do better by applying rational principles.
2[anonymous]
As I understand, computer animation (as in Pixar) has built-in capabilities for the physically impossible. For example, there's no constraint in the software that solid bodies have to have constant volume -- when Ratatouille bounces around, he's changing volume all the time for extra expressiveness and dramatic effect. In that way, "cinematic" reality is simpler than realistic reality -- though of course it takes more artistry on the part of the animator to make it look good.
2wedrifid
That isn't technically impossible. ;)
1Alicorn
Ratatouille is not a character, it's a food. The rat's name is Remy.
0[anonymous]
Dang, forgot that.
pwno 90

I always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal be rational or irrational). Now, one who acts rationally, given their information set, will choose the best plan of action toward achieving their goal. Part of being rational is knowing which goals will maximize one's utility function.

My definition of truth-seeking is basically Robin's definition of "rational." I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?

Well, sure. Repeating other posts - but one of the most common examples is when an agent's beliefs are displayed to other agents. Imagine that all your associates think that there is a Christian god. This group includes all your prospective friends and mates. Do you tell them you are an agnostic/atheist - and that their views are not supported by the evidence? No, of course not! However, you had better not lie to them either - since most humans lie so poorly. The best thing to do is probably to believe their nonsense yourself.

8Scott Alexander
Tim, that's an excellent argument for why rationality isn't always the winning strategy in real life. People have been saying this sort of thing all week, but it was your "most humans lie so poorly" comment that really made it click for me, especially in the context of evolutionary psychology. I'd really like to hear one of the "rationalists should always win" people address this objection.

We're talking about at least two different notions of the word "rational":

  1. Robin Hanson used the definition at the top of this post, regarding believing the truth. There are social/evolutionary costs to that, partly because humans lie poorly.

  2. The causal decision theorists' definition that Eliezer Yudkowsky was annoyed by. CDT defines rationality to be a specific method of deciding what action to take, even though this leads to two-boxing (and losing) Newcomb's problem. Yudkowsky's objection, summarized by the slogan "Rationalists should WIN," was NOT a definition. It is a quality of his informal concept of rationality which the CDT definition failed to capture.

The claim "rationalists should always win" comes from taking Yudkowsky's slogan as a definition of rationality. If that is the definition that you are using, then the claim is tautological.

Please note that I don't endorse this misreading of Yudkowsky's post, I'm just trying to answer your question.

9Scott Alexander
Thanks, John. As you say, defining rationality as winning and then saying rationalists always win is a tautology. But aside from your two definitions, there's a third definition: the common definition of rationality as basing decisions on evidence, Bayes, and logic. So as I see it, supporters of "rationalists always win" need to do one of the following:

  1. Show that the winning definition is the same as the Bayes/logic/evidence definition. Tim's counterexample of the religious believer who's a poor liar makes me doubt this is possible.

  2. Stop using "rationality" to refer to things like the Twelve Virtues and Bayesian techniques, since these virtues and techniques sometimes lose and are therefore not always rational.

  3. Abandon "rationalists always win" in favor of Robin's "rationalists always seek the truth". I think that definition is sufficient to demonstrate that a rationalist should one-box on Newcomb's problem anyway. After all, if it's true that one-boxing is the better result, a seeker of truth should realize that and decide to one-box.
5Kenny
There are no supporters of "rationalists always win" – the slogan is "rationalists should win". Long-term / on-average, it's rational to expect a high correlation between rationality and success.

[1] – I'd bet that the rationalist strategy fares well against other heuristics; let's devise a good test. There may always be an effective upper bound to the returns to increasing rationality in any community, but reality is dangerous – I'd expect rationalists to fare better.

[2] – Winning or losing one 'round' isn't sufficient grounds to declare a strategy, or particular decisions, as being non-rational. Buying lottery tickets isn't rational just because some people win. And sometimes, winning isn't possible.

[3] – I like "rationalists always seek the truth" but would add "... but they don't seek all truths."
5steven0461
You realize, of course, that under this policy everyone stays Christian forever.
2timtyler
Indeed - religion is persistent. Of course in the real world you would find that isolated communities would arise, where "belief mutations" could arise without them being severely punished by the crowd.
-2[anonymous]
Interesting, if rationality corresponds to winning, and Christianity is persistent, then we should give up on trying to eliminate Christianity. Not merely because it is a waste of resources, but also because their belief in God is not directly tied to winning and losing. Some beliefs lead to winning (philanthropy, community) and some beliefs lead to losing (insert any one of many here). We should focus energies on discouraging the losing beliefs with whatever means at our disposal, including humoring their belief in God in specific arguments. (For example, we could try and convince a bible literalist that God would forgive them for believing evolution because he deliberately gave us convincing evidence of it.) -- learning as I go, I just learned such arguments are called "Pragmatism".
-2[anonymous]
I will likely delete this post now that it has been down-voted. I wrote it as a natural response to the information I read and am not attached to it. Before deleting, I'm curious if I can solicit feedback from the person who down-voted me. Because the post was boring?
7mark_spottswood
Pwno said: "I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?" The classic example would invoke the placebo effect. Believing that medical care is likely to be successful can actually make it more successful; believing that it is likely to fail might vitiate the placebo effect. So, if you are taking a treatment with the goal of getting better, and that treatment is not very good (but it is the best available option), then it is better from a rationalist goal-seeking perspective to have an incorrectly high assessment of the treatment's possibility of success. This generalizes more broadly to other areas of life where confidence is key. When dating, or going to a job interview, confidence can sometimes make the difference between success and failure. So it can pay, in such scenarios, to be wrong (so long as you are wrong in the right way). It turns out that we are, in fact, generally optimized to make precisely this mistake. Far more people think they are above average in most domains than hold the opposite view. Likewise, people regularly place a high degree of trust in treatments with a very low probability of success, and we have many social mechanisms that try and encourage such behavior. It might be "irrational" under your usage to try and help these people form more accurate beliefs.
3TobyBartels
I like to distinguish information-theoretic rationality from decision-theoretic rationality. (But these are rather long terms.) Often on this blog it's unclear which is meant (although you and Robin did make it clear).
1thomblake
The relevant articles: "What do we mean by rationality?" and the wiki entry.
0TobyBartels
Yeah, I'd just been reading those, but they don't fix the terminology either.
0Pavitra
Perhaps you could call them "truth" and "winning" respectively.

Willful stupidity is often easier and more profitable in the short run, but you just might be picking up pennies in front of a steamroller.

I think it's best to go out of your way to believe the truth, even though you won't always succeed. I'm very suspicious when I'm tempted to do otherwise; it's usually for unwise or unhealthy reasons. There are exceptions, but they're much rarer than we'd like to think.

1igoresque
I am interested to know what kind of temptations, reasons and exceptions you have in mind.

Learning many true facts that are not Fun and are morally irrelevant (e.g. learning as many digits of pi as you can by spending your whole life on the activity), because this way you can avoid thinking about facts that are much less certain, shouldn't be considered rational. Rationality intrinsically needs to serve a purpose, the necessity for this is implicit even in apparently goal-neutral definitions like the one Robin gave in the post.

Another problem, of course, is that you don't know the cost of irrationality if you are irrational.

1timtyler
I don't see how "seeking truth" is "goal-neutral". It is a goal much like any other. The main thing I feel the urge to say about "seeking truth" is that it usually isn't nature's goal. Nature normally cares about other things a lot more than the truth.
-1Kenny
If nature can be said to have goals, it has "seeking truth" among them insofar as anything, including ourselves, does.
1timtyler
Perhaps I was too brief. Organisms are goal-oriented - or at least they look as though they are. Teleonomy, rather than teleology, technically, of course. Organisms act as though their primary goal is to have grandchildren. Seeking the truth is a proximate goal - and not an especially high-priority one. Prioritising seeking the truth more highly than having babies would be a bizarre and unnatural thing for any living organism to do. I have no idea why anyone would advocate it - except, perhaps, as part of some truth-worshiping religion.

Are commitment mechanisms rational?

A malicious genius is considering whether to dose the dashing protagonist with a toxin. The toxin is known to be invariably fatal unless counteracted, and the malicious genius has the only antidote. The antagonist knows that the protagonist will face a choice: Either open a specific locked box containing, among other things, the antidote - surviving, but furthering the antagonist's wicked plan, or refuse to open the box, dying, and foiling the plan.

We analyze this as an extensive form game: The antagonist has a choice to ... (read more)
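A minimal backward-induction sketch of this game (my own illustration in Python; the payoff numbers are assumptions, chosen to match the preference orderings suggested later in the thread - hero: no poison > open box > die; villain: box opened > no poison > hero dies):

    # (hero_payoff, villain_payoff) for each terminal outcome of the toxin/box game
    OUTCOMES = {
        "no_poison":       (2, 1),
        "poison_open_box": (1, 2),
        "poison_refuse":   (0, 0),
    }

    def hero_response():
        """Without a commitment, the dosed hero picks whatever is best for him then."""
        return max(["poison_open_box", "poison_refuse"], key=lambda o: OUTCOMES[o][0])

    def villain_move(predicted_hero):
        """The villain doses the hero only if the predicted response makes dosing pay."""
        dosed = predicted_hero()
        return dosed if OUTCOMES[dosed][1] > OUTCOMES["no_poison"][1] else "no_poison"

    print(villain_move(hero_response))            # poison_open_box: the plan succeeds
    print(villain_move(lambda: "poison_refuse"))  # no_poison: a credible commitment deters

On these (assumed) payoffs, the commitment is valuable precisely because it changes the villain's prediction, not the hero's situation after the fact - which is the tension the replies below pick at.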

9Vladimir_Nesov
A decision theory that doesn't need to go through the motions of making a commitment outside the cognitive algorithm is superior. Act as if you have made a commitment in all the situations where you benefit from having made the commitment. Actually make commitment only if it's necessary to signal the resulting decision. (Off-point:) The protagonist may well be rational about sacrificing his life, if he cares about stopping the antagonist's plan more.
1Benya
I believe I agree with the intuition. Does it say anything about a problem like the above, though? Does the villain decide not to poison the hero, because the hero would not open the box even if the villain decided to poison the hero? Or does the hero decide to open the box, because the villain would poison the hero even if the hero decided not to open the box? Is there a symmetry-breaker here? -- Do we get a mixed strategy à la the Nash equilibrium for Rock-Paper-Scissors, where each player makes each choice with 50% probability? (I'm assuming we're assuming the preference orderings are: The hero prefers no poison to opening the box to dying; the villain prefers the box opened to no poison to the hero dying [because the latter would be a waste of perfectly good poison].)
6Benya
I'm not sure why I'm getting downmodded into oblivion here. I'll go out on a limb and assume that I was being incomprehensible, even though I'll be digging myself in deeper if that wasn't the reason... In classical game theory (subgame-perfect equilibrium), if you eat my chocolate, it is not rational for me to tweak your nose in retaliation at cost to myself. But if I can first commit myself to tweaking your nose if you eat my chocolate, it is no longer rational for you to eat it. But, if you can even earlier commit to definitely eating my chocolate even if I commit to then tweaking your nose, it is (still in classical game theory) no longer rational for me to commit to tweaking your nose! The early committer gets the good stuff. Eliezer's arguments have convinced me that a better decision theory would work like Vladimir says, acting as if you had made a commitment in all situations where you would like to make a commitment. But as far as I can see, both the nose-tweaker and the chocolate-eater can do that -- speaking in intuitive human terms, it comes down to who is more stubborn. So what does happen? Is there a symmetry breaker? Can it happen that you commit to eating my chocolate, I commit to tweaking your nose, and we end up in the worst possible world for both of us? (Well, I'm pretty confident that that's not what Eliezer's theory (not shown) would do.) Borrowing from classical game theory, perhaps we say that one of the two commitment scenarios happens, but we can't say which (1. you eat my chocolate and I don't tweak your nose; 2. you don't eat my chocolate, which is a good thing because I would tweak your nose if you did). In the simple commitment game we're considering here, this amounts to considering all Nash equilibria instead of only subgame perfect equilibria (Nash = "no player can do better by changing their strategy" -- but I'm allowed to counterfactually tweak your nose at cost to myself if we don't actually reach that part of the game tree at e
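To make the Nash-vs-subgame-perfect distinction above concrete, here is a minimal sketch (my own illustration in Python; the payoffs are assumed, not from the comment) that enumerates the pure-strategy Nash equilibria of the chocolate/nose-tweak game's strategic form:

    from itertools import product

    # A chooses "eat" or "pass"; B's strategy is a contingent plan, "tweak" or
    # "ignore", carried out only if A eats.  Assumed payoffs (A, B):
    def payoffs(a, b):
        if a == "pass":
            return (1, 2)                              # chocolate untouched
        return (2, 1) if b == "ignore" else (0, 0)     # eaten; tweaking costs B too

    def is_nash(a, b):
        pa, pb = payoffs(a, b)
        best_a = all(payoffs(a2, b)[0] <= pa for a2 in ("eat", "pass"))
        best_b = all(payoffs(a, b2)[1] <= pb for b2 in ("tweak", "ignore"))
        return best_a and best_b

    for a, b in product(("eat", "pass"), ("tweak", "ignore")):
        if is_nash(a, b):
            print(a, b)
    # eat ignore  - the subgame-perfect outcome
    # pass tweak  - Nash but not subgame-perfect; B's threat is never actually tested

Which of the two equilibria you expect is exactly the "who is more stubborn" question - classical game theory by itself doesn't break the symmetry.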
4Vladimir_Nesov
You can't argue with a rock, so you can't stop a rock-solid commitment, even with your own rock-solid commitment. But you can solve the game given the commitments, with the outcome for each side. If this outcome is inferior to other possible commitments, then those other commitments should be used instead. So, if the hero expects that his commitment to die will still result in villain making him die, this commitment is not a good idea and shouldn't be made (for example, maybe the villain just wants to play the game). The tricky part is that if the hero expected his commitment to stop the villain, he still needs to dutifully die once the villain surprised him, to the extent this would be necessary to communicate the commitment to the villain prior to his decision, since it's precisely this communicated model of behavior that was supposed to stop him.
7Jack
I've always wondered if there are any documented instances of someone unscrewing his steering wheel and tossing it out during a game of chicken.

You seem to be taking the opposite tack as in this video, where rationality was best for everyone no matter their cause.

1RobinHanson
I usually use "rationality" the way most economists use the word, but Eliezer has chosen to use the word differently here, and I am trying to accommodate him.
1timtyler
Eliezer's proposal seems worse to me than yours in this thread - partly since it seems so irregular. I think trying to talk him down would be the most sensible strategy. "Truth-seeker" is terminology which is good enough.

A rational belief isn't necessarily correct or true. Rational beliefs are justified, in that they logically follow from premises that are accepted as true. In the case of probabilistic statements, a rational strategy is one that maximizes the chance of being correct or otherwise reaching a defined goal state. It doesn't have to work or be correct in any ultimate sense to be rational.

If I play the lottery and win, playing the lottery turned out to be a way to get lots of money. It doesn't mean that playing the lottery was a rational strategy. If I make a reasonable investment and improbable misfortune strikes, losing the money, that doesn't mean that the investment wasn't rational.
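As a quick (hypothetical) illustration of the expected-value point, with made-up numbers:

    ticket_price = 2.00
    jackpot = 10_000_000
    p_win = 1 / 300_000_000              # made-up odds, purely for illustration

    expected_value = p_win * jackpot - ticket_price
    print(round(expected_value, 2))      # -1.97: a losing bet on average, even though some ticket wins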

3grobstein
This has no bearing on the point above. In essence you're just rephrasing Robin's definition, "better believing what is true, given one's limited info and analysis resources." The disposition best-calculated to lead to true beliefs will not produce true beliefs in every instance, because true beliefs will not always be justified by available evidence. So what?
Jack 40

Places where rationality* is not welcome:

Churches, political parties, Congress, family reunions, dates, cable news, bureaucracy, casinos... *Of course rationality might dictate deception - but I take it lying confers some cost on the liar.

Please list the rest. Also, who here is involved with any of the things on the list? Am I wrong to include something, and if not, how do you deal with being rational in a place that discourages it?

0christopherj
I would say rationality is welcome in those places, conditional on it not opposing their goals. It could be argued that opposing your own goals isn't rational -- if acting rationally means you lose, is it really rationality? I guess this is another case where rationality as truth-seeking and rationality as goal-following can conflict. In fact there are many places where truth can be an enemy to varying degrees, places where incomplete truth, misleading truth, even outright lies can be advantageous to a goal. For example, in chess it is disadvantageous to explain why you made a move or what you plan to do next, even if your opponent explicitly asked you (so that you either disadvantage yourself or refuse to tell the truth). In fact in almost any non-cooperative interaction you could be disadvantaged by your opponent knowing certain things. Even when mostly cooperating, there are also non-cooperative elements. Even when you are alone, knowing the truth about your chances of success can be discouraging, and you have to account for the fact that you're not perfectly rational and so being discouraged from a course of action might mean you don't take it even if it is the best option. This is probably why self-deception for overestimating one's abilities is so rampant.
1Vaniver
I suspect that Jack is commenting on the likelihood that the condition is satisfied. The interests of organizers and participants are likely to conflict in many of those places- casinos being perhaps the most obvious example- and thus it furthers organizer-goals to insist on or encourage irrationality in participants.

Re: The word "rational" is overloaded with associations, so let me be clear: to me, more "rational" means better believing what is true, given one's limited info and analysis resources.

Ouch! A meta discussion, perhaps - but why define "rational" that way? Isn't the following much more standard?

"In economics, sociology, and political science, a decision or situation is often called rational if it is in some sense optimal, and individuals or organizations are often called rational if they tend to act somehow optimally in p... (read more)

2thomblake
Indeed - that was my first thought, but I was waiting till I figured out a good way of stating it. RH's definition of 'rational' seems to go against the usual definition presented above, while EY's seems to embrace it.

My definition differs from the one in Wikipedia because I require that your goals not call for any particular ritual of cognition. When you care more about winning than about any particular way of thinking - and "winning" is not defined in such a way as to require in advance any particular method of thinking - then you are pursuing rationality.

This, in turn, ends up implying epistemic rationality: if the definition of "winning" doesn't require believing false things, then you can generally expect to do better (on average) by believing true things than false things - certainly in real life, despite various elaborate philosophical thought experiments designed from omniscient truth-believing third-person standpoints.

Conversely you can start with the definition of rational belief as accuracy-seeking, and get to pragmatics via "That which can be destroyed by the truth should be" and the notion of rational policies as those which you would retain even given an epistemically rational prediction of their consequences.

6RobinHanson
For most people, most of the things they want do in fact prefer some ways of thinking, so your definition requires us to consider a counterfactual pretty far from ordinary experience. In contrast, defining in terms of accuracy-seeking is simple and accessible. If this site is going to use the word "rational" a lot, we'd better have a simple clear definition or we'll be arguing this definitional stuff endlessly.
3Eliezer Yudkowsky
I usually define "rationality" as accuracy-seeking whenever decisional considerations do not enter. These days I sometimes also use the phrase "epistemic rationality". It would indeed be more complicated if we began conducting the meta-argument that (a) an ideal Bayesian not faced with various vengeful gods inspecting its algorithm should not decide to rewrite its memories to something calibrated away from what it originally believed to be accurate, or that (b) human beings ought to seek accuracy in a life well-lived according to goals that include both explicit truth-seeking and other goals not about truth. But unless I'm specifically focused on this argument, I usually go so far as to talk as if it resolves in favor of epistemic accuracy, that is, that pragmatic rationality is unified with epistemic rationality rather than implying two different disciplines. If truth is a bad idea, it's not clear what the reader is doing on Less Wrong, and indeed, the "pragmatic" reader who somehow knows that it's a good idea to be ignorant, will at once flee as far as possible...
6RobinHanson
You started off using the word "rationality" on this blog/forum, and though I had misgivings, I tried to continue with your language. But most of the discussion of this post seems to be distracted by my having tried to clarify that in the introductory sentence. I predict we won't be able to get past this, and so from now on I will revert to my usual policy of avoiding overloaded words like "rationality."
0timtyler
If truth is a bad idea, it's not clear what the reader is doing on Less Wrong [...] Believing the truth is usually a good idea - for real organisms. However, I don't think rationality should be defined in terms of truth seeking. For one thing, that is not particularly conventional usage. For another, it seems like a rather arbitrary goal. What if a Buddhist claims that rational behaviour typically involves meditating until you reach nirvana? On what grounds would that claim be dismissed? That seems to me to be an equally biologically realistic goal. I think that convention has it right here - the details of the goal are irrelevant to rationality, and should be factored right out of the equation. You can rationally pursue any goal - without any exceptions.
0Johnicholas
I'm confused by the phrase "most of the things they want do in fact prefer some ways of thinking". I thought that EY was saying that he requires goals like "some hot chocolate" or "an interesting book", rather than goals like: "the answer to this division problem computed by the Newton-Raphson algorithm"
4mark_spottswood
Eliezer said: "This, in turn, ends up implying epistemic rationality: if the definition of "winning" doesn't require believing false things, then you can generally expect to do better (on average) by believing true things than false things - certainly in real life, despite various elaborate philosophical thought experiments designed from omniscient truth-believing third-person standpoints." I think this is overstated. Why should we only care what works "generally," rather than what works well in specific subdomains? If rationality means whatever helps you win, then overconfidence will often be rational. (Examples: placebo effect, dating, job interviews, etc.) I think you need to either decide that your definition of rationality does not always require a preference for true beliefs, or else revise the definition. It also might be worthwhile, for the sake of clarity, to just avoid the word "rationality" altogether in future conversations. It seems to be at risk of becoming an essentially contested concept, particularly because everyone wants to be able to claim that their own preferred cognitive procedures are "rational." Why not just talk about whether a particular cognitive ritual is "goal-optimizing" when we want to talk about Eliezer-rationality, while saving the term "truth-optimizing" (or some variant) for epistemic-rationality?
4Eliezer Yudkowsky
Maybe "truth-seeking" versus "winning", if there's a direct appeal to one and not the other. But I am generally willing to rescue the word "rationality".
4mark_spottswood
Sorry -- I meant, but did not make clear, that the word "rationality" should be avoided only when the conversation involves the clash between "winning" and "truth seeking." Otherwise, things tend to bog down in arguments about the map, when we should be talking about the territory.
1Kenny
I agree – in contexts where 'truth seeking' and 'winning' are different, we should qualify references to 'rationality'.
3CronoDAS
Regarding "rationalists should win" - that still leaves us with the problem of distinguishing between someone who won because he was rational and someone who was irrational but won because of sheer dumb luck. For example, buying lottery tickets is (almost always) a negative EV proposition - but some people do win the lottery. Was it irrational for lottery winners to have bought those specific tickets, which did indeed win? Given a sufficiently large sample, the most spectacular successes are going to be those who pursued opportunities with the highest possible payoff regardless of the potential downside or even the expected value... for every spectacular success, there are probably several times as many spectacular failures.
0timtyler
Re: Regarding "rationalists should win" - that still leaves us with the problem of distinguishing between someone who won because he was rational and someone who was irrational but won because of sheer dumb luck. Just don't go there in the first place. Attempting to increase your utility is enough.
5timtyler
A common example of where rationality and truth-seeking come into conflict is the case where organisms display their beliefs - and have difficulty misrepresenting them. In such cases, it may thus benefit them to believe falsehoods for reasons associated with signalling their beliefs to others: "Definitely on all fronts it has become imperative not to bristle with hostility every time you encounter a stranger. Instead observe him, find out what he might be. Behave to him with politeness, pretending that you like him more than you do - at least while you find out how he might be of use to you. Wash before you go to talk to him so as to conceal your tribal odour and take great care not to let on that you notice his own, foul as it may be. Talk about human brotherhood. In the end don't even just pretend that you like him (he begins to see through that); instead, really like him. It pays." * Discriminating Nepotism - as reprinted in: Narrow Roads of Gene Land, Volume 2: Evolution of Sex, p.359.
5MichaelHoward
Eek, now there's Transhuman babyeaters! I see it also says "Man needs lies like children need toys." :-)
3Eliezer Yudkowsky
Ew. Didn't know that was where it came from, just saw the demotivator.
0swestrup
While I don't necessarily agree that Man needs lies, Terry Pratchett made a very good argument for it in Hogfather:

Death: Yes. As practice, you have to start out learning to believe the little lies.
Susan: So we can believe the big ones?
Death: Yes. Justice, mercy, duty. That sort of thing.
Susan: They're not the same at all.
Death: You think so? Then take the universe and grind it down to the finest powder, and sieve it through the finest sieve, and then show me one atom of justice, one molecule of mercy. And yet, you try to act as if there is some ideal order in the world. As if there is some, some rightness in the universe, by which it may be judged.
Susan: But people have got to believe that, or what's the point?
Death: You need to believe in things that aren't true. How else can they become?
6Vladimir_Nesov
See Angry Atoms. Systems can have properties inapplicable to their components. This is not a lie.
5Annoyance
I can't agree that it's a good argument. Pratchett, through the character of Death, conflates the problem of constructing absolute standards with the 'problem' of finding material representations of complex concepts through isolating basic parts. It's the sort of alchemical thinking that should have been discarded with, well, alchemists. Of course you can't grind down reality and find mercy. Can you smash a computer and find the essence of the computations it was carrying out? The very act of taking the computer apart and reducing it destroys the relationships it embodied. Of course, you can find computation in atoms... just not the ones the computer was doing.
3swestrup
No, I don't think that Death is conflating them at all. He is saying that Mercy, Justice and the like are human constructs and are not an inherent part of the universe. In this he is completely correct. Where he goes wrong is in having only two categories "Truth" which seems to include only that which is inherent to the universe and "Lies" which he uses to hold everything else. There is no room in this philosophy for conjecture, goals, hopes, dreams, and the like. Sadly, I have met folks who, while perhaps not as extreme in their classifications as this, nevertheless have no place in their personal philosophies for unproven conjectures, potentially true statements, partially supported beliefs, and the like. They are not comfortable with areas of gray between what they know is true and what they know is false. I think the statements of Death are couched to appeal more to their philosophy than ours, but perhaps that is because Pratchett thinks such people more in need of the instruction.
Alan 20

Query: Need the quest for the truth necessarily be quixotic? Tilting at windmills would be an example of delusional activity. Isn't the quixotic then the opposite of the rational?

As I see it, rationality is much more about choosing the right things to use one's success for than it is about achieving success (in the conventional sense). Hopefully it also helps with the latter, but it may well be that rationality is detrimental to people's pursuit of various myopic, egoist, and parochial goals that they have, but that they would reject or downgrade in importance if they were more rational.

It may be possible to have it both ways, to know rationality without having it interfere with achieving happiness and other goals.

For me, rationality is a topic of interest, but I don't make a religion out of it. I cultivate a sort of Zen attitude towards rationality, trying not to grasp it too tightly. I am curious to know what the rational truth is, but I'm willing and, with the right frame of mind, able to ignore it.

I can be aware at some level that my emotional feelings are technically irrational and reflect untrue beliefs, but so what. They're true en... (read more)

I believe a 'simple' cost-benefit analysis is warranted on a case-by case basis. It is not some absolute, abstract decision. Clearly truth has a price. Sometimes the price is relatively high and other times it is lower. The reward will equally vary. Rationally speaking, truth will sometimes be worth the trouble, and sometimes not.

1subod_83
I think the problem is you can't always visualize all the costs or all the benefits.
[anonymous] 10

I endorse the claim thus parsed: "rational" means better believing what is "true" (given one's limited info and analysis resources). This introduces the social dimension and the pragmatic dimension into rationality - which should never be about "paperclips" alone, as Tim Tyler seems to suggest.

Believing what is true is not rationality, but planning what is best, based on what is true, is.

[This comment is no longer endorsed by its author]

The true cost of acting rational is the difference between acting truly rational versus acting purely rational in a situation.

First we have to make the distinction between what is factual truth or strategically optimal and what an agent believes is true or strategically optimal. For non-rational agents these are different, there is at least one instance where what they believe is not what is true or how they act is not optimal. For rationalists, what they believe has to be proven true and they must act optimally.

A situation with only rational agents, the... (read more)

[anonymous] -20

"There is also empirical evidence that high self-efficacy can be maladaptive in some circumstances. In a scenario-based study, Whyte et al. showed that participants in whom they had induced high self-efficacy were significantly more likely to escalate commitment to a failing course of action.[28] Knee and Zuckerman have challenged the definition of mental health used by Taylor and Brown and argue that lack of illusions is associated with a non-defensive personality oriented towards growth and learning and with low ego involvement in outcomes.[29] They... (read more)

I often wish I could have two brains - one which is fully, painfully aware of the truth and another which holds that set of beliefs which optimize happiness. I sometimes comfort myself with the thought that on a societal level, people like us function as that first, unhappy brain.

Altering the structure of the second brain to deal with the truth better should be a primary concern.