This parallels a discussion I've had numerous times in the field of computer games. I've had any number of artists / scripters / managers say that what a computer game needs is not a realistic physics engine, but a cinematic physics engine. They don't want it to be right, they want it to be pretty.
But you'll find that "cinematic style" isn't consistent, and if you start from that basis you won't be able to make boring, everyday events look realistic; you'll have to add special-case patch upon patch, and you'll never get it right in the end. The cinematic stuff will look right, but nothing else will.
If you start with a rigidly correct physics engine (or at least, one within the current state of the art), you'll find it MUCH easier to layer cinematic effects on top when asked for. It's usually far simpler than the other way around.
In an analogous way, I find that rationality makes it far easier to achieve one's goals, EVEN WHEN SAID GOALS ARE NON-RATIONAL. Now, that may mean that the rational thing to do in some cases is to lie to people about your beliefs, or to present yourself in an unnatural way. If you end up being uncomfortable with that, then you need to reassess what, exactly, your goals are and what you are willing to do to achieve them. This may not be easy, but it's far simpler than going the route of ignorance and emotionally driven actions and then trying to put your life back together when you don't end up where you thought you would.
I've always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal be rational or irrational). One who acts rationally, given their information set, will choose the best plan of action for achieving their goal. Part of being rational is knowing which goals will maximize one's utility function.
My definition of truth-seeking is basically Robin's definition of "rational." I find it hard to imagine a case where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?
Well, sure. Repeating other posts - but one of the most common examples is when an agent's beliefs are displayed to other agents. Imagine that all your associates think that there is a Christian god. This group includes all your prospective friends and mates. Do you tell them you are an agnostic/atheist - and that their views are not supported by the evidence? No, of course not! However, you had better not lie to them either - since most humans lie so poorly. The best thing to do is probably to believe their nonsense yourself.
We're talking about at least two different notions of the word "rational":
1. Robin Hanson used the definition at the top of this post, regarding believing the truth. There are social/evolutionary costs to that, partly because humans lie poorly.
2. The causal decision theorists' definition that Eliezer Yudkowsky was annoyed by. CDT defines rationality to be a specific method of deciding what action to take, even though this leads to two-boxing (and thereby losing) Newcomb's problem. Yudkowsky's objection, summarized by the slogan "Rationalists should WIN," was NOT a definition. It is a quality of his informal concept of rationality which the CDT definition failed to capture. (A rough expected-value sketch of the Newcomb payoffs follows below.)
The claim "rationalists should always win" comes from taking Yudkowsky's slogan as a definition of rationality. If that is the definition that you are using, then the claim is tautological.
Please note that I don't endorse this misreading of Yudkowsky's post, I'm just trying to answer your question.
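To make the payoff comparison concrete, here is a minimal sketch with assumed payoffs and an assumed predictor accuracy (none of these numbers come from the original posts) of why the CDT-endorsed two-boxing "loses" in expectation against a reliable predictor:

```python
# Newcomb's problem, expected-value comparison (illustrative numbers only).
# Box A is transparent and always holds $1,000; box B holds $1,000,000
# only if the predictor foresaw that you would take box B alone.

PREDICTOR_ACCURACY = 0.99   # assumed reliability of the predictor
SMALL = 1_000               # contents of box A
BIG = 1_000_000             # contents of box B if one-boxing was predicted

# One-boxing: you get BIG only when the predictor correctly foresaw one-boxing.
ev_one_box = PREDICTOR_ACCURACY * BIG

# Two-boxing: you always get SMALL, plus BIG only when the predictor was wrong.
ev_two_box = SMALL + (1 - PREDICTOR_ACCURACY) * BIG

print(f"E[one-box] = ${ev_one_box:,.0f}")   # ~ $990,000
print(f"E[two-box] = ${ev_two_box:,.0f}")   # ~ $11,000
```

Under these assumed numbers the one-boxer walks away with far more in expectation; the dispute is over whether maximizing that expectation is what "rational" should mean.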
Willful stupidity is often easier and more profitable in the short run, but you just might be picking up pennies in front of a steamroller.
I think it's best to go out of your way to believe the truth, even though you won't always succeed. I'm very suspicious when I'm tempted to do otherwise; it's usually for unwise or unhealthy reasons. There are exceptions, but they're much rarer than we'd like to think.
Learning many true facts that are not Fun and are morally irrelevant (e.g. spending your whole life memorizing as many digits of pi as you can), just so you can avoid thinking about facts that are much less certain, shouldn't be considered rational. Rationality intrinsically needs to serve a purpose; the necessity for this is implicit even in apparently goal-neutral definitions like the one Robin gave in the post.
Another problem, of course, is that you don't know the cost of irrationality if you are irrational.
Are commitment mechanisms rational?
A malicious genius is considering whether to dose the dashing protagonist with a toxin. The toxin is known to be invariably fatal unless counteracted, and the malicious genius has the only antidote. The antagonist knows that the protagonist will face a choice: either open a specific locked box containing, among other things, the antidote (surviving, but furthering the antagonist's wicked plan), or refuse to open the box (dying, and foiling the plan).
We analyze this as an extensive form game: The antagonist has a choice to ...
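The comment above is truncated, but for illustration here is a minimal backward-induction sketch under assumed payoffs (the numbers and function names are mine, not the commenter's): the antagonist only benefits from dosing if he expects the protagonist to open the box, so a protagonist who can credibly commit to refusing deters the dosing in the first place.

```python
# A minimal extensive-form sketch of the poisoning game (payoffs are assumed,
# purely illustrative numbers). The antagonist moves first (dose or don't dose);
# the protagonist then chooses (open the box or refuse).
# Payoffs are (antagonist, protagonist).

PAYOFFS = {
    ("dose", "open"):   (10, 1),     # plan furthered, protagonist survives
    ("dose", "refuse"): (-10, -10),  # plan foiled, protagonist dies
    ("no dose", None):  (0, 5),      # nothing happens
}

def protagonist_best_reply(committed_to_refuse: bool) -> str:
    """Protagonist's choice after being dosed."""
    if committed_to_refuse:
        return "refuse"
    # Without commitment, the protagonist picks the ex-post better payoff.
    return max(["open", "refuse"], key=lambda a: PAYOFFS[("dose", a)][1])

def antagonist_choice(committed_to_refuse: bool) -> str:
    """Antagonist anticipates the protagonist's reply (backward induction)."""
    reply = protagonist_best_reply(committed_to_refuse)
    dose_payoff = PAYOFFS[("dose", reply)][0]
    no_dose_payoff = PAYOFFS[("no dose", None)][0]
    return "dose" if dose_payoff > no_dose_payoff else "no dose"

print(antagonist_choice(committed_to_refuse=False))  # "dose": opening is ex-post rational
print(antagonist_choice(committed_to_refuse=True))   # "no dose": commitment deters dosing
```

This is the usual sense in which a commitment mechanism can be rational: it changes the other player's best move before your own "irrational-looking" choice ever has to be carried out.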
You seem to be taking the opposite tack from the one in this video, where rationality was best for everyone no matter their cause.
A rational belief isn't necessarily correct or true. Rational beliefs are justified, in that they logically follow from premises that are accepted as true. In the case of probabilistic statements, a rational strategy is one that maximizes the chance of being correct or otherwise reaching a defined goal state. It doesn't have to work or be correct in any ultimate sense to be rational.
If I play the lottery and win, playing the lottery turned out to be a way to get lots of money. It doesn't mean that playing the lottery was a rational strategy. If I make a reasonable investment and improbable misfortune strikes, losing the money, that doesn't mean that the investment wasn't rational.
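To make the asymmetry concrete, here is a tiny expected-value sketch with made-up numbers (the prices and probabilities are assumptions for illustration only):

```python
# Illustrative expected-value arithmetic (all numbers assumed for the example):
# a lottery ticket can "work out" and still be the irrational choice,
# while a sound investment can lose money and still have been rational.

ticket_price = 2.0
jackpot = 10_000_000.0
p_win = 1 / 100_000_000

ev_lottery = p_win * jackpot - ticket_price          # about -$1.90 per ticket

investment = 1_000.0
p_misfortune = 0.05                                  # improbable total loss
ev_investment = (1 - p_misfortune) * investment * 1.07 + p_misfortune * 0.0 - investment

print(f"lottery ticket EV: {ev_lottery:+.2f}")       # negative despite the possible windfall
print(f"investment EV:     {ev_investment:+.2f}")    # positive despite the possible loss
```

The lottery ticket has negative expected value even though it occasionally pays off; the investment has positive expected value even though it occasionally fails, which is exactly the distinction the comment draws.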
Places where rationality* is not welcome:
Churches, political parties, Congress, family reunions, dates, cable news, bureaucracy, casinos...
*Of course rationality might dictate deception, but I take it lying confers some cost on the liar.
Please list the rest. Also, who here is involved with any of the things on the list? Am I wrong to include something, and if not, how do you deal with being rational in a place that discourages it?
Re: The word "rational" is overloaded with associations, so let me be clear: to me, more "rational" means better believing what is true, given one's limited info and analysis resources.
Ouch! A meta discussion, perhaps - but why define "rational" that way? Isn't the following much more standard?
"In economics, sociology, and political science, a decision or situation is often called rational if it is in some sense optimal, and individuals or organizations are often called rational if they tend to act somehow optimally in p...
My definition differs from the one in Wikipedia because I require that your goals not call for any particular ritual of cognition. When you care more about winning than about any particular way of thinking - and "winning" is not defined in such a way as to require in advance any particular method of thinking - then you are pursuing rationality.
This, in turn, ends up implying epistemic rationality: if the definition of "winning" doesn't require believing false things, then you can generally expect to do better (on average) by believing true things than false things - certainly in real life, despite various elaborate philosophical thought experiments designed from omniscient truth-believing third-person standpoints.
Conversely you can start with the definition of rational belief as accuracy-seeking, and get to pragmatics via "That which can be destroyed by the truth should be" and the notion of rational policies as those which you would retain even given an epistemically rational prediction of their consequences.
Query: Need the quest for the truth necessarily be quixotic? Tilting at windmills would be an example of delusional activity. Isn't the quixotic then the opposite of the rational?
As I see it, rationality is much more about choosing the right things to use one's success for than it is about achieving success (in the conventional sense). Hopefully it also helps with the latter, but it may well be that rationality is detrimental to people's pursuit of various myopic, egoist, and parochial goals that they have, but that they would reject or downgrade in importance if they were more rational.
It may be possible to have it both ways, to know rationality without having it interfere with achieving happiness and other goals.
For me, rationality is a topic of interest, but I don't make a religion out of it. I cultivate a sort of Zen attitude towards rationality, trying not to grasp it too tightly. I am curious to know what the rational truth is, but I'm willing and, with the right frame of mind, able to ignore it.
I can be aware at some level that my emotional feelings are technically irrational and reflect untrue beliefs, but so what. They're true en...
I believe a 'simple' cost-benefit analysis is warranted on a case-by-case basis. It is not some absolute, abstract decision. Clearly truth has a price. Sometimes the price is relatively high and other times it is lower. The reward will vary just as much. Rationally speaking, truth will sometimes be worth the trouble, and sometimes not.
I endorse it parsed thus: "rational" means better believing what is "true" (given one's limited info and analysis resources). This introduces the social dimension and the pragmatic dimension into rationality - which should never be about "paperclips" alone, as Tim Tyler seems to suggest.
Believing what is true is not rationality, but planning what is best, based on what is true, is.
The true cost of acting rationally is the difference between acting truly rationally and acting purely rationally in a given situation.
First we have to make the distinction between what is factually true or strategically optimal and what an agent believes is true or strategically optimal. For non-rational agents these differ: there is at least one instance where what they believe is not what is true, or where how they act is not optimal. For rationalists, what they believe has to be proven true and they must act optimally.
A situation with only rational agents, the...
"There is also empirical evidence that high self-efficacy can be maladaptive in some circumstances. In a scenario-based study, Whyte et al. showed that participants in whom they had induced high self-efficacy were significantly more likely to escalate commitment to a failing course of action.[28] Knee and Zuckerman have challenged the definition of mental health used by Taylor and Brown and argue that lack of illusions is associated with a non-defensive personality oriented towards growth and learning and with low ego involvement in outcomes.[29] They...
I often wish I could have two brains - one which is fully, painfully aware of the truth and another which holds that set of beliefs which optimize happiness. I sometimes comfort myself with the thought that on a societal level, people like us function as that first, unhappy brain.
Altering the structure of the second brain to deal with the truth better should be a primary concern.
The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources.
Rationality certainly can have instrumental advantages. There are plenty of situations where being more rational helps one achieve a wide range of goals. In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational. In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., after explaining that success via other observable factors.
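As a purely illustrative sketch of that "residual" idea (synthetic data and hypothetical variable names, not anything from the post): regress success on the observable factors and treat what is left over as a rough proxy for belief-mediated success.

```python
# Toy sketch of "residual" belief-mediated success (all data synthetic).
import numpy as np

rng = np.random.default_rng(0)
n = 200
iq = rng.normal(100, 15, n)              # observable factor
starting_wealth = rng.normal(50, 10, n)  # observable factor
rationality = rng.normal(0, 1, n)        # unobserved trait we'd like to estimate
success = 0.3 * iq + 0.5 * starting_wealth + 5.0 * rationality + rng.normal(0, 2, n)

# Ordinary least squares of success on the observables (plus an intercept).
X = np.column_stack([np.ones(n), iq, starting_wealth])
coefs, *_ = np.linalg.lstsq(X, success, rcond=None)
residual = success - X @ coefs

# In this toy setup the residual correlates strongly with the unobserved trait.
print(np.corrcoef(residual, rationality)[0, 1])
```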
But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us. So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.
Yes, rationality might help you win that game or argument, get promoted, or win her heart. Or more rationality for you might hinder those outcomes. If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends. In fact we often know it is not.
The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful. These are actually pretty likely outcomes in many identifiable situations. You may think you want to know the truth no matter what, but how sure can you really be of that? Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for.
Be warned; the truth just is what it is. If just knowing the truth is not reward enough, perhaps you'd be better off not knowing. Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics? Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?