Rationality is about winning.
The "about" captures the expected, systematic part of winning: you are considering a model of winning, not the accidental wins themselves. It limits the scope to winning alone, leaving only secondary roles for parry, hit, spring, strike, or touch. Being a study of the real thing, rationality employs a set of tricks that make it workable in special cases and at coarse levels of detail. Being about the real thing, rationality aims to give the means for actually winning.
Wikipedia has this right:
"a rational agent is specifically defined as an agent which always chooses the action which maximises its expected performance, given all of the knowledge it currently possesses."
Expected performance. Not actual performance. Whether its actual performance is good depends on other factors: how malicious the environment is, whether the agent's priors are good, and so on.
Expected performance is what rational agents are actually maximising.
Whether that corresponds to actual performance depends on what their expectations are. What their expectations are typically depends on their history - and the past is not necessarily a good guide to the future.
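In symbols (a standard formulation, with notation I'm supplying, where K stands for the agent's current knowledge):

\[
a^* \;=\; \arg\max_a \, \mathbb{E}[U \mid a, K] \;=\; \arg\max_a \sum_o P(o \mid a, K)\, U(o)
\]

The utility of the outcome that actually occurs appears nowhere in this maximization; it enters only through P(o | a, K), which is only as good as the agent's priors and history.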
Highly rational agents can still lose. Rational actions (that follow the laws of induction and deduction applied to their sense data) are not necessarily the actions that win.
Rational agents try to win - and base their efforts on their expectations. Whether they actually win depends on whether their expectations are correct. In my view, attempts to link rationality directly to "winning" miss the distinction between actual and expected utility.
There are reasons for associations between expected performance and actual performance. Indeed, those associations are why agents have the expectations they do. However, the association is statistical in nature.
Dissect the brain of a rational agent, and it is its expected utility that is being maximised. Its actual utility is usually not something that is completely under its control.
It's important not to define the "rational action" as "the ...
Expected performance is what rational agents are actually maximising.
Does that mean that I should mechanically overwrite my beliefs about the chance of a lottery ticket winning, in order to maximize my expectation of the payout? As Nesov says, rationality is about utility; which is why a rational agent in fact maximizes their expectation of utility, while trying to maximize utility (not their expectation of utility!).
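A toy calculation of why the overwriting move fails (all numbers invented for illustration): suppose a ticket costs $1 and pays $1,000,000 with true probability 10^-7. Then

\[
\mathbb{E}_{\text{true}}[\text{payout}] \;=\; 10^{-7} \times \$1{,}000{,}000 \;=\; \$0.10.
\]

Deciding to believe the probability is 0.5 raises your expectation of the payout to $500,000, but the ticket still pays ten cents in true expectation; you have maximized a number in your head, not your utility.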
It may help to understand this and some of the conversations below if you realize that the word "try" behaves a lot like "quotation marks" and that having an extra "pair" of quotation "marks" can really make "your" sentences seem a bit odd.
The problem with that in human practice is that it leads to people defending their ruined plans, saying, "But my expected performance was great!"
It's true that people make this kind of response, but that doesn't make it valid, or mean that we have to throw away the notion of rationality as maximizing expected performance, rather than actual performance.
In the case of failed trading companies, can't we just say that despite their fantasies, their expected performance shouldn't have been so great as they thought? And the fact that their actual results differed from their expected results should cast suspicion on their expectations.
Perhaps we can say that expectations about performance must be epistemically rational, and only then can an agent who maximizes their expected performance be instrumentally rational.
Achieving a win is much harder than achieving an expectation of winning (i.e. something that it seems you could defend as a good try).
Some expectations win. Some expectations lose. Yet not all expectations are created equal. Non-accidental winning starts with something that seems good to try (can accidental winning be rational?). At least, there is some link between expe...
Personally, I think the word "win" might be the problem. Winning is very binary, which isn't how rationality is defined. Perhaps "Rationalists maximize"?
I always thought that the majority of exposition in your Newcomb example went towards, not "Rationalists should WIN", but a weaker claim which seems to be a smaller inferential distance from most would-be rationalists:
Rationalists should not systematically lose; whatever systematically loses is not rationality.
(Of course, one needs the logical caveat that we're not dealing with a pure irrationalist-rewarder; but such things don't seem to exist in this universe at the moment.)
Suggestion: "Rationalists seek to Win, not to be rational".
Suggestion: "If what you think is rational appears less likely to Win than what you think is irrational, then you need to reassess probabilities and your understanding of what is rational and what is irrational".
Suggestion: "It is not rational to do anything other than the thing which has the best chance of winning".
If I have a choice between what I define as the "Rational" course of action, and a course of action which I describe as "irrational"...
Re: "First, foremost, fundamentally, above all else: Rational agents should WIN."
In an attempt to summarise the objections, there seem to be two fairly-fundamental problems:
1. Rational agents try. They cannot necessarily win: winning is an outcome, not an action;
2. "Winning" is a poor synonym for "increasing utility": sometimes agents should minimise their losses.
"Rationalists maximise expected utility" would be a less controversial formulation.
Rationality seems like a good name for the obvious ideal that you should believe things that are true and use this true knowledge to achieve your goals. Because social organisms are weird in ways whose details are beyond the scope of this comment, striving to be more rational might not pay off for a human seeking to move up in a human world---but aside from this minor detail relating to an extremely pathological case, it's still probably a good idea.
Rationalists are the ones who win when things are fair, or when things are unfair randomly over an extended period. Rationality is an advantage, but it is not the only advantage, not the supreme advantage, not an advantage at all in some conceivable situations, and cannot reasonably be expected to produce consistent winning when things are unfair non-randomly. However, it is a cultivable advantage, which is among the things that makes it interesting to talk about.
A rationalist might be unfortunate enough that (s)he does not do well, but ceteris paribus, ...
I guess when I look over the comments, the problem with the phraseology is that people seem to inevitably begin debating over whether rationalists win and asking how much they win - the properties of a fixed sort of creature, the "rationalist" - rather than saying, "What wins systematically? Let us define rationality accordingly."
Not sure what sort of catchphrase would solve this.
It seems to me that the disagreement isn't so much about winning as the expectation.
In fact I don't really agree with this framing of winning vs. belief as modes of rationality.
Both approaches are trying to maximize their expected payout. Eliezer's approach has a wider horizon of what it considers when figuring out what the universe is like.
The standard approach is that since the content of the boxes is already determined at the time of the choice, taking both will always put you $1000 ahead.
Eliezer looks (I think) out to the most likely final outcomes. (or looks ...
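The standard (two-boxing) argument can be put in one line. Writing b for the contents of box B, taken as fixed at decision time (my notation, not the commenter's):

\[
\text{payoff}_{\text{two-box}} \;=\; b + \$1000 \;>\; b \;=\; \text{payoff}_{\text{one-box}}, \qquad b \in \{\$0,\ \$1{,}000{,}000\}.
\]

The wider-horizon view instead treats the distribution of b as depending on the decision procedure itself, which is what breaks the dominance reasoning.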
I don't think I buy this for Newcomb-like problems. Consider Omega who says, "There will be $1M in Box B IFF you are irrational."
Rationality as winning is probably subject to a whole family of Russell's-Paradox-type problems like that. I suppose I'm not sure there's a better notion of rationality.
The rationality that doesn't secure your wish isn't the true rationality.
Winning has no fixed form. You'll do whatever is needed to succeed, however original or far-fetched it might sound. How it sounds is irrelevant; how it works is the crux.
And if at first what you tried didn't work, you'll learn, adapt, and try again, making no pause for excuses. If you truly want to succeed, you'll be firm as a rock, relentless in your attempts to find the path to success.
And if your winning didn't go as smoothly or well as you wanted or thought it should, in g...
What about cases where any rational course of action still leaves you on the losing side?
Although this may seem to be impossible according to your definition of rationality, I believe it's possible to construct such a scenario because of the fundamental limitations of a human brain's ability to simulate.
In previous posts you've said that, at worst, the rationalist can simply simulate the 'irrational' behaviour that is currently the winning strategy. I would contend that humans can't simulate effectively enough for this to be an option. After all we know th...
If humans are imperfect actors, then in situations (such as a game of chicken) in which it is better to (1) be irrational and seen as irrational than it is to (2) be rational and seen as rational, the rational actor will lose.
Of course holding constant everyone else's beliefs about you, you always gain by being more rational.
Alleged rationalists should not find themselves envying the mere decisions of alleged nonrationalists, because your decision can be whatever you like.
Eliezer said this in the Newcomb's Problem post which introduced "Rationalists should win".
Perhaps for a slogan, shorten it to: "Rationalists should not envy the mere decisions of nonrationalists." This emphasizes that rationality contributes to winning through good decisions.
A potential problem is that, in some circumstances, an alleged rationalist could find a factor that seems unrela...
Both boxes might be transparent. In this case, you would see the money in both boxes only if you are rational enough to understand that you have to pick just B.
Wouldn't that be an irrational move? Not at all! You have to understand that to be rational.
It seems to me that some of the kibitzing is due to human cognitive architecture making it difficult to be both epistemologically and instrumentally rational in many contexts, e.g., expected overconfidence in social interactions, motivation issues related to optimism/pessimism, &c.
An ideal rational agent would not have this problem, but human cognition is... suboptimal.
Perhaps it's not about the ad hominem.
"Rationality is whatever wins."
If it's not a winning strategy, you're not doing it right. If it is a winning strategy, over as long a term as you can plan, then it's rationality. It doesn't matter what the person thinks, or whether they'd call themselves a rationalist or not.
Rationality is the art of the optimal.
Rationality is optimally systematized systematizing. (:-))
Rationality is winning that doesn’t generate a surprise; randomly winning the lottery generates a surprise. A good measure of rationality is the amount of complexity involved in winning, and the surprise generated by that win. If winning at a certain task requires that your method have many complex steps, and you win, non-surprisingly, then the method used was a very rational one.
All else being equal, shouldn't rationalists, almost by definition, win? The only way this wouldn't happen would be in a contest of pure chance, in which rationality could confer no advantage. It seems like we're just talking semantics here.
As an answer to my and others' constant nagging, your post feels strangely unfulfilling. Just what problems does the Art solve, and how do you check if the solutions are correct? Of course the problems can be theoretical, not real-world - this isn't the issue at all.
How about "Rationality isn't about winning"? Nod to Robin Hanson.
It is this that I intended to guard against by saying: "Rationalists should win!" Not whine, win. If you keep on losing, perhaps you are doing something wrong. Do not console yourself about how you were so wonderfully rational in the course of losing. That is not how things are supposed to go. It is not the Art that fails, but you who fails to grasp the Art.
It is similar to the scrub mentality in "Playing to Win" by Sirlin: if you are not winning consistently, then there is something wrong in your behaviours, which needs to be recognised.
I fully support "Rationality is systematized winning"; however, there may be a lack of wider market appeal in the word "systematized".
Perhaps consultation with a marketing expert* could get you the most effective results?
*(Someone expert in understanding what words and phrases are most likely to have impactful meaning to one or more groups of humans)
If one values winning above everything else, then everything that leads to winning is rational. The reductio of this: if torturing a googolplex of beings at maximum duration and increasing intensity leads to winning, then that's what must be done.
Yet... perhaps winning then is not what we should most value? Perhaps we should value destroying the thing which values torturing a googolplex of beings. What if we need to torture half of a googolplex of beings to outcompete something willing to torture a googolplex of beings? What if outcompeting such a t...
You're confusing ends with means, terminal goals with instrumental goals, morality with decision theory, and about a dozen other ways of expressing the same thing. It doesn't matter what you consider "good", because for any fixed definition of "good", there are going to be optimal and suboptimal methods of achieving goodness. Winning is simply the task of identifying and carrying out an optimal, rather than suboptimal, method.
I'm not sure if it's better, but here's one that works well. Similar to the phrase, "Physician, heal thyself!" another way to say rationalists should win is to say, "Rationalist, improve thyself!"
If you aren't actually improving yourself and the world around you, then you aren't using the tools of rationality correctly. And it follows that to improve the world around you, you first have to be in a position to do so by doing the same to yourself.
I one-box Newcomb's problem because the payoffs are too disproportionate to make it interesting. How about this: if Omega predicted you would two-box, both boxes are empty; if Omega predicted you would one-box, both boxes have $1000.
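For what it's worth, if we import the standard version's 99/100 predictor accuracy (an assumption; this variant doesn't specify one), the expected payoffs come out as:

\[
\begin{aligned}
\mathbb{E}[\text{one-box}] &= 0.99 \times \$1000 + 0.01 \times \$0 = \$990,\\
\mathbb{E}[\text{two-box}] &= 0.99 \times \$0 + 0.01 \times \$2000 = \$20,
\end{aligned}
\]

so one-boxing still wins in expectation, but by a factor of about 50 rather than about 90.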
Let's use one of Polya's 'How to Solve It' strategies and see if the inverse helps: Irrationalists should lose. Irrationality is systematized losing.
On another note, rationality can refer to either beliefs or behaviors. Does being a rationalist mean your beliefs are rational, your behaviors are rational, or both? I think behaving rationally, even with high probability priors, is still very hard for us humans in a lot of circumstances. Until we have full control of our subconscious minds and can reprogram our cognitive systems, it is a struggle to wi...
Like William, I think "winning" is the problem, though for different reasons : "winning" has extra connotations, and tends to call up the image of the guy who climbs the corporate ladder through dishonesty and betrayal rather than trying to lead a happy and fulfilling life. Or someone who tries to win all debates by humiliating his opponent till nobody wants to speak to him any more.
Winning often doesn't mean getting what you want, but winning at something defined externally, or competing with others - which may indeed not always be the ra...
"abandon reasonableness" is never necessary; though I think we may be using reasonable somewhat differently. I think "reasonable" includes the idea of "appropriate to the situation"
quoting myself : "There is a supposed "old Chinese saying": The wise man defends himself by never being attacked. Which is excellent, if incomplete, advice. I completed it myself with "But only an idiot counts on not being attacked." Don't use violence unless you really need to, but if you need to don't hold back." ht...
There are two possible interpretations of "Rationalists should win", and it's likely the confusion is coming about from the second.
One use of "should" is to indicate a general social obligation: "people should be nice to each other", and the other is to indicate a personal entitlement: "you should be nice to me." i.e., "should" = "I deserve it"
It appears that some people may be using the latter interpretation, i.e., "I'm rational so I should win" -- placing the obligation on the universe rather than on themselves.
Perhaps "Rationalists choose to win", or "Winning is better than being right"?
Rationality leads directly to effectiveness. Or: Rationality faces the truth and therefore produces effectiveness. Or: Rationality is measured by how much effectiveness it produces.
It seems that most of the discussion here is caught up on the idea that Omega being able to "predict" your decision would require reverse-time causality, which some models of reality cannot allow to exist.
Assuming that Omega is a "sufficiently advanced" powerful being, the boxes could act in exactly the way that the "reverse time" model stipulates without any such bending of causality: through technology that can destroy the contents of a box faster than human perception time, or through the classical many-worlds interpretation method ...
One problem is that "Rationalists should win" has two obvious interpretations for me:
Compare with:
and
Has it been settled then, that in this Newcomb's Problem, rationality and winning are at odds? I think it is quite relevant to this discussion whether or not they ever can be at odds.
My last comment got voted down -- presumably because whether or not rationality and winning are ever in conflict has been discussed in the previous post. (I'm a quick study and would like feedback as to why I get voted down.) However, was there some kind of consensus in the previous post? Do we just assume here that it is possible that rationality is not always the winning st...
Am I missing something? I think this answer is very simple: rationality and winning are never at odds.
(The only exception is when a rational being has incomplete information. If information tells him that the blue box has $100 and the red box has $0, and it is the other way around, it is rational for him to pick the blue box even though he doesn't win.)
I already understood what you meant by "rationalists should win", Eliezer, but I don't find Newcomb's problem very convincing as an example. The way I see it, if you one-box you've lost. You could have gotten an extra $1000 but you chose not to.
Simple: most situations in real life aren't like this. If you believe Omega and one-box, you'll lose when he's lying. If your decision theory works better in hypothetical situations and worse in real life, then it doesn't make you win.
Winning is all about choosing the right target. There will be disagreement about which target is right. After hitting the target it will sometimes be revealed that it was the wrong target. Not hitting the right target will sometimes be winning. Rationality lies in the evaluation before, during, and after aiming.
Somewhat like the game of darts.
Please ... Newcomb is a toy non-mathematizable problem and not a valid argument for anything at all. There must be a better example, or the entire problem is invalid.
There must be a better example, or the entire problem is invalid.
I've long thought that voting in general is largely isomorphic to Newcomb's. If you cop out and don't vote, then everyone like you will reason the same way and not vote, and your favored candidates/policies will fail; but if you vote then the reverse might happen; and if you then carry it one more step... If you could just decide to one-box/vote then maybe everyone else like you will.
Followup to: Newcomb's Problem and Regret of Rationality
"Rationalists should win," I said, and I may have to stop saying it, for it seems to convey something other than what I meant by it.
Where did the phrase come from originally? From considering such cases as Newcomb's Problem: The superbeing Omega sets forth before you two boxes, a transparent box A containing $1000 (or the equivalent in material wealth), and an opaque box B that contains either $1,000,000 or nothing. Omega tells you that It has already put $1M in box B if and only if It predicts that you will take only box B, leaving box A behind. Omega has played this game many times before, and has been right 99 times out of 100. Do you take both boxes, or only box B?
A common position - in fact, the mainstream/dominant position in modern philosophy and decision theory - is that the only reasonable course is to take both boxes; Omega has already made Its decision and gone, and so your action cannot affect the contents of the box in any way (they argue). Now, it so happens that certain types of unreasonable individuals are rewarded by Omega - who moves even before they make their decisions - but this in no way changes the conclusion that the only reasonable course is to take both boxes, since taking both boxes makes you $1000 richer regardless of the unchanging and unchangeable contents of box B.
And this is the sort of thinking that I intended to reject by saying, "Rationalists should win!"
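For concreteness, take Omega's 99/100 track record as the accuracy of Its prediction about you (an extrapolation; the problem states only the historical frequency). Then:

\[
\begin{aligned}
\mathbb{E}[\text{one-box}] &= 0.99 \times \$1{,}000{,}000 + 0.01 \times \$0 = \$990{,}000,\\
\mathbb{E}[\text{two-box}] &= 0.99 \times \$1{,}000 + 0.01 \times \$1{,}001{,}000 = \$11{,}000.
\end{aligned}
\]

The "unreasonable" one-boxers predictably walk away ninety times richer than the "reasonable" two-boxers.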
Said Miyamoto Musashi: "The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy's cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him."
Said I: "If you fail to achieve a correct answer, it is futile to protest that you acted with propriety."
This is the distinction I had hoped to convey by saying, "Rationalists should win!"
There is a meme which says that a certain ritual of cognition is the paragon of reasonableness and so defines what the reasonable people do. But alas, the reasonable people often get their butts handed to them by the unreasonable ones, because the universe isn't always reasonable. Reason is just a way of doing things, not necessarily the most formidable; it is how professors talk to each other in debate halls, which sometimes works, and sometimes doesn't. If a horde of barbarians attacks the debate hall, the truly prudent and flexible agent will abandon reasonableness.
No. If the "irrational" agent is outcompeting you on a systematic and predictable basis, then it is time to reconsider what you think is "rational".
For I do fear that a "rationalist" will clutch to themselves the ritual of cognition they have been taught, as loss after loss piles up, consoling themselves: "I have behaved virtuously, I have been so reasonable, it's just this awful unfair universe that doesn't give me what I deserve. The others are cheating by not doing it the rational way, that's how they got ahead of me."
It is this that I intended to guard against by saying: "Rationalists should win!" Not whine, win. If you keep on losing, perhaps you are doing something wrong. Do not console yourself about how you were so wonderfully rational in the course of losing. That is not how things are supposed to go. It is not the Art that fails, but you who fails to grasp the Art.
Likewise in the realm of epistemic rationality, if you find yourself thinking that the reasonable belief is X (because a majority of modern humans seem to believe X, or something that sounds similarly appealing) and yet the world itself is obviously Y.
But people do seem to be taking this in some other sense than I meant it - as though any person who declared themselves a rationalist would in that moment be invested with an invincible spirit that enabled them to obtain all things without effort and without overcoming disadvantages, or something, I don't know.
Maybe there is an alternative phrase to be found again in Musashi, who said: "The Way of the Ichi school is the spirit of winning, whatever the weapon and whatever its size."
"Rationality is the spirit of winning"? "Rationality is the Way of winning"? "Rationality is systematized winning"? If you have a better suggestion, post it in the comments.