Here is a new empirical paper on folk conceptions of rationality and reasonableness:
Normative theories of judgment either focus on rationality (decontextualized preference maximization) or reasonableness (pragmatic balance of preferences and socially conscious norms). Despite centuries of work on these concepts, a critical question appears overlooked: How do people’s intuitions and behavior align with the concepts of rationality from game theory and reasonableness from legal scholarship? We show that laypeople view rationality as abstract and preference maximizing, simultaneously viewing reasonableness as sensitive to social context, as evidenced in spontaneous descriptions, social perceptions, and linguistic analyses of cultural products (news, soap operas, legal opinions, and Google books). Further, experiments among North Americans and Pakistani bankers, street merchants, and samples engaging in exchange (versus market) economy show that rationality and reasonableness lead people to different conclusions about what constitutes good judgment in Dictator Games, Commons Dilemma, and Prisoner’s Dilemma: Lay rationality is reductionist and instrumental, whereas reasonableness integrates preferences with particulars and moral concerns.
I'd like to become a more reasonable person. How do I change my mindset to make such behaviors more common?
Putting myself in the shoes of a less reasonable person, I imagine that the urge to follow one's own desires constantly overrides any reasons which others give. This doesn't feel unreasonable from the inside, because the inner apologist constantly supplies good-sounding justifications for one's actions, and it's hard to second-guess those justifications because doing so feels like giving up or losing.
I would suggest practicing noticing when you are ignoring what other people say, and using this as a trigger to consider how you can satisfy them. I would also suggest noticing when your brain is spitting out motivated justifications, and in that case, considering from scratch whether the action is getting you what you want (including satisfying others who you want to satisfy).
From my outside view, unreasonable people appear not to be exercising goal-oriented, strategic thinking. I see them as making the same mistake over and over: being inconsiderate of others when it would serve them well to be more considerate. Social interactions are very important, so I tend to try hard to show interest in other people, show concern for their concerns, and so on. (I'm not saying I'm great at it, but it's constantly on my mind.) I perceive less reasonable people as ignoring this.
It's unusual to feel less reasonable than needed, though; almost everyone sees themselves as one of the most reasonable people they know (regardless of whether they should be working on being more reasonable, or less reasonable). This may make our self-reflection inaccurate if we try to explicitly decide whether we should be more reasonable, rather than leaving it up to our System 1 to choose. Also, it seems almost impossible to convince another person to be more reasonable. They will (quite rightly, perhaps) perceive this as a status attack, or as simply mistaken.
Okay, thanks! For me, a lot of this advice makes me think I'm too impatient when others disagree with me. I'll work on it. Slow is smooth, smooth is fast.
Another stopgap measure which has helped me: when I finish a statement (maybe finishing an email, or making a point in conversation), I consider whether I've been unreasonable. An immediate correction can follow in person, or, for an email, a new revision.
(Some of that has more to do with being rational than reasonable, but the two aren't completely different, after all.)
A further comment:
This kind of afterthought-based correction eventually trickles into the first-thought reasoning to some extent, because it alters the incentive structure (you learn not to say things that you'll just end up correcting). So, it may be more useful than it sounds.
Setting out to do so is the first and hardest step, so congrats! But, of course, the work doesn't end here. As I understand it, being reasonable means being someone who can be reasoned with, i.e. someone who accepts and occasionally yields to persuasion attempts, and doesn't shut others out through obstinacy or abrasive, uncooperative treatment. In some ways it's the antonym of intransigence. It can also mean someone who possesses enough common sense to facilitate interactions based on a shared view of how the world works.
You may reduce your likelihood of showing such tendencies if you reframe social interactions that involve arguing so that being (acknowledged as) right matters less than maintaining harmony. What some people, the kind who drag out arguments in the name of truth or rightness, don't understand is that arguing takes place in a social context, is granted limited time and patience (!) before it starts getting on people's nerves (so no, it cannot be prolonged indefinitely until truth finally prevails, however long that may take), and may not be worth the hostility most of the time. Developing more empathy, and thinking about what the other person seeks in the interaction and whether you're giving it to them, may be of help.
There's a kind of trick that may be of help, but it has to be culturally shared for it to work. You know how LessWrong has some local proverbs such as the Litany of Tarski or Tsuyoku Naritai that people can invoke, but only to other LessWrongers, to remind them of shared values that should prompt an improvement in their behaviour? It would be nice if there were some appeal to being understanding or reasonable that carried the same tone of solemnity. Something that essentially means "I know I can get biased and unreasonable occasionally, but I am committed to the values underpinning collective truth-seeking, and I pledge to allow others to remind me of my commitment, and to attempt to yield when they do so". But in a pithier form.
Is this a decent summary of what you mean by 'reasonable': noticeably rational in socially acceptable ways, i.e. you use reasons and arguments that are in accordance with group norms?
A reasonable person:
Yes, I think that's an accurate, succinct definition. (Note: I spent a few minutes writing this comment thinking that there was a small difference between your statement and my intention, and ultimately decided that there wasn't.) We could make many fine distinctions in this cluster. To list several notions in this close region:
The last of these is similar to the distinction between a person who is playing to win and a scrub: a person who sees overly clever strategies as a kind of cheating, but other than that plays to win.
Another important concept is negotiability: that the decision-making process is open to scrutiny and adjustment by outsiders. This is similar to corrigibility as well.
So, should I want reasonableness or rationality to prevail whenever the rational option is outside the Overton window? My dilemma is that I find more pleasure in being rational, so rationality says I should pursue rationality, whereas the reasonable thing to do would be to side with reasonableness and shut up.
The point is: whenever I can't decide on one over the other, which criterion should I use to make the decision, since each seems to point towards itself? This is fun.
In hindsight, writing a post about Rational vs Reasonable has the unfortunate effect of prompting people to ask which is better and how to choose between them, and risks people accusing each other of being merely reasonable rather than rational, and things of that nature.
These are not good outcomes.
There's a very general issue with "X vs Y" posts, which is that they make the distinction look contentious rather than merely useful. Brienne wrote about this in connection with her Ask Culture vs Guess Culture. A similar failure mode occurs when people debate epistemic vs instrumental rationality.
As nyralech replied, the answer is to use what best serves your goals. The two are not opposed; nor are they allied; nor is it a balancing act between them. Where being reasonable does not serve rationality, the Way opposes your reasonableness; where being reasonable does serve rationality the Way opposes your unreasonableness. "The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means." etc.
I wrote a post based on this; see The Just-Be-Reasonable Predicament. The just-be-reasonable predicament occurs when, in order to be seen as reasonable, you must do something irrational or non-optimal.
I'm sorry; re-reading my comment, I think it wasn't clear. I didn't intend to ask which is better, but to raise the following question: is it possible that whenever I have to decide between rational and reasonable predominance, that decision entails an a priori decision of one over the other, since each criterion might point towards itself? It just seemed fun to think about.
By the way, I'm curious about the Way to which you are referring with a capital W. Is that something like rationality commandments?
It's something Eliezer talks about in some posts; I associate it mainly with The Twelve Virtues and this:
Some people, I suspect, may object that curiosity is an emotion and is therefore "not rational". I label an emotion as "not rational" if it rests on mistaken beliefs, or rather, on irrational epistemic conduct: "If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm."
If being reasonable is necessary to your goals, then it is already instrumentally rational to be reasonable.
I'm glad I read this post. The Overton window is such a useful concept!
This post draws ideas from Personhood: A Game for Two or More Players on Melting Asphalt.
I've been lax in my attempt to write something for LW once weekly, but I hope to approximately continue nonetheless. I still have many posts planned -- the next one after this will likely be a rationality game that we've been playing at our weekly meetups in LA.
Last time, I talked about the distinction between associated and relevant. This time I'd like to talk about another distinction which comes up in rationality-conscious communication: that of rational vs reasonable.
Rationality has to do with figuring out what you actually want, being strategic about getting it, understanding what constitutes evidence, and so on. For more information, read the entire LessWrong archive.
Reasonableness is, in contrast, a social skill. It has to do with being able to give explanations for your actions, listening to and often accepting justifications for changing those actions, playing well on a team, behaving in a reliable and predictable manner, and dealing judiciously with guilt and responsibility.
I like reasonable people. Reasonableness is very valuable. It's probably a big part of what attracts me to rationalist circles in the first place: rationalists often value reasonableness more highly, and are more careful to exercise it. Yet rational and reasonable are two very different things. The most rational people are not the most reasonable people, nor vice versa. I think it's worth examining in some detail how these two tails come apart.
Perhaps the largest difference comes from the way our explanations for behavior differ from the actual causes. Consciously or unconsciously, we engage in lies and half-truths when it comes to giving reasons for our behavior. This tendency is difficult to overcome because our brain does not ask for permission before generating these justifications. The incentives don't push us to total dishonesty, but they don't push us to total honesty, either. Evolution and our everyday social feedback conspire to make us give reasonable-sounding, socially defensible reasons in place of explaining the causes for our actions to-the-best-of-our-knowledge.
Aspiring rationalists will want to do away with some or all of this. However, this comes at a cost. To be perceived as reasonable by others, you will need to produce justifications for your beliefs and actions. Hollywood rationality would have it that a good rationalist will always have a detailed, accurate explanation at the ready. However, having a best-estimate belief does not entail being able to give a reason for it, and it's not always effective to simply explain that. Furthermore, even if we can produce explanations which are accurate as opposed to convenient, it may not be a good idea to use them. Paul Graham argues that to avoid social considerations, the quest for accurate belief is best kept personal, or shared with a few trusted friends. The idea of tell culture asserts that truth should be attempted nonetheless.
I won't try to state here what the best way to handle this is; only that a decision must be made.
Another difference between reasonable and rational is the perspective on opinions. The Overton window provides a range of reasonable opinions. Step outside of it and you are likely to be labelled unreasonable. Stay within it, however, and you're entitled to your opinion, whatever it may be. To a rationalist, you are never entitled to your opinion (not even "I don't know"). From one perspective, the Overton window has become tiny: it consists of the single correct spread of uncertainty given the evidence. (This ignores the role of priors, but in most cases it is unrealistic to claim that they play a large role.) From another perspective, the Overton window is wide open: it's wherever the evidence takes us.
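To make "the single correct spread of uncertainty given the evidence" concrete, here is a minimal Bayesian sketch (the numbers are my own illustration, not from the post): once the prior and the likelihoods are fixed, Bayes' theorem pins the posterior down exactly, leaving no range of opinion to be "entitled" to. For example, with prior $P(H) = 0.5$ and likelihoods $P(E \mid H) = 0.8$, $P(E \mid \neg H) = 0.2$:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} = \frac{0.8 \times 0.5}{0.8 \times 0.5 + 0.2 \times 0.5} = 0.8.$$

Anyone who shares these inputs but reports a different credence is, on this view, already outside the (tiny) window.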
So far, I've only discussed the reasonableness of personal beliefs and actions. The main use-case of reasonable behavior, though, is coordinating group action. Reasons are a currency which is exchanged for favors. If I want you to turn down your music, I can explain to you that I dislike it. I reciprocate by responding to similar requests from you. Reasonableness does not compel me to respond if your justification is missing, or lacking, however. There seems to be an intuitive scale by which we compare the size of the reason and the size of the favor. With no reason whatsoever, I may agree to small tasks, but will refuse most things. If a person's life is in danger, almost any request for aid is seen as justifiable.
Reason appears to be a common standard applied for group coordination. Reason isn't about the coordination norm in itself; for example, driving on the wrong side of the road as a result of forgetting what country I'm in isn't unreasonable. Rather, if I'm told that I am driving on the wrong side of the road, and still do it, that's unreasonable. Another example is fair allocation. Without using words, resources could be split with nonverbal signals of displeasure (and when needed, threats of violence); this calls to mind the reaction of a monkey when it sees another monkey given a larger reward for the same task. When reasons can be exchanged, however, more sophisticated coordination can occur. The group can agree to give bonuses for good behavior and withdraw resources for bad. Favors or slights can be remembered and brought up later (and we develop a self-serving bias, keeping track of all the reasons others owe us, as a strategy to game this system). These reputations can spread by word of mouth.
The use of reason becomes a matter of reputation in itself, as well. We like reasonable people, and behave favorably toward them. We dislike unreasonable people, and find ways to punish them. Accepting reasons is like cooperating in a game of societal prisoner's dilemma; tit-for-tat will be among the common strategies, making would-be defectors wary. Many different patterns of cooperation and defection can emerge as strategies in different situations, however. This also becomes intertwined with other status games; a low-status person may be obliged to accept almost any reason given by higher-status individuals, while the higher-status ignore good reasons with impunity.
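As a toy illustration of the prisoner's dilemma point, here is a short sketch of tit-for-tat against an unconditional defector (the payoff numbers and strategy code are my own assumptions, not anything from the post); it shows why defection stops paying after the first round:

```python
# Toy iterated prisoner's dilemma: why tit-for-tat makes would-be defectors wary.
# Payoffs are the standard illustrative values, chosen here as an assumption.

PAYOFFS = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []   # each entry: (own move, opponent's move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

if __name__ == "__main__":
    # Mutual tit-for-tat sustains cooperation: 30 vs 30 over 10 rounds.
    print(play(tit_for_tat, tit_for_tat))
    # A defector gains only on the first round against tit-for-tat: 14 vs 9.
    print(play(always_defect, tit_for_tat))
```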
What constitutes a good reason will depend on group norms. Reason can be applied to these norms themselves, producing a further-refined group standard. Perhaps we can see rationality as an extremely refined standard of this kind. Reasonable people throughout time gradually built up a picture of what kinds of reasons can be given, drawing a line in the sand between logic and fallacies. Continued disagreements called for further and further refinements. Probability theory and notions of induction became necessary. Foundational problems arose as we continued to recursively ask for the reasons behind our reasons. Bayesian thought rose and fell and rose again. Now we find ourselves discussing meta-ethics and advanced decision theories.