
Rationally Irrational

-13 Post author: HungryTurtle 07 March 2012 07:21PM

I understand rationality to be a set of cognitive tools rather than a certain personality or genetic type. Like any other tool, it can be misused. You can kill a person with a spoon, but that is a misuse of its intended function. You can cut a pound of raw meat with a chainsaw, but that is a misuse of its intended function. Tools are designed with both intended purposes and functional limitations. Intended purposes provide the user with an understanding of how to achieve optimal impact. For example, some intended uses of a sword would be killing, disabling, acting, or training (and many more). Tools can be used outside of their intended purposes. The use might not produce optimal output, and it might even damage the tool, but it is possible. A sword can be used to cut wood, to clear shrubbery, or as a decoration; a sword could even be used as a doorstop. A doorstop is far removed from the function a sword was designed for, but it nevertheless exists as a possibility given the structure of a sword. Functional limitations are desired uses that a tool cannot meet given its structure. A sword alone cannot allow you to fly or breathe underwater, at least not without making significant alterations to its structure, rendering it no longer a sword.

Every tool exists with both intended functions and functional limitations. From reading some essays on this website, I get the impression that many members of this community view rationality as a universal tool: that no matter the conflict, a sufficient degree of rationality will provide the appropriate remedy. I would like to question this idea. I think there are both functional limitations to rationality and ways to misuse one's powers of reasoning. To address these, it is first necessary to identify the primary function of rationality.

The Function of Rationality

From reading various articles on this website, I would suggest that rationality is seen as a tool for accuracy in obtaining desired results, or as Eliezer puts it, for “winning.” I agree with this analysis. Rationality is a tool for accuracy; increased accuracy leads to successful obtainment of some desired result; obtainment of some desired result can broadly be described as “winning.” If rationality is a tool for increasing accuracy, then the question becomes “are there ever times when it is more beneficial to be inaccurate?” Or, in other words, are there times when it should be desired to lose?

Why would a person ever want to lose?

I can think of two situations where increased accuracy is detrimental: 1.) In maintaining moderation; 2.) In maintaining respectful social relations.

1.) *It is better to air on the side of caution*: The more accurate you become, the faster you obtain your goals. The faster you obtain your goals, the quicker you progress down a projected course. In some sense this is a good thing, but I do not think it is universally good. **The pleasure of winning may deter the player from the fundamental question: “Is this a game I should be playing?”** A person who grew up playing the violin from an early age could easily find themselves barreling along a trajectory that leads to a conservatory without ever addressing the fundamental question “is becoming a violinist what is going to most benefit my life?” It is easy to do something you are good at, but it is fallacious to think that just because you are good at something, it is what you should be doing. If Wile E. Coyote has taught us anything, it is that progressing along a course too fast can result in unexpected pitfalls. Our confidence in an idea, a job, a projected course has no real bearing on its ultimate benefit to us (see my comment here for more on how being wrong feels right). While we might not literally run three meters off a cliff and then fall into the horizon, is it not possible for things to be moving too fast?

2.) *“Winning” all the time causes other people narrative dissonance*: People don’t like it when someone is right about everything. It is suffocating. Why is that? I am sure that a community of dedicated rationalists will have experienced this phenomenon: relationships with family, friends, and other personal networks threatened or damaged by you having an answer for everything, every casual debate, every trivial discussion; by your being extremely good at “winning” having a negative effect on those close to you. I have a theory for why this is; it is rather extensive, but I will try to abridge it as much as possible. It is based in the sociological field of symbolic interactionism, in which individuals are constantly working to achieve role confirmation in social situations. My idea is that there are archetypes of desired roles, and that every person needs the psychological satisfaction of being cast into those roles some of the time. I call these roles “persons of interest.” The wise one, the smart one, the caring one, the cool one, the funny one: these are all roles of interest that I believe all people need the chance to act out. If in a relationship you monopolize one of these roles to the point that your relations are unable to take it on, then I believe you are hurting your relationship. If you win too much, you deprive those close to you of the chance of winning, effectively causing them anxiety.

For example, I know that when I was younger my extreme rationality placed a huge burden on my relationship with my parents. After going to college, I began to have a critique of almost everything they did. I saw a more efficient, more productive way of doing things than my parents, who had received outdated educations. For a while I was so mad that they did not trust me enough to change their lives, especially when I knew I was right. Eventually, what I realized was that it is psychologically damaging for parents when their twenty-something kid feels it is their job to show them how to live. Some of the things (like eating healthier and exercising more) I did not let go, because I felt the damages of my role reversal were less than the damages of their habits; however, other ideas, arguments, beliefs, I did let go because they did not seem worth the pain I was causing my parents. I have experienced the need to not win as much in many other relationships. Be they friends, teachers, lovers, peers, or colleagues, in general if one person monopolizes the social role of imparter of knowledge, it can be psychologically damaging to those they interact with. I believe positive coexistence is more important than achieving some desired impact (winning). Therefore I think it is important to ease up on one’s accuracy for the sake of one’s relationships.

- Honestly, I have more limitations and misuses to address, but I decided to hold off and see what the initial reception of my essay was. I realize this is a rationalist community, and I am not trying to pick a fight. I just strongly believe in moderation and wanted to share my idea. Please don't hate me too much for that.

- HungryTurtle

 

Comments (414)

Comment author: Swimmer963 07 March 2012 07:46:03PM *  19 points [-]

Your article is interesting, and a lot of the points you make are valid. In practice, LW-style rationality might well have some of the effects you describe, especially in the hands of those who use it or understand it in a limited way. However, I don't think your point is valid as a general argument. For example:

If you win too much, you deprive those close to you of the chance of winning, effectively causing them anxiety.

To me, this seems to be based on a fallacious understanding of LW-style "winning." Winning here means accomplishing your goals, and using a "rationality" toolkit to win means that you accomplish more of your goals, or accomplish them better, than you would have without those tools.

For some people, being right about everything is a goal. For some people, harmonious social relationships are a goal. For a lot of people, these are both goals, although they may be prioritized differently, i.e. a different weight may be placed on each. If the goal of being right conflicts with the goal of harmonious relationships, and harmonious relationships take priority, then according to the toolkit of "rationality", it is rational to lay off a bit and avoid threatening the self-image of your friends and family. This is certainly true for me. Being right, coming across as exceptionally smart, etc., are rather low-priority goals for me compared to making and keeping friends. (The fact that the former has always been easier than the latter may be a factor.)

Naive use of a rationality toolkit, by people who don't know their own desires, may in fact result in the kind of interpersonal conflict you describe, or in barreling too fast towards the wrong goal. That would be improper use of the tool... and if you cared to ask, the tool would be able to tell you that the use was improper. Aiming for the wrong goal is something that LW specifically warns against.

Nitpick: there's something funny up with the formatting of this article. The text is appearing smaller than usual, making it somewhat hard to read. Maybe go back to 'edit' and see if you can play around with the font size?

Comment author: HungryTurtle 09 March 2012 02:42:46PM 0 points [-]

Thank you for your comments,

For some people, being right about everything is a goal. For some people, harmonious social relationships are a goal. For a lot of people, these are both goals, although they may be prioritized differently, i.e. a different weight may be placed on each.

Thank you, your comments have helped crystallize my ideas. When I said to "rethink what game you are playing," that was a misleading statement. It would be more accurate to my idea to say that sometimes you have to know when to stop playing. The point I was trying to make is not that the goal you choose is damaging to your relations, but that winning itself is, regardless of the goal. From my experience, people don't care about what's right as much as they care about being right. Let's imagine, as you say, that your goal is social harmony. This is not an individual goal, like golf; it is a team goal. Achieving it requires both a proper method and team subordination. If you let the other players on your team play out their own strategies, then you will not win. However, because of the phenomenon I have attempted to explain above (people's need to fulfill certain ideal roles), taking the steps necessary to "win" is damaging to the other players, because it forces them to acknowledge their subordination, and thus in reality does not achieve the desired goal. Does this make sense?

It is similar to the Daoist idea of action vs. inaction. Inaction is technically a type of action, but it is also defined by existing outside of action. The type of "game" I am talking about is technically a game, but it is defined by relinquishing the power and position of control. Even if you can win, or know how to win, sometimes what people need more than winning is to attempt to win by themselves and to know that you are in it with them.

Of course there are times when it is worth more to win, but I think there are times when it is worth the risk of losing to allow others the chance to feel that they can win, even if it is a lesser win than you envision.

Thank you again for your comments.

Comment author: Swimmer963 09 March 2012 08:23:48PM *  3 points [-]

I'm glad my comment helped clarify your ideas for you. I can't say that I entirely understand your point, though.

It would be more accurate to my idea to say that sometimes you have to know when to stop playing.

Stop playing what game? Say you're with a group of friends, and you're all playing a game together, like Monopoly or something. You're also playing the "game" of social relations, where people have roles like "the smart one", "the cool one" or "the helpful one" that they want to fulfill. Do you mean that sometimes you have to know when to stop playing to win at Monopoly in order to smooth over the social relations game and prevent people from getting frustrated and angry with you? Or do you mean that sometimes you have to stop playing the social status/relations game? The former is, I think, fairly obvious. Some people get too caught up in games like Monopoly and assign more value to "winning" than to letting everyone else have fun, but that's more a failure of social skills than "rationality".

As for the latter, I'm not sure I understand what "deciding to stop playing" at social relations would mean. That you would stop trying to make yourself look good? That you would stop talking to the other people with you? More to the point, I don't think social relations is a game where one person wins over everyone else. If I got to look cool, but it meant that some of my friends didn't have fun and felt neglected, I certainly wouldn't feel like I'd won the game of social harmony.

However, because of the phenomenon I have attempted to explain above (people's need to fulfill certain ideal roles) taking the steps necessary to "win" is damaging to the other players, because it forces them to acknowledge their subordination, and thus in reality does not achieve the desired goal. Does this make sense?

This paragraph makes it sound like you're talking about social status. Yes, social status is somewhat of a zero-sum game, in that you being cooler and getting tons of attention makes everyone else a bit less cool by comparison and takes away from the attention they get. But that's in no way the goal of social harmony, at least not as I define it. In a harmonious group, no one feels neglected, and everyone enjoys themselves.

In summary, I think you may just be describing a problem that doesn't really happen to me (although, thinking back, it happened to me more back when I was 12 and didn't have good social skills). Given that intelligence and "nerdiness" are associated with poor social skills, and LW is considered a nerdy community, I can see why it wouldn't be an unreasonable assumption to think that others in the community have this problem, and are liked less by other people because they try too hard to be right. But that's most likely because they don't think of "getting along with others" or "improving their social skills" as specific goals in their own right. Anyone who does form those goals, and applies the toolkit of LW-rationality to them, would probably realize on their own that trying to be right all the time, and "winning" in that sense, would mean losing at a different and perhaps more important game.

Comment author: HungryTurtle 06 April 2012 05:15:57PM -1 points [-]

Sorry for such a late response; life really picked up this month in many amazing and wondrous ways, and I found myself lacking the time or desire to respond. Now that things have calmed down, I would like to address your response, along with all the other responses to my ideas.

Stop playing what game? ...As for the latter, I'm not sure I understand what "deciding to stop playing" at social relations would mean.

When I say "game" I am referring to a board game, a social game, a dream: really, any desired outcome. Social status is a type of game, and it was the one I thought provided the most powerful analogy, but it is not the overall point. The overall point is the social harmony you speak of. You say that in your opinion,

In a harmonious group, no one feels neglected, and everyone enjoys themselves...

I agree with this definition of harmony. The idea I am trying to express goes beyond the poor social skills you assume I am attributing to this "nerdy community" (which I am not). Beyond individually motivated goals, I am suggesting that for no one to feel neglected and everyone to enjoy themselves, it is necessary for the actor to stop trying to achieve any goal. The pursuit of any one goal-orientation automatically excludes all other potential goal-orientations. If you have an idea of what is funny or what is cool, then in attempting to actualize these ideas you are excluding all other possible interpretations of them. For no one to feel neglected and everyone to truly enjoy themselves, everyone’s ideas of happiness, security, camaraderie, humor, etc. must be met. My idea is somewhat similar to Heisenberg’s uncertainty principle, in that your intentionality makes the goal you desire unattainable. Does this make sense?

Comment author: Swimmer963 06 April 2012 11:58:45PM 1 point [-]

for no one to feel neglected and everyone to enjoy themselves, it is necessary for the actor to stop trying to achieve any goal.

Do you mean that the person in question has to just sit back and relax? That they have to stop trying to steer the direction of the conversation and just let it flow? Or that they have to focus on other people's enjoyment rather than their own enjoyment? The former doesn't feel true for me, in that having someone with good social skills and an idea of people's interests steer the conversation can make it more enjoyable rather than less so. The latter, maybe true, but I wouldn't want to live like that.

Comment author: TimS 07 March 2012 07:34:07PM *  14 points [-]

Some of the things (like eating healthier and exercising more) I did not let go, because I felt the damages of my role reversal were less than the damages of their habits; however, other ideas, arguments, beliefs, I did let go because they did not seem worth the pain I was causing my parents.

Why call this losing instead of winning-by-choosing-your-battles? I don't think members of this community would endorse always telling others "I know a better way to do that" whenever one thinks this is true. At the very least, always saying that risks being wrong because (1) you were instrumentally incorrect about what works better or (2) you did not correctly understand the other person's goals.

More generally, the thing you are labeling rationality is what we might call straw vulcan rationality. We don't aspire to be emotionless computrons. We aspire to be better at achieving our goals.

Eliezer wrote a cute piece about how pathetic Spock was to repeatedly predict things had a <1% chance of succeeding when those sorts of things always worked. As outsiders, we can understand why the character said that, but from inside Spock-the-person, being repeatedly wrong like that shows something is wrong in how one is thinking. Can't find that essay, sorry.


It doesn't bother me, but some people will be bothered by the non-standard font and spacing. I'd tell you how to fix it, but I don't actually know.

Comment author: Swimmer963 07 March 2012 07:59:49PM 8 points [-]

Reminds me of this chapter from Eliezer's fanfiction. "Winning" in the naive, common-usage-of-the-word sense doesn't always result in better accomplishing your goals, and it is sometimes "rational" to lose, which means that losing is sometimes "winning" in the LW/rationality sense.

Words are confusing sometimes!

Comment author: HungryTurtle 09 March 2012 03:12:07PM 0 points [-]

TimS,

It is always a pleasure talking! Thanks for the great link on straw vulcan rationality. Ironically, what Julia says here is pretty much the point I am trying to make:

Clearly Spock has persistent evidence accumulated again and again over time that other people are not actually perfectly rational, and he’s just willfully neglecting the evidence; the exact opposite of epistemic rationality.

Humans are irrational by nature; humans are also social by nature. There is individual health and there is social health. Because humans are irrational, oftentimes social health contradicts individual health. That is what I call rationally irrational.

Comment author: Swimmer963 09 March 2012 09:31:52PM 5 points [-]

Humans are irrational by nature; humans are also social by nature.

One: what is your evidence that humans are "irrational by nature", and how do you define this irrationality?

Two: I've found that since I started reading LW and trying to put some of its concepts into practice, my ability to handle social situations has actually improved. I am now much better at figuring out what people really want and what I really want, and then finding a way to get both without getting derailed by which options "feel high-status". The specific LW rationality toolkit, at least for me, has been VERY helpful in improving both my individual psychological health and my "social health."

Comment author: faul_sname 09 March 2012 10:16:21PM 4 points [-]

One: I think Lukeprog says it pretty well here:

“Oh my God,” you think. “It’s not that I have a rational little homunculus inside that is being ‘corrupted’ by all these evolved heuristics and biases layered over it. No, the data are saying that the software program that is me just is heuristics and biases. I just am this kluge of evolved cognitive modules and algorithmic shortcuts. I’m not an agent designed to have correct beliefs and pursue explicit goals; I’m a crazy robot built as a vehicle for propagating genes without spending too much energy on expensive thinking neurons.”

Two: Good point. Social goals and nonsocial goals are only rarely at odds with one another, so this may not be a particularly fruitful line of thought. Still, it is possible that the idea of rational "irrationality" is neglected here.

Comment author: thomblake 10 April 2012 06:21:42PM 1 point [-]

Social goals and nonsocial goals are only rarely at odds with one another

This seems implausible on the face of it, as goals in general tend to conflict. Especially to the extent that resources are fungible.

Comment author: Swimmer963 09 March 2012 10:20:22PM 1 point [-]

I agree with you on Lukeprog's description being a good one. I'm curious about whether HungryTurtle agrees with this description, too, or whether he's using a more specific sense of "irrational."

Comment author: HungryTurtle 06 April 2012 05:31:36PM 0 points [-]

Social goals and nonsocial goals are only rarely at odds with one another

Hahah, then why is smoking cool for many people? Why is binge drinking a sign of status in American colleges? Why do we pull all-nighters and damage our health in the pursuit of the perfect paper, party, or performance?

Social goals are a large portion of the time at odds with individual health goals.

Comment author: faul_sname 06 April 2012 08:05:43PM 0 points [-]

I'm probably generalizing too much from my own experience, which is social pressure to get educated and practice other forms of self-improvement. I've never actually seen anyone who considers binge drinking a good thing, so I had just assumed that was the media blowing a few isolated cases out of proportion. I could easily be wrong though.

Comment author: HungryTurtle 09 March 2012 11:27:41PM *  -2 points [-]

One: what is your evidence that humans are "irrational by nature", and how do you define this irrationality?

Do you think humans can avoid interpreting the world symbolically? I do not. The human body, the human brain is hardwired to create symbols. Symbols are irrational. If symbols are irrational, and humans are unable to escape symbols, then humans are fundamentally irrational. That said, I should have added to my above statement that humans are also rational by nature.

Comment author: [deleted] 09 March 2012 11:55:09PM 4 points [-]

humans are also rational by nature.

Humans are irrational by nature

Why isn't this just a contradiction? In virtue of what are these two sentences compatible?

Comment author: Gastogh 11 March 2012 07:30:34AM 0 points [-]

I think they're compatible in that the inaccurate phrasing of the original statement doesn't reflect the valid idea behind it. Yobi is right: it's not a clean split into black and white, though the original statement reads like it is. I think it would've been better phrased as, "There are rational sides to humans. There are also irrational sides to humans." The current phrasing suggests the simultaneous presence of two binary states, which would be a contradiction.

Comment author: Swimmer963 10 March 2012 11:13:05PM 1 point [-]

Symbols are irrational. If symbols are irrational, and humans are unable to escape symbols, then humans are fundamentally irrational.

In what sense do you mean that symbols are irrational? Is it because they only imperfectly represent the world that is "really out there?" Is there a better option for humans/hypothetical other-minds to use instead of symbols?

Comment author: HungryTurtle 06 April 2012 05:26:11PM -1 points [-]

Symbols by definition are analogies to reality. Analogies are not rationally based; they are rhetorically based. Rhetoric is by no means rational in the sense that this community uses the word. Therefore language is by definition irrational.

Is there a better option for humans/hypothetical other-minds to use instead of symbols?

No, that is my point. Humans have no other way to relate to reality. The idea of a better option is a fiction of essentialist philosophy.

Comment author: Dustin 07 March 2012 11:55:22PM 0 points [-]

I don't know if this is what you were thinking of, but here is what lukeprog wrote about Spock.

Comment author: sixes_and_sevens 09 March 2012 12:38:46PM 3 points [-]

I believe this is what he's thinking of.

Comment author: TimS 09 March 2012 01:29:47PM 2 points [-]

What kind of tragic fool gives four significant digits for a figure that is off by two orders of magnitude?

That's it. Thanks.

Comment author: Vladimir_Nesov 08 March 2012 09:54:28AM *  13 points [-]

[A]re there times when it should be desired to lose[?]

When you should "lose", "losing" is the objective, and instrumental rationality is the art of successfully attaining this goal. When you do "lose", you win. On the other hand, if you "win", you lose. It's very simple.

Comment author: Matt_Simpson 08 March 2012 06:44:53PM 3 points [-]

When you do "lose", you win. On the other hand, if you "win", you lose. It's very simple.

Cue laugh track.

Comment author: HungryTurtle 06 April 2012 04:37:55PM 0 points [-]

When you should "lose", "losing" is the objective, and instrumental rationality

Thank you for your insightful comments. I chose to call it winning to try to build off the existing terminology of the community, but that might have been a mistake. What was meant by "winning" was goal achievement; what was meant by "losing" was acting in a way that did not move towards any perceived goal. Perhaps it would be better described as having no goal.

Inaction is technically a type of action, but I think there needs to be a distinction between them. Choosing to suspend intentionality is technically a type of intentionality, but I still think there needs to be a distinction. What do you think?

Comment author: [deleted] 08 March 2012 05:32:50PM *  5 points [-]

You've confused goal-winning (LW sense) with social-winning.

Rationality is the optimal tool for goal-winning, which is always what is desirable. This relation is established by definition, so don't bother criticizing it.

You can show that our current understanding of rationality or winning does not live up to the definition, but that is not a criticism of the definition. Usually when people debate the above definition, they are taking it to be an empirical claim about Spock or some specific goal, which is not how we mean it.

EDIT: Also, "air on the side". It's "err" as in "error". Read Orwell's "politics and the english langauge".

Comment author: faul_sname 08 March 2012 11:15:36PM 4 points [-]

This relation is established by definition, so don't bother criticizing it.

This phrase worries me.

Comment author: wedrifid 09 March 2012 03:46:53AM 2 points [-]

I hope it means "If you want to criticize this relationship you must focus your criticism on the definition that establishes it".

Comment author: faul_sname 09 March 2012 07:14:26AM 0 points [-]

Yes, but consider that social winning is quite often entangled quite closely with goal-winning, and that the goal sometimes is social winning. To paraphrase a fairly important post: you only argue a point "by definition" when it's not true any other way.

Comment author: Nectanebo 09 March 2012 02:20:14PM *  1 point [-]

I agree with you that that particular sentence could have been phrased better.

But nyan_sandwich pointed out the key point: that turtle was arguing based upon a specific definition of rationality that does not mean the same thing LW refers to when it talks about rationality. Therefore, when she said the words "by definition" in this case, she was trying to make clear that arguing about it would be arguing about the definition of the word, and not about anything genuinely substantial.

Therefore it seems very unlikely that sandwich was falling into the common problem the article you linked to is referring to: saying that (a thing) is (another thing) by definition when actually the definition of the thing does not call for such a statement to be the case at all.

Yes, the wording made it seem like she may have been falling into that trap; however, I perceived that what she was actually doing was trying to inform hungry turtle that he was talking about a fairly different concept from what LW talks about, even though we use the same word (a phenomenon that is explained well in that sequence).

Comment author: HungryTurtle 06 April 2012 03:00:00PM 1 point [-]

Nectanebo,

Perhaps you can explain to me how the LW definition differs from the one I provide, because I pulled my definition from this site's terminology specifically to avoid this issue. I am willing to accept that there is a problem in my wording of this definition, but I respectfully hold the position that we are talking about the same rationality.

In my opinion, the problem is not with my concept of rationality, but that I am attacking, even if it is a mild attack, an idea that is held in the highest regard among this community. It is the dissonance of my idea that leads nyan_sandwich to see fault with it, not the idea itself. I hope we can talk this out and see what happens.

Comment author: Nectanebo 07 April 2012 04:13:02AM *  2 points [-]

I can think of two situations where increased accuracy is detrimental: 1.) In maintaining moderation; 2.) In maintaining respectful social relations.

Increased accuracy is not rationality.

Think about it this way: if you want increased accuracy, then rationality is the best way to increase accuracy. If you want to maintain social relations, then the rational choice is the choice that optimally maintains social relations.

I think LessWrong considers rationality as the art of finding the best way of achieving your goals, whatever they may be. Therefore if you think that being rational is not necessarily the best option in some cases, we are not talking about the same concept any longer, because when you attack rationality in this way, you are not attacking the same rationality that people on LessWrong refer to.

For example, it is silly for people to try to attempt to increase accuracy to the detriment of their social relationships. This is irrational if you want to maintain your social relationships, based on how LessWrong tends to use the word.

The points I make have been covered fairly well by many others who have replied in this thread. If you want to know more about what we have been trying to say, the sequence about words also covers it in detail; I personally found that particular sequence to be one of the best and most useful, and it is especially relevant to the discussion at hand.

Comment author: adamisom 21 April 2012 01:32:07AM 0 points [-]

Anything can be included in rationality after you realize it needs to be.

Or: you can always define your utility function to include everything relevant, but in real-life estimations of utility, some things just don't occur to us (at least until later). So sure, increased accuracy [to social detriment] is not rationality, once you realize it. But you need to realize it. I think HungryTurtle is helping us realize it.

So I think the real question is: *is your current model of rationality, the way you think about it right now and actually (hopefully) use it, suboptimal?*

Comment author: HungryTurtle 11 April 2012 02:59:20PM 0 points [-]

I think LessWrong considers rationality as the art of finding the best way of achieving your goals, whatever they may be.

Do you ever think it is detrimental to have goals?

Comment author: Nectanebo 12 April 2012 12:15:10PM 0 points [-]

Sure, some goals may be detrimental to various things.

But surely people have the goal of not wanting detrimental goals, if the detriment is to things they care about.

Comment author: HungryTurtle 12 April 2012 12:36:02PM 0 points [-]

Yes! So this idea is the core of my essay.

I suggest that the individual who has the goal of not wanting detrimental goals acknowledges the following:

1.) Goal-orientations (meaning the desired state of being that drives one's goals at a particular time) are dynamic.

2.) The implementation of genuine rational methodology to a goal-orientation consumes a huge amount of the individual/group's resources.

If the individual has the goal of not having detrimental goals, and if they accept that goal-orientations are dynamic and that a genuinely rational methodology consumes a huge amount of resources, then such an individual would rationally desire a system for regulating when to implement rational methodology and when to abandon it, given the potential triviality of immediate goals.

Because the individual is choosing to abandon rationality in the short-term, I label this as being rationally irrational.
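To make that concrete, here is a minimal sketch of the kind of regulation system I mean; the function name and the numbers are purely illustrative, not a worked-out theory:

```python
# A minimal sketch of the regulation system described above, under
# illustrative names and numbers: apply explicit rational methodology
# only when its expected payoff exceeds the resources it consumes.

def should_deliberate(expected_gain: float, deliberation_cost: float) -> bool:
    """Return True when explicit rational analysis is worth its cost."""
    return expected_gain > deliberation_cost

# A trivial immediate goal (low gain) does not justify heavy analysis;
# a weighty one does.
print(should_deliberate(expected_gain=0.2, deliberation_cost=1.0))  # False
print(should_deliberate(expected_gain=5.0, deliberation_cost=1.0))  # True
```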

Comment author: [deleted] 11 April 2012 03:29:15PM 0 points [-]

Do you ever think it is detrimental to have goals?

What would that even mean? Do you mean by "detrimental" something different from "making it harder to achieve your goals"?

Comment author: HungryTurtle 11 April 2012 05:34:04PM 0 points [-]

Detrimental means damaging, but you could definitely read it as damaging to goals.

So do you think it is ever damaging or ever harmful to have goals?

Comment author: TheOtherDave 11 April 2012 03:44:15PM 0 points [-]

Hm.
If I have a goal G1, and then I later develop an additional goal G2, it seems likely that having G2 makes it harder for me to achieve G1 (due to having to allocate limited resources across two goals). So having G2 would be detrimental by that definition, wouldn't it?

Comment author: HungryTurtle 06 April 2012 02:54:16PM 0 points [-]

agreed!

Comment author: HungryTurtle 06 April 2012 02:53:23PM *  -1 points [-]

Rationality is the optimal tool for goal-winning, which is always what is desirable.

You can show that our current understanding of rationality or winning does not live up to the definition, but that is not a criticism of the definition.

With all due respect, you are missing the point I am trying to make with the "erring on the side of caution" segment. I would agree that in theory goal-winning is always desirable, but as you yourself point out, the individual's understanding of rationality or winning (goal-orientation) is flawed. You imply that as time progresses the individual will slowly but surely recognize what "true winning" is. In response to this notion, I would ask:

1.) *How do you rationalize omitting the possibility that the individual will never understand what "true rationality" or "true winning" are?* What evidence do you have that such knowledge is even obtainable? If there is none, then would it not be more rational to adjust one's confidence in one's goal-orientation to include the very real possibility that any immediate goal-orientation might later be revealed as damaging?

2.) Even if we make the assumption that eventually the individual will obtain a perfect understanding of rationality and winning, how does this omit the need for caution in early-stage goal-orientation? If, given enough time, I will come to understand true rationality, then rationally shouldn't all my goals up until that point be approached with caution?

My point is that while one's methodology for achieving goals can become more and more precise, there is no way to guarantee that the bearings at which we place our goals will lead us down a nourishing (and therefore rational) path; therefore, the speed at which we achieve goals (accelerated by rationality) is potentially dangerous to achieving the desired results of those goals. Does that make sense?

Comment author: Arran_Stirton 07 April 2012 05:30:52AM 2 points [-]

1.) You should read up on what it really means to have "true rationality". Here's the thing: we don't omit the possibility that the individual will never understand what "true rationality" is. In fact, Bayes' Theorem shows that it's impossible to assign a probability of 1.0 to any theory of anything (never mind rationality); see the sketch at the end of this comment. You can't argue with math.

2.) Yes, all of your goals should be approached with caution, just like all of your plans. We're not perfectly rational beings; that's why we try to become stronger. However, we approach things with due caution. If something is our best course of action given the amount of information we have, we should take it.

Also remember, you're allowed to plan for more than one eventuality; that's why we use probabilities and Bayes' theorem in order to work out which eventualities we should plan for.
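To spell out the arithmetic behind point 1 (this is just the standard textbook derivation): if you ever assigned a hypothesis H a prior probability of exactly 1, then for any evidence E,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} = \frac{P(E \mid H) \cdot 1}{P(E \mid H) \cdot 1 + P(E \mid \neg H) \cdot 0} = 1,$$

so no possible observation could ever revise the belief. That is why probabilities of exactly 1.0 (or 0.0) are off the table.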

Comment author: Furslid 07 March 2012 09:02:58PM 5 points [-]

So, sometimes actions that are generally considered rational lead to bad results in certain situations. I agree with this.

However, how are we to identify and anticipate these situations? If you have a tool other than rationality, present it. If you have a means of showing its validity other than the rationalist methods we use here, present that as well.

To say that rationality itself is a problem leaves us completely unable to act.

Comment author: HungryTurtle 06 April 2012 04:50:29PM 0 points [-]

So, sometimes actions that are generally considered rational lead to bad results in certain situations. I agree with this.

Well said! I was not trying to attack the use of rationality as a method, but rather the immoderate use of this method. Rationality is a good and powerful tool for acting intentionally, but should there not be some regulation of its use? You state:

To say that rationality itself is a problem leaves us completely unable to act.

I would counter: To say that there is no problem with rationality leaves us completely without reason to suspend action.

As you have suggested, rationality is a tool for action. Are there not times when it is harmful to act? Are there no reasons to suspend action?

Comment author: TimS 06 April 2012 05:10:42PM 1 point [-]

Rationality is a tool for making choices. Sometimes the rational choice is not to play.

Comment author: HungryTurtle 06 April 2012 05:34:23PM 0 points [-]

Which is why I call it rational irrationality, or rationally irrational if you prefer. I do think it is possible to semantically stretch the conception of rationality to cover this, but I still think a fundamental distinction needs to be acknowledged between rationality that leads to taking control of a situation and rationality that leads to intentional inaction.

Comment author: TimS 06 April 2012 06:47:48PM *  0 points [-]

I feel like you are conflating terminal values (goals) and instrumental values (means/effectiveness) a little bit here. There's really no good reason to adopt an instrumental value that doesn't help you achieve your goals. But if you aren't sure of what your goals are, then no amount of improvement of your instrumental values will help.

I'm trying to distinguish between the circumstance where you aren't sure whether inactivity will help achieve what you want (if you want your spouse to complete a chore, should you remind them or not?) and the circumstance where you aren't sure whether inactivity is what you want (do I really like meditation or not?).

In particular, your worry about accuracy of maps and whether you should act on them or check on them seems to fundamentally be a problem about goal uncertainty. Some miscommunication is occurring because the analogy is focused on instrumental values. To push a little further on the metaphor, a bad map will cause you to end up in Venice instead of Rome, but improving the map won't help you decide if you want to be in Rome.

Comment author: Dustin 07 March 2012 11:57:13PM 2 points [-]

If the action you are engaging in is not helping you achieve your goals, then it is not rational.

You are describing a failure of rationality rather than rationality itself.

Comment author: HungryTurtle 06 April 2012 04:41:18PM 0 points [-]

What I am describing is the need for a safeguard against overly confident goal orientation.

Comment author: aliciaparr 08 March 2012 12:27:09PM 0 points [-]

I find it interesting, even telling, that nobody has yet challenged the assumptions behind the proposition "Rationality is a tool for accuracy," which would be that "rationality is the best tool for accuracy" and/or that "rationality is the sole tool that can be used to achieve accuracy."

Comment author: RichardKennaway 08 March 2012 01:27:21PM 5 points [-]

Why would someone challenge a proposition that they agree with? While I don't see that the proposition "Rationality is a tool for accuracy" presumes "Rationality is the tool for accuracy", I'd agree with the latter anyway. Rationality is the only effective tool there is, and more than merely by definition. Praying to the gods for revelation doesn't work. Making stuff up doesn't work. Meditating in a cave won't tell you what the stars are made of. Such things as observing the world, updating beliefs from experience, making sure that whatever you believe implies something about what you will observe, and so on: these are some of the things in the rationality toolbox, these are the things that work.

If you disagree with this, please go ahead and challenge it yourself.

Comment author: AspiringKnitter 11 March 2012 09:22:22PM 2 points [-]

Praying to the gods for revelation doesn't work.

Supposing that you lived in a universe where you could pray for and would then always receive infallible instruction, it would be rational to pray.

If it leads to winning more than other possibilities, it's rational to do it. If your utility function values pretending to be stupid so you'll be well-liked by idiots, that is winning.

Comment author: MagnetoHydroDynamics 12 March 2012 12:05:29AM *  5 points [-]

pretending

Key phrase. The accurate map leads to more winning. Acknowledging that X obviously doesn't work, but pretending that it does in order to win is very different from thinking X works.

ETA: It is all fine and dandy that I am getting upvotes for this, and by all means don't stop, but really I am just a novice applying Rationality 101 wherever I see fit in order to earn my black belt.

Comment author: HungryTurtle 06 April 2012 03:17:52PM 1 point [-]

The accurate map leads to more winning.

What evidence is there that the map is static? We make maps and the world transforms. Rivers become canyons; mountains become molehills (pardon the rhetorical ring; I could not resist). Given that all maps are approximations, isn't it rational to moderate one's navigation with the occasional off-course exploration, to verify that no drastic changes have occurred in the geography?

And because I feel the analogy is pretty far removed at this point, what I mean by that is: if we have charted a goal-orientation based on our map that puts us on a specific trajectory, would it not be beneficial to occasionally abandon our goal-orientation to explore other trajectories for potentially new and more lucrative paths?

Comment author: MagnetoHydroDynamics 06 April 2012 06:57:48PM 1 point [-]

The evidence that the territory is static is called Physics. The laws do not change, and the elegant counterargument against anti-inductionism is that if induction didn't work, our brains would stop working, because our brains depend on static laws.

There is no evidence whatsoever that the map is static. It should never be; you should always be prepared to update. There isn't a universal prior that lets you reason inductively about any universe.

Comment author: HungryTurtle 06 April 2012 08:18:32PM -3 points [-]

The evidence that the territory is static is called Physics

The territory is not static. Have you ever heard of quantum physics?

Comment author: [deleted] 06 April 2012 10:39:33PM 0 points [-]

Quantum physics is invariant under temporal translation too.

Comment author: Dmytry 06 April 2012 10:48:12PM *  2 points [-]

The laws don't change, by definition. If something changes, we try to figure out some invariant description of how it changes, and call that a law. We presume a law even when we don't know the invariant description (as is the case with QM and gravity combined). If there were magic in the real world, we'd do the same thing and have the same invariant laws of magic, even though the number of symmetries might be lower.

Comment author: MagnetoHydroDynamics 06 April 2012 08:47:06PM 0 points [-]

The territory is governed by unchanging, perfectly global, mathematically simple universal laws.

The Schrödinger equation does not change. Ever.

Furthermore, you can plot the time dimension as a spatial dimension and then navigate a model of an unchanging structure of world lines. That is an accepted model in General Relativity, called the Block Universe. The Block Universe is 'static'; that is, without time.

There is reason to believe the same can be done in quantum mechanics.

Comment author: Vaniver 06 April 2012 03:37:18PM 0 points [-]

would it not be beneficial to occasionally abandon our goal-orientation to explore other trajectories for potentially new and more lucrative paths?

Why would that not be part of the trajectory traced out by your goal-orientation, or a natural interaction between the fuzziness of your map and your goals?

Comment author: HungryTurtle 06 April 2012 06:26:58PM 0 points [-]

Well, you would try to have that as part of your trajectory, but what I am suggesting is that there will always be things beyond your planning, beyond your reasoning; so in light of this, perhaps we should strategically deviate from those plans every now and then to double-check what else is out there.

Comment author: Vaniver 06 April 2012 06:43:52PM 0 points [-]

I'm still confused by what you're considering inside my reasoning and outside my planning/reasoning. If I say "spend 90% of your time in the area with the highest known EV and 10% of your time measuring areas which have at least a 1% chance of having higher reward than the current highest EV, if they exist," then isn't my ignorance about the world part of my plan/reasoning, such that I don't need to deviate from those plans to double-check?
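For concreteness, here is a minimal sketch of that 90/10 policy in the style of an epsilon-greedy rule; the areas and EV numbers are made up, and the "at least a 1% chance" filter is dropped for brevity:

```python
import random

# A minimal, illustrative sketch: exploit the best-known area 90% of the
# time, and spend the other 10% measuring the alternatives.

def choose_area(estimated_ev, explore_prob=0.1):
    """Pick the highest-EV area most of the time; occasionally explore."""
    best = max(estimated_ev, key=estimated_ev.get)
    if random.random() < explore_prob and len(estimated_ev) > 1:
        # Explore: sample one of the areas that is not the current best.
        return random.choice([a for a in estimated_ev if a != best])
    return best  # Exploit the highest known expected value.

areas = {"violin": 5.0, "writing": 3.2, "travel": 2.8}
picks = [choose_area(areas) for _ in range(10000)]
print(picks.count("violin") / len(picks))  # roughly 0.9
```

The point is that the exploration step is itself part of the plan, not a deviation from it.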

Comment author: AspiringKnitter 12 March 2012 02:00:17AM 1 point [-]

It is all fine and dandy that I am getting upvotes for this, and by all means don't stop, but really I am just a novice applying Rationality 101 wherever I see fit in order to earn my black belt.

Personally, I think that behavior should be rewarded.

Comment author: MagnetoHydroDynamics 12 March 2012 02:13:33AM 1 point [-]

Personally, I think that behavior should be rewarded.

Thank you, and I share that view. Why don't we see everyone doing it? Why, I would be overjoyed if everyone were so firmly trained in Rat101 that comments like these were not special.

But now I am deviating into a should-world + diff.

Comment author: Ben_Welchner 12 March 2012 02:36:02AM *  1 point [-]

I'm pretty sure we do see everyone doing it. Randomly selecting a few posts: in The Fox and the Low-Hanging Grapes, the vast majority of comments received at least one upvote; the Using degrees of freedom to change the past for fun and profit thread has slightly more than 50% upvoted comments; and the Rationally Irrational comments also have more upvoted than not.

It seems to me that most reasonably-novel insights are worth at least an upvote or two at the current value.

EDIT: Just in case this comes off as disparaging LW's upvote generosity or average comment quality, it's not.

Comment author: AspiringKnitter 12 March 2012 02:42:17AM *  2 points [-]

Though among LW members, people probably don't need to be encouraged to use basic rationality. If we could just upvote and downvote people's arguments in real life...

I'm also considering the possibility that MHD was asking why we don't see everyone using Rationality 101.

Comment author: RichardKennaway 12 March 2012 07:47:49AM 3 points [-]

Praying to the gods for revelation doesn't work.

Supposing that you lived in a universe where you could pray for and would then always receive infallible instruction, it would be rational to pray.

I'm talking about the real world, not an imaginary one. You can make up imaginary worlds to come up with a counterexample to any generalisation you hear, but it amounts to saying "Suppose that were false? Then it would be false!"

Comment author: HungryTurtle 06 April 2012 03:09:22PM 0 points [-]

Richard,

Would you agree that the speed at which you try to do something is directly correlated with the accuracy you can produce?

I imagine the faster you try to do something, the poorer your results will be. Do you disagree?

If it is true that at times accuracy demands some degree of suspension or inaction, then I would suggest to you that tools such as praying, meditating, and "making stuff up" serve to slow the individual down, allowing for better accuracy in the long term; whereas increasing intentionality will, beyond some threshold, decrease overall results.

Does that make sense?

Comment author: RichardKennaway 06 April 2012 08:30:05PM 0 points [-]

Slowing down will only give better results if it's the right sort of slowing down. For example, slowing down to better attend to the job, or slowing down to avoid exhausting oneself. But I wasn't talking about praying, meditating, and making stuff up as ways of avoiding the task, but as ways of performing it. As such, they don't work.

It may be very useful to sit for a while every day doing nothing but contemplating one's own mind, but the use of that lies in more clearly observing the thing that one studies in meditation, i.e. one's own mind.

Comment author: HungryTurtle 11 April 2012 03:31:43PM *  1 point [-]

But I wasn't talking about praying, meditating, and making stuff up as ways of avoiding the task, but as ways of performing it. As such, they don't work.

I am suggesting that the task they perform has two levels. The first is a surface structure, defined by whatever religious or creative purpose the performer thinks they serve. In my opinion, the medium of this level is completely arbitrary. It does not matter what you pray to, or whether you meditate or pray, or play baseball for that matter. The importance of such actions comes from their deep structure, which develops beneficial cognitive, emotional, or physical habits.

Prayer is in many cultures a means of cultivating patience and concentration. The idea, which has been verified by the field of psychology, is that patience, concentration, reverence, toleration, empathy, sympathy, anxiety, serenity: these and many other cognitive dispositions are not the result of a personality type, but rather the result of intentional development.

Within the last several decades there has been a revolution within the field of psychology as to what action is. Previously, cognitive actions were not thought of as actions, and therefore not believed to be things that you develop. It was believed that some people were just born kinder, more stressed, more sympathetic, etc.; that there were cognitive types. We know now that this is not true. While everyone is probably born with a different degree of competency in these various cognitive actions (just as some people are probably born slightly better at running, jumping, or other more physical actions), more important than innate talent is the amount of work someone puts into a capacity. Someone born with a below-average disposition for running can work hard and become relatively fast. In the same way, while there are some biological grounds and limitations, for the majority of people the total level of capacity they are able to achieve in some action is determined by the amount of work they devote to improving that action. If you work out your tolerance muscles, you will become able to exhibit greater degrees of tolerance. If you work out your concentration muscle, you will be able to concentrate to greater degrees. How do you work out tolerance or concentration muscles? By engaging in tasks that require concentration or tolerance. So, does praying 5 times a day to some God have an impact on reality? Well, if you mean in the sense that a “God” listens to and acts on your prayers, no. But if you mean in the sense of the commitment to keeping a schedule and concentrating on one thing 5 times a day, then yes, it does. It impacts the reality of your cognition and consciousness.

So, returning to what I was saying about suspending action: you interpreted it as “avoiding a task,” but I would suggest that suspending action here has a deeper meaning. It is not avoiding a task, but developing competencies in caution, accepting a locus of control, limitations, and acceptance. There are more uses to meditation than just active reflection on thought. In fact, most meditation discourages thought. The purpose is to clear your mind, suggesting that there is a benefit in reducing intentionality to some degree. Now, let me be clear that what I am advocating here is very much a value-based position. I am saying there is a benefit in exercising the acceptance of limitations to some degree, a benefit in caution to some degree, etc. I would be interested to know: do you disagree?

Comment author: RichardKennaway 12 April 2012 11:17:22AM 0 points [-]

That is a lot of words, but it seems to me that all you are saying is that meditation (misspelled as "mediation" throughout) can serve certain useful purposes. So will a spade.

BTW, slowing a drum rhythm down for a beginner to hear how it goes is more difficult than playing it to speed.

Comment author: HungryTurtle 12 April 2012 12:14:12PM 0 points [-]

it seems to me that all you are saying is that meditation (misspelled as "mediation" throughout) can serve certain useful purposes.

Along with religion, praying, and making stuff up. Meditating (thanks for the correction) was just an example.

BTW, slowing a drum rhythm down for a beginner to hear how it goes is more difficult than playing it to speed.

Not exactly sure what you are trying to express here. Do you mind further explanation?

Also, I don't get the spade comment. I mean, I agree a spade has useful purposes, but what is the point of saying so here?

Comment author: [deleted] 06 April 2012 07:06:29PM 0 points [-]

Cox's theorem does show that Bayesian probability theory (around here a.k.a. epistemic rationality) is the only way to assign numbers to beliefs which satisfies certain desiderata.

Comment author: HungryTurtle 06 April 2012 03:03:47PM 0 points [-]

Aliciaparr,

This is, in a sense, the point of my essay! I define rationality as a tool for accuracy because I believed that was a commonly held position on this blog (perhaps I was wrong). But if you look at the overall point of my essay, it is to suggest that there are times when what is desired is achieved without rationality, thereby suggesting alternative tools for accuracy. As for the idea of a "best tool": as I outline in my opening, I do not think such a thing exists. A best tool implies a universal tool for some task. I think that there are many tools for accuracy, just as there are many tools for cooking. In my opinion it all depends on what ingredients you are faced with and what you want to make out of them.

Comment author: DSimon 06 April 2012 08:20:32PM 1 point [-]

Maybe think about it this way: what we mean by "rationality" isn't a single tool, it's a way of choosing tools.

Comment author: HungryTurtle 11 April 2012 03:04:55PM 0 points [-]

That is just pushing it back one level of meta-analysis. The way of choosing tools is still a tool. It is a tool for choosing tools.

Comment author: DSimon 12 April 2012 04:42:08AM 0 points [-]

I agree, and the thing about taking your selection process meta is that you have to stop at some point. If you have more than one tool for choosing tools, how do you choose which one to pick for a given situation? You'd need a tool that chooses tools that choose tools! Sooner or later you have to have a single top-level tool or algorithm that actually kicks things into motion.

Comment author: HungryTurtle 12 April 2012 12:51:12PM 1 point [-]

This is where we disagree. To have rationality be the only tool for choosing tools is to assume all meaningful action is derived from intentional transformation. I disagree with this idea, and I think modern psychology disagrees as well. It is not only possible, it is at times essential to have meaningful action that is not intentionally driven. If you accept this statement as fact, then it implies the need for a secondary system of tool choosing; more specifically, a type of emergency brake system. You have rationality as the choosing system, and then a secondary system that shuts it down when it is necessary to halt further production of intentionality.

Comment author: DSimon 12 April 2012 08:05:06PM *  1 point [-]

[I]t is at times essential to have meaningful action that is not intentionally driven.

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly. For one thing, the cerebral approach is way too slow for circumstances that require immediate reactions. There is also an aesthetic component to consider; I kind of enjoy being surprised and shocked from time to time.

Looking at a situation from the outside, how do you determine whether intentional or automatic action is best? From another angle, if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

These evaluations themselves are part of yet another tool.

Comment author: HungryTurtle 12 April 2012 09:04:03PM 0 points [-]

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly.

Yes, exactly.

if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

I think both intentional and unintentional action are required at different times. I have tried to devise a method of regulation, but as of now the best I have come up with is moderating against extremes on either end. So if it seems like I have been overly intentional in recent days, weeks, etc., I try to rely more on instinct and intuition. It is rarely the case that I am relying too heavily on the latter ^_^

Comment author: DSimon 13 April 2012 01:56:35AM 1 point [-]

So if it seems like I have been overly intentional in recent days, weeks, etc., I try to rely more on instinct and intuition.

Right, this is a good idea! You might want to consider an approach that decides which situations best require intuition and which require intentional thought, rather than aiming only to keep the balance between them even (though the latter does approximate the former to the degree that these situations pop up with equal frequency).

Overall, what I've been getting at is this: value systems in general have the property that you have to look at a bunch of different possible outcomes and decide which ones are best, which ones you want to aim for. For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values". This is true even though the human brain actually physically implements a person's values as multiple modules operating at the same time rather than as a single central dispatch.
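For instance, here's a minimal sketch of what folding several value "modules" into one top-level function might look like; the feature names and weights are purely illustrative:

```python
from typing import Callable, Dict

# Illustrative sketch: combine several valued features into one
# top-level utility function via a weighted sum.

Outcome = Dict[str, float]  # features of an outcome, each scored 0..1

def make_utility(weights: Dict[str, float]) -> Callable[[Outcome], float]:
    """Build a single scoring function from many weighted values."""
    def utility(outcome: Outcome) -> float:
        return sum(w * outcome.get(feature, 0.0)
                   for feature, w in weights.items())
    return utility

# One function now ranks outcomes, even though the values came in pieces.
u = make_utility({"social_harmony": 2.0, "being_right": 1.0})
print(u({"social_harmony": 0.9, "being_right": 0.1}))  # 1.9
print(u({"social_harmony": 0.2, "being_right": 1.0}))  # 1.4
```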

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Comment author: HungryTurtle 13 April 2012 12:26:39PM 0 points [-]

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Yeah, this is exactly what I am arguing.

For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values".

Could you explain the technical reasons more, or point me to some essays where I could read about this? I am still not convinced why it is more beneficial to have a single operating system.

Comment author: Arran_Stirton 07 April 2012 05:44:46AM 0 points [-]

If you're going to use the word rationality, use its definition as given here. Defining rationality as accuracy just leads to confusion and ultimately bad karma.

As for a universal tool for some task (i.e., updating on your beliefs)? Well, you really should take a look at Bayes' theorem before you claim that there is no such thing.
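For reference, the update rule in question, in its standard textbook form:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

One rule, applicable to revising any belief H on any evidence E; that is the sense in which it is a universal tool for the task of updating.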

Comment author: HungryTurtle 11 April 2012 03:04:11PM 0 points [-]

I am willing to look at your definition of rationality, but don't you see how it is problematic to attempt to prescribe one static definition to a word?

As for a universal tool for some task (i.e., updating on your beliefs)? Well, you really should take a look at Bayes' theorem before you claim that there is no such thing.

Ok, so you do believe that Bayes' theorem is a universal tool?