Nectanebo comments on Rationally Irrational - Less Wrong
increased accuracy is not rationality
Think about it this way: if you want increased accuracy, then rationality is the best way to increase accuracy. If you want to maintain social relations, then the rational choice is the choice that optimally maintains social relations.
I think LessWrong considers rationality the art of finding the best way of achieving your goals, whatever they may be. Therefore, if you think that being rational is not the best option in some cases, we are no longer talking about the same concept: when you attack rationality in this way, you are not attacking the rationality that people on LessWrong refer to.
For example, it is silly for people to attempt to increase accuracy to the detriment of their social relationships. This is irrational if you want to maintain your social relationships, based on how LessWrong tends to use the word.
The points I make have been covered fairly well by many others who have replied in this thread. If you want to know more about what we may have been trying to say, the sequence about words also covers it in detail. I personally found that sequence to be one of the best and most useful, and it is especially relevant to the discussion at hand.
Anything can be included in rationality after you realize it needs to be.
Or: You can always define your utility function to include everything relevant, but in real life estimations of utility, some things just don't occur to us (at least until later). So sure, increased accuracy [to social detriment] is not rationality. Once you realize it.* But you need to realize it. I think HungryTurtle is helping us realize it.
So I think the real question is: *is your current model of rationality, the way you think about it right now and actually (hopefully) use it, suboptimal?*
Do you ever think it is detrimental to have goals?
Sure, some goals may be detrimental to various things.
But surely people have the goal of not wanting detrimental goals, if the detriment is to things they care about.
Yes! So this idea is the core of my essay.
I suggest that the individual who has the goal of not wanting detrimental goals acknowledges the following:
1.) Goal-orientations (meaning the desired state of being that drives one's goals at a particular time) are dynamic.
2.) The implementation of genuine rational methodology to a goal-orientation consumes a huge amount of the individual/group's resources.
If the individual has the goal of not having detrimental goals, and if they accept that goal-orientations are dynamic, and that a genuinely rational methodology consumes a huge amount of resources, then such an individual would rationally desire a system of regulating when to implement rational methodology and when to abandon rational methodology due to the potential triviality of immediate goals.
Because the individual is choosing to abandon rationality in the short-term, I label this as being rationally irrational.
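The regulating system described above can be sketched as a simple decision rule, under the (purely illustrative) assumption that goal priority and deliberation cost can be placed on a single comparable scale; all names and numbers here are hypothetical, not anything from the post:

```python
# Hypothetical sketch of the "regulating system" described above:
# apply full rational methodology only when a goal's stakes justify
# its resource cost; otherwise fall back on a cheap heuristic.

def choose_action(goal_priority, deliberation_cost, careful_plan, quick_heuristic):
    """Return a plan, spending deliberation effort only on high-stakes goals.

    goal_priority     -- how much the goal matters to overarching goals (L1)
    deliberation_cost -- resources consumed by full rational methodology
    careful_plan      -- callable producing the carefully optimized plan (L2)
    quick_heuristic   -- callable producing a cheap, "good enough" plan
    """
    if goal_priority > deliberation_cost:
        return careful_plan()    # stakes justify full rational methodology
    return quick_heuristic()     # "rationally irrational": skip deliberation

# A trivial goal (a pickup soccer game) gets the heuristic;
# a major goal (avoiding permanent injury) gets careful deliberation.
print(choose_action(1, 5, lambda: "optimize", lambda: "wing it"))   # wing it
print(choose_action(10, 5, lambda: "optimize", lambda: "wing it"))  # optimize
```

The point of the sketch is only that the decision to skip deliberation is itself made rationally, at the higher level.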
Let's play this out with an example.
Imagine I have a goal of running a marathon. To do that, I run every day to increase my endurance. One day, I trip and fall, twisting my ankle. My doctor tells me that if I run on the ankle, I will cause myself permanent injury. Using my powers of rationality, I decide to stop running until my ankle has healed, to avoid the permanent injury that would prevent me from achieving my goal of running a marathon.
Is my decision to stop training for the marathon, which inevitably moves my goal of running in a marathon further away, "rationally irrational"? Or is there something wrong with my example?
No, your example is fine, but I would say it is the most elementary use of this idea. When faced with a serious threat to health it is relatively easy and obvious to realign goal-orientation. It is harder to make such realignments prior to facing serious damage or threats. In your example, a more sophisticated application of this idea would theoretically remove the possibility of twisting an ankle during training, excluding any extreme circumstances.
I imagine this might raise a lot of questions so let me explain a little more.
Training is not serious. The purpose of training is to prepare for a race, but that purpose is subsumed under the larger purpose of personal health, happiness, and survival. Therefore, any training one does should always be taken within the context of being trivial in light of these overarching goals. Having this mindset, I do not see how a runner could sprain their ankle, barring extreme circumstances.
To me, a runner who takes these overarching values into account, someone who has integrated my idea, would never prioritize a race to the point that they risk spraining their ankle in training. Of course there are bizarre situations that are hard or impossible to plan for, but tripping and twisting your ankle does not seem to be one of them.
That kinda falls apart, because it's not irrational if it's rational not to consume too much of your resources on "rational methodology". I guess "rationally irrational" is just a bad label, because you're not abandoning rationality; you're doing the rational thing by choosing not to use too much of your resources when it's better not to. So at no point are you doing anything that could be considered irrational.
Let’s say I am playing soccer. I have decided that any goal-orientation within my soccer game is ultimately not worth the expenditure of resources beyond X amount. Because of this I have tuned out my rational calculating of how to best achieve a social, personal, or game-related victory. To anyone who has not appraised soccer related goal-orientations in this way, my actions would appear irrational within the game. Do you see how this could be considered irrational?
I definitely understand how this idea can still be understood as rational; it is because of that that I called it 'rationally irrational,' implying the actor never truly abandons rationality. The reason I chose to word it this way, instead of finding some other way to label it as meta-rationality, is rhetorical. This community targets a relatively small demographic of thinkers: individuals who have both the capacity and the work history to achieve upper levels of rationality. Perhaps this demographic is the majority within this blog, but I thought it highly possible that there existed Less Wrong members who were not quite at that level, and that the idea would be more symbolically appealing if it suggested an element of necessary irrationality within the rationalist paradigm. Maybe this was a poor choice, but it was what I chose to do.
That is a good assessment. Saying something false constitutes exceptionally bad rhetoric here.
I still don't think what I said is false; it is a rhetorical choice. Saying it is rational irrationality still makes sense; it just hits some buzzwords for this group and is less appealing than some other label would be.
No, it doesn't. It's a blatant contradiction, which is by definition false.
Also:
Yes, someone could consider it irrational, and that person would be wrong.
Rational Irrationality is talking about rationality within two different levels of analysis. As a result of being rational at the level of goal prioritization, the individual abandons rational methodology at the level of goal achievement.
L1 - Goal Prioritization
L2 - Goal Achievement
If I am at a party, I have desired outcomes for my interactions and experiences that produce goals. In prioritizing my goals, I am not abandoning them, but placing them in the context of having desires that exist outside of that immediate situation. I am still trying to achieve my goals, but by correctly assessing their relevance to overarching goals, I either prioritize or de-prioritize them. If I de-prioritize my party goals, I limit the effort I put into achieving them. So even if I could think of more potent and effective strategies for achieving my party goals, I have abandoned those strategies.
L1 rationality limits L2 rationality within a low-priority goal context, rationally condoning the use of irrational methods in minor goal achievement.
Good, now that you've realised that, perhaps you might want to abandon that name.
The idea of using your time and various other resources carefully and efficiently is a good virtue of rationality. Framing it as being irrational is inaccurate and kinda incendiary.
Here is my reasoning for choosing this title. If you don't mind, could you read it and tell me where you think I am mistaken?
I realize that saying 'rationally irrational' appears to be a contradiction. However, the idea is talking about the use of rational methodology at two different levels of analysis. Rationality at the level of goal prioritization potentially results in the adoption of an irrational methodology at the level of goal achievement.
L1 - Goal Prioritization
L2 - Goal Achievement
L1 rationality can result in a limitation of L2 rationality within a low-priority goal context. Let's say that someone was watching me play a game of soccer (since I have been using the soccer analogy). As they watched, they might critique the fact that my strategy was poorly chosen, and that the overall effort exerted by me and my teammates was lackluster. To this observer, who considers themselves a soccer expert, it would be clear that my and my team's performance was subpar. The observer took note of all our flaws and inefficient habits, then after the game wrote them up to present to us. Upon telling me all these insightful critiques, the observer is shocked to hear that I am grateful for his effort, but am not going to change how I or my team plays soccer. He tries to convince me that I am playing wrong, that we will never win the way I am playing. And he is correct. To any knowledgeable observer, I was playing the game of soccer poorly, even irrationally. Without knowledge of L1 (which is not observable), the execution of L2 (which is observable) cannot be deemed rational or irrational, and in my opinion will appear irrational in many situations.
Would you say that to you it appears irrational that I have chosen to label this idea 'rationally irrational'? If that is correct, I would suggest that I have some L1 that you are unaware of, and that while my labeling is irrational in regard to L2 (receiving high karma points / recognition in publishing my essay on your blog), I have de-prioritized this L2 for the sake of my L1. What do you think?
I think you're welcome to have whatever goals you like, and so are the soccer players. But don't be surprised if the soccer players, acknowledging that your goal does not in fact seem to be at all relevant to anything they care about, subsequently allocate their resources to things they care about more and treat you as a distraction rather than as a contributor to their soccer-playing community.
What would you say if I said caring about my goals in addition to their own goals would make them a better soccer player?
Can you be more concrete with your soccer example. I don't understand what you mean.
In a game of soccer, you could want to improve teamwork, you could want to win the game, you could want to improve your skills, or you could want to make a good impression. All of these are potential goals of a game of soccer. There is a group of objectives that would most accurately achieve each of these possible goals. I am suggesting that, for each goal, achieving it to the utmost level requires an objective with relatively high resource demands.
Is that better?
An observer who thinks you are being stupid for not committing all possible effort to achieving your goal in the game (for example, impressing others) needs a justification for why achieving this goal is that important. In the absence of background like "this is the only chance for the scout from the professional team to see you play, sign you, and cause you to escape the otherwise un-escapable poverty and starvation," the observer seems like an idiot.
I hope you don't think pointing out the apparent idiocy of the observer is an insightful lesson. In short, show some examples of people here (or anywhere) making the mistake (or mistakes) you identify, or stop acting like you are so much wiser than us.
What would that even mean? By 'detrimental', do you mean something other than 'making it harder to achieve your goals'?
Detrimental means damaging, but you could definitely read it as damaging to goals.
So do you think it is ever damaging or ever harmful to have goals?
Goals can be damaging or harmful to each other, but not to themselves. And if you have no goal at all, there's nothing to be damaged or harmed.
I think goals can be damaging to themselves. For example, I think anyone who has the explicit goal of becoming the strongest they can be, effectively limits their strength by the very nature of this type of statement.
Can you clarify this? What do you think is the goal other than 'be the strongest I can be' that would result in me ending up stronger? (Also, not sure what sort of strength you are talking about here: physical? psychological?)
To me, true strength is a physical and psychological balance. I feel that anyone who has the goal of being "the strongest" (whether physically, mentally, in a game, etc.) is seeking strength out of a personal insecurity about their strength. Being insecure is a type of weakness. Therefore the goal of being the strongest will never allow them to be truly strong. Does that make sense? It is a very Daoist idea.
Do you mean someone who wants to be 'the strongest' compared to others. I don't think that's ever a good goal, because whether or not it is achievable doesn't depend on you. But 'be the strongest' is also an incredibly non-specific goal, and problematic for that reason. If you break it down, you could say "right now, my weaknesses are that a) I'm out of shape and can't jog more than 1 mile, and b) I'm insecure about it" then you could set sub-goals in both these areas, prioritize them, make plans on how to accomplish them, and evaluate afterwards whether they had been accomplished...and then make a new list of weaknesses, and a new list of goals, and a new list of plans. You're doing a lot more than just trying to be as strong as you can, but you're not specifically holding back or trying not to be as strong as you can either, which is what your comment came across as recommending.
No, not compared to others, just someone whose goal is to be the strongest. It is the fact that it is an "est"-based goal that makes it damaging to itself. I suppose if I were to take all the poetry out of the statement above, I would say that any goal involving "ests" (fastest, strongest, smartest, wealthiest, etc.) involves a degree of abstraction that signifies a lack of true understanding of the actual quality or state of being targeted, and that until said person better understands that quality/state, they will never be able to achieve said goal.
Note that all your examples take my goal and rewrite it to have incredibly practical parameters. You define reachable objectives as targets in your examples, but the point of my example was that it was a goal that lacked such empirically bounded markers.
Well, there's "be stronger than I am right now."
OK, clarify: If I follow the goal 'be the strongest I can be' I will reach a level of strength X. What other goal would allow me to surpass the level of strength X (not just my initial level)?
Again: "be stronger than I am right now."
Of course, I need to keep that goal over time, as phrased, rather than unpack it to mean "be stronger than I was back then".
Some context: when I was recovering from my stroke a few years back, one of the things I discovered was that having the goal of doing a little bit better every day than the day before was a lot more productive for me (in terms of getting me further along in a given time period) than setting some target far from my current state and moving towards it. If I could lift three pounds with my right arm this week, I would try for 3.5 next week. If I could do ten sit-ups this week, I would try for 12 next week. And so forth.
Sure, I could have instead had a goal of "do as many situps as I can", but for me, that goal resulted in my being able to do fewer situps.
I suspect people vary in this regard.
Hm.
If I have a goal G1, and then I later develop an additional goal G2, it seems likely that having G2 makes it harder for me to achieve G1 (due to having to allocate limited resources across two goals). So having G2 would be detrimental by that definition, wouldn't it?
Hm... Yeah. So, having goals other than your current goals is detrimental (to your current goals). (At least for ideal agents: akrasia etc. mean that it's not necessarily true for humans.) But I took HungryTurtle to mean ‘having any goals at all’. (Probably I was primed by this.)
Yes.
This is very interesting, but I was actually thinking about it in a different manner. I like your idea too, but this is more along the lines of what I meant:
Ultimately, I have goals for the purpose of arriving at some desired state of being. Over time, goals should change rationally to better reach desired states. However, what is viewed as a desired state of being also changes over time.
When I was 12 I wanted to be the strongest person in the world; when I was 18 I wanted to be a world-famous comedian. Both of these desired states undoubtedly have associated goals whose achievement would more readily and potently produce them. If I had adopted the most efficient methods of pursuing these dreams, I would have been making extreme commitments for the sake of something that later turned out to be a false desired state. Until one knows one's end desired state, any goal that consumes more than a certain amount of resources damages the long-term achievement of a desired state. Furthermore, I think people rarely know when to cut their losses. It could be that after investing X amount into desired state Y, the individual is unwilling to abandon that belief, even if in reality it is no longer their desired state. People get into relationships and are too afraid of having wasted all that time and those resources to get out. I don't know if I am being clear, but the train of my logic is roughly:
Throughout the progression of time what a person finds to be a desired state changes. (Perhaps the change is more drastic in some than others, but I believe this change is normal. Just as through trial and error you refine your methods of goal achievement, through the trials and errors of life you reshape your beliefs and desires. )
If desired states of being are dynamic, then it is not wise to commit to overly extreme goals or methods for the sake of my current desired state of being. (There needs to be some anticipation of the likelihood that my current desired state might not agree with my final/actual desired state of being.)
(nods)
I certainly agree that the goals people can articulate (e.g., "become a world-famous comedian" or "make a trillion dollars" or whatever) are rarely stable over time, and are rarely satisfying once achieved, such that making non-reversible choices (including, as you say, the consumption of resources) to achieve those goals may be something we regret later.
That said, it's not clear that we have alternatives we're guaranteed not to regret.
Incidentally, it's conventional on LW to talk about this dichotomy in terms of "instrumental" and "terminal" goals, with the understanding that terminal goals are stable and worth optimizing for but mostly we just don't know what they are. That said, I'm not a fan of that convention myself, except in the most metaphorical of senses, as I see no reason for believing terminal goals exist at all.
But do you believe that most people pretty predictably experience shifts in goal orientation over a lifetime?
I'd have to know more clearly what you mean by "goal orientation" to answer that.
I certainly believe that most (actually, all) people, if asked to articulate their goals at various times during their lives, would articulate different goals at different times. And I'm pretty confident that most (and quite likely all, excepting perhaps those who die very young) people express different implicit goals through their choices at different times during their lives.
Are either of those equivalent to "shifts in goal orientation"?
Yes
Then yes, I believe that most people pretty predictably experience shifts in goal orientation over a lifetime.