wedrifid comments on Rationally Irrational - Less Wrong

-11 Post author: HungryTurtle 07 March 2012 07:21PM


Comment author: wedrifid 09 March 2012 03:46:53AM 2 points [-]

I hope it means "If you want to criticize this relationship you must focus your criticism on the definition that establishes it".

Comment author: faul_sname 09 March 2012 07:14:26AM 0 points [-]

Yes, but consider that social winning is often closely entangled with goal winning, and that the goal sometimes *is* social winning. To paraphrase a fairly important post: you only argue that something is true by definition when it's not true any other way.

Comment author: Nectanebo 09 March 2012 02:20:14PM *  1 point [-]

I agree with you that that particular sentence could have been phrased better.

But nyan_sandwich pointed out the key point: turtle was arguing from a specific definition of rationality that does not mean the same thing LW means when it talks about rationality. So when she said the words "by definition" in this case, she was trying to make clear that arguing about it would be arguing about the definition of the word, not about anything genuinely substantial.

Therefore it seems very unlikely that sandwich was falling into the common problem the article you linked to is referring to: saying that (a thing) is (another thing) by definition when the definition of the thing does not actually call for such a statement to be the case at all.

Yes, the wording made it seem like she may have been falling into that trap. However, I perceived that what she was actually doing was trying to inform hungry turtle that he was talking about a fairly different concept from what LW talks about, even though we used the same word (a phenomenon that is explained well in that sequence).

Comment author: HungryTurtle 06 April 2012 03:00:00PM 1 point [-]

Nectanebo,

Perhaps you can explain to me how the LW definition differs from the one I provide, because I pulled my definition from this site's terminology specifically to avoid this issue. I am willing to accept that there is a problem in my wording of this definition, but I respectfully hold the position that we are talking about the same rationality.

In my opinion, the problem is not with my concept of rationality, but with the fact that I am attacking, even if only mildly, an idea held in the highest regard by this community. It is the dissonance of my idea that leads nyan_sandwich to see fault with it, not the idea itself. I hope we can talk this out and see what happens.

Comment author: Nectanebo 07 April 2012 04:13:02AM *  2 points [-]

I can think of two situations where increased accuracy is detrimental: 1.) In maintaining moderation; 2.) In maintaining respectful social relations.

Increased accuracy is not rationality.

Think about it this way: if you want increased accuracy, then rationality is the best way to increase accuracy. If you want to maintain social relations, then the rational choice is the choice that optimally maintains social relations.

I think LessWrong considers rationality to be the art of finding the best way of achieving your goals, whatever they may be. So if you think that being rational is not necessarily the best option in some cases, we are no longer talking about the same concept: the rationality you are attacking in this way is not the rationality that people on LessWrong refer to.

For example, it is silly for people to attempt to increase accuracy to the detriment of their social relationships. That is irrational if you want to maintain those relationships, based on how LessWrong tends to use the word.

The points I make have been covered fairly well by many others who have replied in this thread. If you want to know more about what we may have been trying to say, that sequence about words also covers it in detail. I personally found that particular sequence to be one of the best and most useful, and it is especially relevant to the discussion at hand.

Comment author: adamisom 21 April 2012 01:32:07AM 0 points [-]

Anything can be included in rationality after you realize it needs to be.

Or: you can always define your utility function to include everything relevant, but in real-life estimations of utility, some things just don't occur to us (at least until later). So sure, increased accuracy [to social detriment] is not rationality, once you realize it. But you need to realize it. I think HungryTurtle is helping us realize it.

So I think the real question is: is your current model of rationality, the way you think about it right now and actually (hopefully) use it, suboptimal?

Comment author: HungryTurtle 11 April 2012 02:59:20PM 0 points [-]

I think LessWrong considers rationality as the art of finding the best way of achieving your goals, whatever they may be.

Do you ever think it is detrimental having goals?

Comment author: Nectanebo 12 April 2012 12:15:10PM 0 points [-]

Sure, some goals may be detrimental to various things.

But surely people have the goal of not wanting detrimental goals, if the detriment is to things they care about.

Comment author: HungryTurtle 12 April 2012 12:36:02PM 0 points [-]

Yes! So this idea is the core of my essay.

I suggest that the individual who has the goal of not wanting detrimental goals acknowledges the following:

1.) Goal-orientations (meaning the desired state of being that drives one's goals at a particular time) are dynamic.

2.) Applying genuinely rational methodology to a goal-orientation consumes a huge amount of the individual's or group's resources.

If the individual has the goal of not having detrimental goals, accepts that goal-orientations are dynamic, and accepts that genuinely rational methodology consumes a huge amount of resources, then that individual would rationally desire a system for regulating when to implement rational methodology and when to abandon it, given the potential triviality of immediate goals.

Because the individual is choosing to abandon rationality in the short-term, I label this as being rationally irrational.
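
The regulating system described above can be sketched as a simple cost-benefit check. This is only a toy model of the idea, not anything from the original post; the function name, the stakes numbers, and the assumed improvement factor are all invented for illustration:

```python
def should_deliberate(stakes, deliberation_cost, expected_improvement=0.2):
    """Deliberate carefully only when the expected gain from deliberating
    (stakes scaled by an assumed improvement factor) exceeds the
    resources that deliberation would consume."""
    return stakes * expected_improvement > deliberation_cost

# Trivial goal (e.g. a casual soccer game): skip costly deliberation.
print(should_deliberate(stakes=1.0, deliberation_cost=5.0))     # False
# High-stakes goal (e.g. a career choice): deliberation pays for itself.
print(should_deliberate(stakes=1000.0, deliberation_cost=5.0))  # True
```

On this toy reading, "rationally irrational" is just the branch where the check returns False: the agent rationally decides not to spend resources being careful.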

Comment author: TimS 12 April 2012 02:36:15PM 0 points [-]

Let's play this out with an example.

Imagine I have a goal of running a marathon. To that end, I run every day to increase my endurance. One day, I trip and fall, twisting my ankle. My doctor tells me that if I run on the ankle, I will cause myself permanent injury. Using my powers of rationality, I decide to stop running until my ankle has healed, to avoid a permanent injury that would prevent me from achieving my goal of running a marathon.

Is my decision to stop training for the marathon, which inevitably moves my goal of running in a marathon further away, "rationally irrational"? Or is there something wrong with my example?

Comment author: HungryTurtle 12 April 2012 04:26:56PM *  0 points [-]

No, your example is fine, but I would say it is the most elementary use of this idea. When faced with a serious threat to health it is relatively easy and obvious to realign goal-orientation. It is harder to make such realignments prior to facing serious damage or threats. In your example, a more sophisticated application of this idea would theoretically remove the possibility of twisting an ankle during training, excluding any extreme circumstances.

I imagine this might raise a lot of questions so let me explain a little more.

Training is not serious. The purpose of training is to prepare for a race, but that purpose is subsumed under the larger purpose of personal health, happiness, and survival. Therefore, any training one does should always be taken within the context of being trivial in light of these overarching goals. With this mindset, I do not see how a runner could sprain their ankle, barring extreme circumstances.

A real runner, taking these overarching values into account, would:

  • Prior to running, build knowledge about safe running styles and practices.
  • During running, be primarily concerned with safety and developing positive running habits rather than meeting some short-term goal.

To me, someone who has integrated my idea would never prioritize a race to the point that they risk spraining their ankle in training. Of course there are bizarre situations that are hard or impossible to plan for, but tripping and twisting your ankle does not seem to be one of them.

Comment author: Nectanebo 12 April 2012 02:23:19PM 0 points [-]

That kinda falls apart, because it's not irrational if it's rational not to spend too many of your resources on "rational methodology". I guess "rationally irrational" is just a bad label, because you're not abandoning rationality; you're doing the rational thing by choosing not to use too many of your resources when it's better not to. So at no point are you doing anything that could be considered irrational.

Comment author: HungryTurtle 12 April 2012 03:29:33PM 0 points [-]

Let’s say I am playing soccer. I have decided that any goal-orientation within my soccer game is ultimately not worth expending resources beyond X amount. Because of this, I have tuned out my rational calculation of how best to achieve a social, personal, or game-related victory. To anyone who has not appraised soccer-related goal-orientations in this way, my actions within the game would appear irrational. Do you see how this could be considered irrational?

I definitely understand how this idea can also be read as still rational; that is why I called it 'rationally irrational,' implying the actor never truly abandons rationality. The reason I chose to word it this way, instead of finding some other way to label it as meta-rationality, is rhetorical. This community targets a relatively small demographic of thinkers: individuals who have both the capacity and the work history to achieve upper levels of rationality. Perhaps this demographic is the majority within this blog, but I thought it highly possible that there existed Less Wrong members who were not quite at that level, and that the idea would be more symbolically appealing if it suggested an element of necessary irrationality within the rationalist's paradigm. Maybe this was a poor choice, but it was what I chose to do.

Comment author: [deleted] 11 April 2012 03:29:15PM 0 points [-]

Do you ever think it is detrimental having goals?

What would that even mean? By 'detrimental', do you mean something other than 'making it harder to achieve your goals'?

Comment author: HungryTurtle 11 April 2012 05:34:04PM 0 points [-]

Detrimental means damaging, but you could definitely read it as damaging to goals.

So do you think it is ever damaging or ever harmful to have goals?

Comment author: [deleted] 11 April 2012 05:45:05PM 0 points [-]

Goals can be damaging or harmful to each other, but not to themselves. And if you have no goal at all, there's nothing to be damaged or harmed.

Comment author: HungryTurtle 11 April 2012 07:51:42PM 0 points [-]

I think goals can be damaging to themselves. For example, I think anyone who has the explicit goal of becoming the strongest they can be, effectively limits their strength by the very nature of this type of statement.

Comment author: TheOtherDave 11 April 2012 03:44:15PM 0 points [-]

Hm.
If I have a goal G1, and then I later develop an additional goal G2, it seems likely that having G2 makes it harder for me to achieve G1 (due to having to allocate limited resources across two goals). So having G2 would be detrimental by that definition, wouldn't it?
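
TheOtherDave's point can be illustrated with a toy model. The assumption (mine, not his) is a fixed resource budget split equally across goals, with progress on each goal proportional to its share:

```python
# Toy model: a fixed resource budget shared equally across active goals.
# Progress on any one goal is proportional to the resources it receives.
def progress_per_goal(budget, num_goals):
    return budget / num_goals

budget = 10.0
g1_alone = progress_per_goal(budget, 1)    # all resources go to G1
g1_with_g2 = progress_per_goal(budget, 2)  # budget now shared with G2

# Adding G2 slows progress on G1, so G2 is "detrimental" to G1
# under the definition being discussed.
print(g1_alone > g1_with_g2)  # True
```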

Comment author: [deleted] 11 April 2012 04:43:24PM *  1 point [-]

Hm... Yeah. So, having goals other than your current goals is detrimental (to your current goals). (At least for ideal agents: akrasia etc. mean that it's not necessarily true for humans.) But I took HungryTurtle to mean ‘having any goals at all’. (Probably I was primed by this.)

Comment author: HungryTurtle 11 April 2012 05:54:47PM 0 points [-]

Yes.

This is very interesting, but I was actually thinking about it in a different manner. I like your idea too, but this is more along the lines of what I meant:

Ultimately, I have goals for the purpose of arriving at some desired state of being. Over time, goals should change rationally to better reach desired states. However, what is viewed as a desired state of being also changes over time.

When I was 12 I wanted to be the strongest person in the world; when I was 18 I wanted to be a world-famous comedian. Both of these desired states undoubtedly have goals whose achievement would more readily and potently produce them. If I had adopted the most efficient methods of pursuing these dreams, I would have been making extreme commitments for the sake of something that later turned out to be a false desired state. Until one knows one's end desired state, any goal that consumes more than a certain amount of resources is damaging to the long-term achievement of a desired state. Furthermore, I think people rarely know when to cut their losses. It could be that after investing X amount into desired state Y, the individual is unwilling to abandon the pursuit, even if in reality it is no longer their desired state. People get into relationships and are too afraid of having wasted all that time and those resources to get out. I don’t know if I am being clear, but the train of my logic is roughly:

  1. Throughout the progression of time, what a person finds to be a desired state changes. (Perhaps the change is more drastic in some than in others, but I believe this change is normal. Just as through trial and error you refine your methods of goal achievement, through the trials and errors of life you reshape your beliefs and desires.)

  2. If desired states of being are dynamic, then it is not wise to commit to overly extreme goals or methods for the sake of my current desired state of being. (There needs to be some anticipation of the likelihood that my current desired state might not agree with my final/actual desired state of being.)
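
The hedging argument in (2) can be expressed as a toy expected-value calculation. All the numbers here are invented for illustration: the probability that a current desired state persists, and the payoffs of an extreme, hard-to-reverse commitment versus a moderate, reversible one:

```python
# Assumed chance that my current desired state still holds later.
p_stable = 0.5

# Extreme commitment: big payoff if the desired state persists,
# a large sunk loss if it turns out to be a "false" desired state.
extreme = p_stable * 100 + (1 - p_stable) * (-80)

# Moderate commitment: smaller payoff, but little is wasted
# if my desired state changes.
moderate = p_stable * 60 + (1 - p_stable) * 10

print(extreme, moderate)  # 10.0 35.0 — moderation wins in expectation
```

Under these invented numbers, the moderate commitment has the higher expected value, which matches the claim that extreme goals are unwise while one's end desired state is still uncertain.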

Comment author: TheOtherDave 11 April 2012 06:14:56PM 0 points [-]

(nods)

I certainly agree that the goals people can articulate (e.g., "become a world-famous comedian" or "make a trillion dollars" or whatever) are rarely stable over time, and are rarely satisfying once achieved, such that making non-reversible choices (including, as you say, the consumption of resources) to achieve those goals may be something we regret later.

That said, it's not clear that we have alternatives we're guaranteed not to regret.

Incidentally, it's conventional on LW to talk about this dichotomy in terms of "instrumental" and "terminal" goals, with the understanding that terminal goals are stable and worth optimizing for but mostly we just don't know what they are. That said, I'm not a fan of that convention myself, except in the most metaphorical of senses, as I see no reason for believing terminal goals exist at all.