nyan_sandwich comments on Rationally Irrational - Less Wrong

-11 Post author: HungryTurtle 07 March 2012 07:21PM


Comment author: [deleted] 08 March 2012 05:32:50PM *  4 points [-]

You've confused goal-winning (LW sense) with social-winning.

Rationality is the optimal tool for goal-winning, which is always what is desirable. This relation is established by definition, so don't bother criticizing it.

You can show that our current understanding of rationality or winning does not live up to the definition, but that is not a criticism of the definition. Usually when people debate the above definition, they are taking it to be an empirical claim about spock or some specific goal, which is not how we mean it.

EDIT: Also, "air on the side". It's "err" as in "error". Read Orwell's "Politics and the English Language".

Comment author: faul_sname 08 March 2012 11:15:36PM 4 points [-]

This relation is established by definition, so don't bother criticizing it.

This phrase worries me.

Comment author: wedrifid 09 March 2012 03:46:53AM 2 points [-]

I hope it means "If you want to criticize this relationship you must focus your criticism on the definition that establishes it".

Comment author: faul_sname 09 March 2012 07:14:26AM 0 points [-]

Yes, though social winning is quite often entangled quite closely with goal-winning, and sometimes the goal is social winning. To paraphrase a fairly important post, you only argue a point by definition when it's not true any other way.

Comment author: Nectanebo 09 March 2012 02:20:14PM *  1 point [-]

I agree with you that that particular sentence could have been phrased better.

But nyan_sandwich pointed out the key point: turtle was arguing from a specific definition of rationality that does not mean the same thing LW means when it talks about rationality. So when she said "by definition" in this case, she was trying to make clear that arguing about it would be arguing about the definition of the word, and not anything genuinely substantial.

Therefore it seems very unlikely that sandwich was falling into the common problem the article you linked to is referring to: saying that (a thing) is (another thing) by definition when the definition of the thing does not actually call for such a statement to be the case at all.

Yes, the wording made it seem like she may have been falling into that trap; however, I perceived that what she was actually doing was trying to inform HungryTurtle that he was talking about a fairly different concept from the one LW talks about, even though we use the same word (a phenomenon that is explained well in that sequence).

Comment author: HungryTurtle 06 April 2012 03:00:00PM 1 point [-]

Nectanebo,

Perhaps you can explain to me how the LW definition differs from the one I provide, because I pulled my definition from this site's terminology specifically to avoid this issue. I am willing to accept that there is a problem in my wording of this definition, but I respectfully hold the position that we are talking about the same rationality.

In my opinion, the problem is not with my concept of rationality, but that I am attacking, even if it is a mild attack, an idea that is held in the highest regard among this community. It is the dissonance of my idea that leads nyan_sandwich to see fault with it, not the idea itself. I hope we can talk this out and see what happens.

Comment author: Nectanebo 07 April 2012 04:13:02AM *  2 points [-]

I can think of two situations where increased accuracy is detrimental: 1.) In maintaining moderation; 2.) In maintaining respectful social relations.

increased accuracy is not rationality

Think about it this way: if you want increased accuracy, then rationality is the best way to increase accuracy. If you want to maintain social relations, then the rational choice is the choice that optimally maintains social relations.

I think LessWrong considers rationality as the art of finding the best way of achieving your goals, whatever they may be. Therefore if you think that being rational is not necessarily the best option in some cases, we are not talking about the same concept any longer, because when you attack rationality in this way, you are not attacking the same rationality that people on LessWrong refer to.

For example, it is silly for people to try to attempt to increase accuracy to the detriment of their social relationships. This is irrational if you want to maintain your social relationships, based on how LessWrong tends to use the word.

The points I make have been covered fairly well by many others who have replied in this thread. If you want to know more about what we may have been trying to say, that sequence about words also covers it in detail; I personally found that particular sequence to be one of the best and most useful, and it is especially relevant to the discussion at hand.

Comment author: adamisom 21 April 2012 01:32:07AM 0 points [-]

Anything can be included in rationality after you realize it needs to be.

Or: you can always define your utility function to include everything relevant, but in real-life estimations of utility, some things just don't occur to us (at least until later). So sure, increased accuracy [to social detriment] is not rationality, once you realize it. But you need to realize it. I think HungryTurtle is helping us realize it.

So I think the real question is: is your current model of rationality, the way you think about it right now and actually (hopefully) use it, suboptimal?

Comment author: HungryTurtle 11 April 2012 02:59:20PM 0 points [-]

I think LessWrong considers rationality as the art of finding the best way of achieving your goals, whatever they may be.

Do you ever think it is detrimental having goals?

Comment author: Nectanebo 12 April 2012 12:15:10PM 0 points [-]

Sure, some goals may be detrimental to various things.

But surely people have the goal of not wanting detrimental goals, if the detriment is to things they care about.

Comment author: HungryTurtle 12 April 2012 12:36:02PM 0 points [-]

Yes! So this idea is the core of my essay.

I suggest that the individual who has the goal of not wanting detrimental goals acknowledges the following:

1.) Goal-orientations (meaning the desired state of being that drives one's goals at a particular time) are dynamic.

2.) The implementation of genuine rational methodology to a goal-orientation consumes a huge amount of the individual/group's resources.

If the individual has the goal of not having detrimental goals, and if they accept that goal-orientations are dynamic, and that a genuinely rational methodology consumes a huge amount of resources, then such an individual would rationally desire a system of regulating when to implement rational methodology and when to abandon rational methodology due to the potential triviality of immediate goals.

Because the individual is choosing to abandon rationality in the short-term, I label this as being rationally irrational.
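HungryTurtle's proposed regulation rule can be read as a simple expected-value test: spend resources on careful rational analysis only when the stakes justify its cost. A minimal sketch of that reading (the function name and all numbers here are hypothetical illustrations, not from the thread):

```python
def should_deliberate(stakes, accuracy_gain, deliberation_cost):
    """Return True if careful rational analysis is worth its cost.

    stakes: the value riding on the decision
    accuracy_gain: expected improvement in decision quality from deliberating
    deliberation_cost: the resources careful analysis would consume
    """
    return stakes * accuracy_gain > deliberation_cost

# High-stakes goal: expected gain (10000 * 0.2 = 2000) exceeds the cost of 50.
print(should_deliberate(10000, 0.2, 50))  # True

# Trivial immediate goal: expected gain (5 * 0.2 = 1) is below the cost,
# so abandoning careful analysis ("rational irrationality") wins here.
print(should_deliberate(5, 0.2, 50))  # False
```

On this reading, "rationally irrational" just means the meta-level test itself is rational even when it recommends skipping object-level deliberation.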

Comment author: [deleted] 11 April 2012 03:29:15PM 0 points [-]

Do you ever think it is detrimental having goals?

What would that even mean? By "detrimental", do you mean something different from "making it harder to achieve your goals"?

Comment author: HungryTurtle 11 April 2012 05:34:04PM 0 points [-]

Detrimental means damaging, but you could definitely read it as damaging to goals.

So do you think it is ever damaging or ever harmful to have goals?

Comment author: TheOtherDave 11 April 2012 03:44:15PM 0 points [-]

Hm.
If I have a goal G1, and then I later develop an additional goal G2, it seems likely that having G2 makes it harder for me to achieve G1 (due to having to allocate limited resources across two goals). So having G2 would be detrimental by that definition, wouldn't it?

Comment author: HungryTurtle 06 April 2012 02:54:16PM 0 points [-]

agreed!

Comment author: HungryTurtle 06 April 2012 02:53:23PM *  -1 points [-]

Rationality is the optimal tool for goal-winning, which is always what is desirable.

You can show that our current understanding of rationality or winning does not live up to the definition, but that is not a criticism of the definition.

With all due respect, you are missing the point I am trying to make with the "erring on the side of caution" segment. I would agree that in theory goal-winning is always desirable, but as you yourself point out, the individual's understanding of rationality or winning (goal-orientation) is flawed. You imply that as time progresses the individual will slowly but surely recognize what "true winning" is. In response to this notion, I would ask:

1.) How do you rationalize omitting the possibility that the individual will never understand what "true rationality" or "true winning" are? What evidence do you have that such knowledge is even obtainable? If there is none, then would it not be more rational to adjust one's confidence in one's goal-orientation to include the very real possibility that any immediate goal-orientation might later be revealed as damaging?

2.) Even if we assume that the individual will eventually obtain a perfect understanding of rationality and winning, how does this omit the need for caution in early-stage goal-orientation? If, given enough time, I will understand true rationality, then rationally shouldn't all my goals, up until that point is reached, be approached with caution?

My point is that while one's methodology for achieving goals can become more and more precise, there is no way to guarantee that the bearings at which we place our goals will lead us down a nourishing (and therefore rational) path; therefore, the speed at which we achieve goals (accelerated by rationality) is potentially dangerous to achieving the desired results of those goals. Does that make sense?

Comment author: Arran_Stirton 07 April 2012 05:30:52AM 2 points [-]

1.) You should read up on what it really means to have "true rationality". Here's the thing, we don't omit the possibility that the individual will never understand what "true rationality" is, in fact Bayes' Theorem shows that it's impossible to assign a probability of 1.0 to any theory of anything (never mind rationality). You can't argue with math.

2.) Yes, all of your goals should be approached with caution, just like all of your plans. We're not perfectly rational beings, that's why we try to become stronger. However, we approach things with due caution. If something is our best course of action given the amount of information we have, we should take it.

Also remember, you're allowed to plan for more than one eventuality; that's why we use probabilities and Bayes' theorem in order to work out which eventualities we should plan for.
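The point that no amount of evidence pushes a probability all the way to 1.0 can be checked directly with repeated Bayes-rule updates. A quick sketch using exact rational arithmetic, so the result is not a floating-point artifact (the specific prior and likelihoods are hypothetical):

```python
from fractions import Fraction

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """One Bayes-rule update: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start at even odds and update on 100 pieces of strongly confirming
# evidence (a 10:1 likelihood ratio each time).
p = Fraction(1, 2)
for _ in range(100):
    p = posterior(p, Fraction(9, 10), Fraction(9, 100))

print(p < 1)  # True: the posterior climbs toward 1 but never reaches it
```

As long as the prior is below 1 and the likelihood ratio is finite, each update leaves the posterior strictly below 1, which is the sense in which "you can't argue with math" above.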