Comment author: TheOtherDave 13 April 2012 01:57:27PM 2 points [-]

I think you're welcome to have whatever goals you like, and so are the soccer players. But don't be surprised if the soccer players, acknowledging that your goal does not in fact seem to be at all relevant to anything they care about, subsequently allocate their resources to things they care about more and treat you as a distraction rather than as a contributor to their soccer-playing community.

Comment author: HungryTurtle 19 April 2012 12:29:39PM -1 points [-]

What would you say if I said that caring about my goals in addition to their own would make them better soccer players?

Comment author: DSimon 16 April 2012 03:57:53AM *  0 points [-]

Regarding why it's possible, I'll just echo what TheOtherDave said.

The reason it's helpful to try for a single top-level utility function is that otherwise, whenever there's a conflict among the many many things we value, we'd have no good way to consistently resolve it. If one aspect of your mind wants excitement, and another wants security, what should you do when you have to choose between the two?

Is quitting your job a good idea or not? Is going rock climbing this weekend instead of staying at home reading a good idea or not? Different parts of your mind will have different opinions on these subjects. Without a final arbiter to weigh their suggestions and consider how important excitement and security are relative to each other, how do you decide in a non-arbitrary way?
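The "final arbiter" idea can be sketched in a few lines of code. This is a toy illustration only: the value modules, the options, and the weights are all invented for the example, not anything from decision theory proper.

```python
# Toy sketch of a single top-level utility function arbitrating
# between two conflicting value "modules". All names, scores, and
# weights below are hypothetical.

def excitement(option):
    # How exciting each option is, on a 0-1 scale (made-up numbers).
    return {"quit_job": 0.9, "keep_job": 0.2}[option]

def security(option):
    # How secure each option feels, on the same scale.
    return {"quit_job": 0.1, "keep_job": 0.8}[option]

# How much the top-level function cares about each module.
WEIGHTS = {excitement: 0.4, security: 0.6}

def utility(option):
    # The single arbiter: a weighted sum of the modules' scores.
    return sum(w * module(option) for module, w in WEIGHTS.items())

best = max(["quit_job", "keep_job"], key=utility)
print(best)  # with these weights, security wins out: "keep_job"
```

The point of the sketch is that once the weights are fixed, every conflict between the modules gets resolved the same way, so the choice is consistent rather than arbitrary; changing the weights changes the answer, but not the procedure.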

So I guess it comes down to: how important is it to you that your values are self-consistent?

More discussion (and a lot of controversy on whether the whole notion actually is a good idea) here.

Comment author: HungryTurtle 18 April 2012 12:44:35PM 0 points [-]

Thanks for the link. I'll respond back when I get a chance to read it.

Comment author: Nectanebo 13 April 2012 05:08:06AM 1 point [-]

Maybe this was a poor choice, but it was what I chose to do.

Good, now that you've realised that, perhaps you might want to abandon that name.

The idea of using your time and various other resources carefully and efficiently is a good virtue of rationality. Framing it as being irrational is inaccurate and kinda incendiary.

Comment author: HungryTurtle 13 April 2012 12:57:42PM -1 points [-]

The idea of using your time and various other resources carefully and efficiently is a good virtue of rationality. Framing it as being irrational is inaccurate and kinda incendiary.

Here is my reasoning for choosing this title. If you don't mind, could you read it and tell me where you think I am mistaken?

I realize that saying 'rationally irrational' appears to be a contradiction. However, the idea concerns the use of rational methodology at two different levels of analysis. Rationality at the level of goal prioritization potentially results in the adoption of an irrational methodology at the level of goal achievement.

L1 - Goal Prioritization
L2 - Goal Achievement

L1 rationality can result in a limitation of L2 rationality within low-priority goal contexts. Let’s say that someone was watching me play a game of soccer (since I have been using the soccer analogy). As they watched, they might critique the fact that my strategy was poorly chosen, and that the overall effort exerted by me and my teammates was lackluster. To this observer, who considers themselves a soccer expert, it would be clear that my and my team’s performance was subpar. The observer took notes on all our flaws and inefficient habits, then after the game wrote them all up to present to us. Upon delivering all these insightful critiques, the observer is shocked to hear that I am grateful for his effort, but am not going to change how I or my team plays soccer. He tries to convince me that I am playing wrong, that we will never win the way I am playing. And he is correct. To any knowledgeable observer I was poorly, even irrationally, playing the game of soccer. But without knowledge of L1 (which is not observable), the execution of L2 (which is observable) cannot be deemed rational or irrational, and in my opinion, will appear irrational in many situations.

Would you say that to you it appears irrational that I have chosen to label this idea ‘rationally irrational’? If that is correct, I would suggest that I have some L1 that you are unaware of, and that while my labeling is irrational with regard to L2 (receiving high karma points / recognition in publishing my essay on your blog), I have de-prioritized this L2 for the sake of my L1. What do you think?

Comment author: DSimon 13 April 2012 01:56:35AM 1 point [-]

So if it seems like I have been overly intentional in recent days, weeks, etc, I try to rely more on instinct and intuition.

Right, this is a good idea! You might want to consider an approach based on deciding which situations best call for intuition and which call for intentional thought, rather than aiming only to keep their balance even (though the latter does approximate the former to the degree that these situations pop up with equal frequency).

Overall, what I've been getting at is this: Value systems in general have this property that you have to look at a bunch of different possible outcomes and decide which ones are the best, which ones you want to aim for. For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values". This is true even though the human brain actually physically implements a person's values as multiple modules operating at the same time rather than a single central dispatch.

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Comment author: HungryTurtle 13 April 2012 12:26:39PM 0 points [-]

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Yeah, this is exactly what I am arguing.

For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values".

Could you explain the technical reasons in more depth, or point me to some essays where I could read about this? I am still not convinced why it is more beneficial to have a single operating system.

Rationally Irrational

-11 HungryTurtle 07 March 2012 07:21PM

I understand rationality to be related to a set of cognitive tools rather than a certain personality or genetic type. Like any other tool it can be misused. You can kill a person with a spoon, but that is a misuse of its intended function. You can cut a pound of raw meat with a chainsaw, but that is a misuse of its intended function. Tools are designed with both intended purposes and functional limitations. Intended purposes serve to provide the user with an understanding of how to achieve optimal impact. For example, some intended uses of a sword would be killing, disabling, acting, or training (and many more). Tools can be used outside of their intended purposes. The use might not result in optimal output, and it might even damage the tool, but it is possible. A sword can be used to cut wood, to clear shrubbery, as a decoration; a sword could even be used as a doorstop. A doorstop has long departed from the function a sword was designed for, but it nevertheless exists as a possibility given the structure of a sword. Functional limitations are desired uses that a tool cannot meet given its structure. A sword alone cannot allow you to fly or breathe underwater, at least not without significant alterations to its structure, rendering it no longer a sword.

Every tool exists with both intended functions and functional limitations. From reading some essays on this website, I get the impression that many members of this community view rationality as a universal tool: that no matter what the conflict, a certain degree of rationality would provide the appropriate remedy. I would like to question this idea. I think there are both functional limitations to rationality and ways to misuse one's powers of reasoning. To address these, it is first necessary to identify what the primary function of rationality is.

The Function of Rationality

From reading various articles on this website, I would suggest that rationality is seen as a tool for accuracy in obtaining desired results, or as Eliezer puts it, for “winning.” I agree with this analysis. Rationality is a tool for accuracy; increased accuracy leads to successful obtainment of some desired result; obtainment of some desired result can broadly be described as “winning.” If rationality is a tool for increasing accuracy, then the question becomes “are there ever times when it is more beneficial to be inaccurate,” or in other words, are there times when it is desirable to lose?

Why would a person ever want to lose?

I can think of two situations where increased accuracy is detrimental: 1.) in maintaining moderation; 2.) in maintaining respectful social relations.

1.) *It is better to err on the side of caution*: The more accurate you become, the faster you obtain your goals. The faster you obtain your goals, the quicker you progress down a projected course. In some sense this is a good thing, but I do not think it is universally good. **The pleasure of winning may deter the player from the fundamental question “Is this a game I should be playing?”** A person who grew up playing the violin from an early age could easily find themselves barreling along a trajectory that leads them to a conservatory without addressing the fundamental question “is becoming a violinist what is going to most benefit my life?” It is easy to do something you are good at, but it is fallacious to think that just because you are good at something, it is what you should be doing. If Wile E. Coyote has taught us anything, it is that progressing along a course too fast can result in unexpected pitfalls. Our confidence in an idea, a job, a projected course, has no real bearing on its ultimate benefit to us (see my comment here for more on how being wrong feels right). While we might not literally run three meters off a cliff and then fall into the horizon, is it not possible for things to be moving too fast?

2.) *“Winning” all the time causes other people narrative dissonance*: People don’t like it when someone is right about everything. It is suffocating. Why is that? I am sure that a community of dedicated rationalists will have experienced this phenomenon, where relationships with family, friends, and other personal networks are threatened or damaged by you having an answer for everything, every casual debate, every trivial discussion; where your being extremely good at “winning” has had a negative effect on those close to you. I have a theory for why this is; it is rather extensive, but I will try to abridge it as much as possible. First, it is based in the sociological field of symbolic interactionism, in which individuals are constantly working to achieve some role confirmation in social situations. My idea is that there are archetypes of desired roles, and that every person needs the psychological satisfaction of being cast into those roles some of the time. I call these roles “persons of interest.” The wise one, the smart one, the caring one, the cool one, the funny one: these are all roles of interest that I believe all people need the chance to act out. If in a relationship you monopolize one of these roles to the point that your relations are unable to take it on, then I believe you are hurting your relationship. If you win too much, you deprive those close to you of the chance of winning, effectively causing them anxiety.

For example, I know that when I was younger my extreme rationality placed a huge burden on my relationship with my parents. After going to college I began to have a critique of almost everything they did. I saw a more efficient, more productive way of doing things than my parents, who had received outdated educations. For a while I was so mad that they did not trust me enough to change their lives, especially when I knew I was right. Eventually, what I realized was that it is psychologically damaging for parents when their twenty-something kid feels it is their job to show them how to live. Some of the things (like eating healthier and exercising more) I did not let go, because I felt the damages of my role reversal were less than the damages of their habits; however, other ideas, arguments, and beliefs I did let go, because they did not seem worth the pain I was causing my parents. I have experienced the need to not win as much in many other relationships. Whether with friends, teachers, lovers, peers, or colleagues, in general if one person monopolizes the social role of imparter of knowledge, it can be psychologically damaging to those they interact with. I believe positive coexistence is more important than achieving some desired impact (winning). Therefore I think it is important to ease up on one’s accuracy for the sake of one’s relationships.

- Honestly, I have more limitations and some misuses I want to address, but I decided to hold off and see what the initial reception of my essay was. I realize this is a rationalist community, and I am not trying to pick a fight. I just strongly believe in moderation and wanted to share my idea. Please don't hate me too much for that.

- HungryTurtle
