HungryTurtle comments on Rationally Irrational - Less Wrong

-11 Post author: HungryTurtle 07 March 2012 07:21PM


Comment author: aliciaparr 08 March 2012 12:27:09PM 0 points [-]

I find it interesting, even telling, that nobody has yet challenged the assumptions behind the proposition "Rationality is a tool for accuracy," namely that "rationality is the best tool for accuracy" and/or that "rationality is the sole tool that can be used to achieve accuracy."

Comment author: HungryTurtle 06 April 2012 03:03:47PM 0 points [-]

Aliciaparr,

This is in a sense the point of my essay! I define rationality as a tool for accuracy because I believed that to be a commonly held position on this blog (perhaps I was wrong). But if you look at the overall point of my essay, it is to suggest that there are times when what is desired is achieved without rationality, which in turn suggests alternative tools for accuracy. As to the idea of a "best tool," as I outline in my opening, I do not think such a thing exists: a best tool implies a universal tool for some task. I think that there are many tools for accuracy, just as there are many tools for cooking. In my opinion, it all depends on what ingredients you are faced with and what you want to make out of them.

Comment author: DSimon 06 April 2012 08:20:32PM 1 point [-]

Maybe think about it this way: what we mean by "rationality" isn't a single tool, it's a way of choosing tools.

Comment author: HungryTurtle 11 April 2012 03:04:55PM 0 points [-]

That is just pushing it back one level of meta-analysis. The way of choosing tools is still a tool. It is a tool for choosing tools.

Comment author: DSimon 12 April 2012 04:42:08AM 0 points [-]

I agree, and the thing about taking your selection process meta is that you have to stop at some point. If you have more than one tool for choosing tools, how do you choose which one to pick for a given situation? You'd need a tool that chooses tools that choose tools! Sooner or later you have to have a single top-level tool or algorithm that actually kicks things into motion.
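The regress described here can be sketched in code (a toy illustration with hypothetical names, not anyone's actual proposal): however many layers of tool-choosers you stack, some fixed top-level procedure must be the thing that actually runs.

```python
# Toy illustration: a fixed top-level chooser ends the
# "tool that chooses tools" regress. All names are hypothetical.

def hammer(task):
    return f"hammered {task}"

def screwdriver(task):
    return f"screwed {task}"

def choose_tool(task):
    """A tool for choosing tools: itself just another function."""
    return hammer if "nail" in task else screwdriver

def top_level(task):
    """The single procedure that kicks things into motion.
    However many chooser layers exist, something fixed must call them."""
    tool = choose_tool(task)
    return tool(task)

print(top_level("nail the board"))   # hammered nail the board
print(top_level("fasten the hinge"))
```

Adding a chooser-of-choosers just means `top_level` calls that instead; the fixed entry point remains.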

Comment author: HungryTurtle 12 April 2012 12:51:12PM 1 point [-]

This is where we disagree. To make rationality the only tool for choosing tools is to assume that all meaningful action is derived from intentional transformation. I disagree with this idea, and I think modern psychology disagrees as well. It is not only possible but at times essential to have meaningful action that is not intentionally driven. If you accept this statement as fact, then it implies the need for a secondary system of tool choosing; more specifically, a type of emergency-brake system. You have rationality as the choosing system, and then a secondary system that shuts it down when it is necessary to halt further production of intentionality.

Comment author: DSimon 12 April 2012 08:05:06PM *  1 point [-]

[I]t is at times essential to have meaningful action that is not intentionally driven.

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly. For one thing, the cerebral approach is way too slow for circumstances that require immediate reactions. There is also an aesthetic component to consider; I kind of enjoy being surprised and shocked from time to time.

Looking at a situation from the outside, how do you determine whether intentional or automatic action is best? From another angle, if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

These evaluations themselves are part of yet another tool.

Comment author: HungryTurtle 12 April 2012 09:04:03PM 0 points [-]

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly.

Yes, exactly.

if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

I think both intentional and unintentional action are required at different times. I have tried to devise a method of regulation, but as of now, the best I have come up with is moderating against extremes on either end. So if it seems like I have been overly intentional in recent days or weeks, I try to rely more on instinct and intuition. It is rarely the case that I am relying too heavily on the latter ^_^

Comment author: DSimon 13 April 2012 01:56:35AM 1 point [-]

So if it seems like I have been overly intentional in recent days, weeks, etc, I try to rely more on instinct and intuition.

Right, this is a good idea! You might also consider an approach that decides which situations best call for intuition and which require intentional thought, rather than aiming only to keep their balance even (though the latter approximates the former to the degree that those situations pop up with equal frequency).

Overall, what I've been getting at is this: Value systems in general have this property that you have to look at a bunch of different possible outcomes and decide which ones are the best, which ones you want to aim for. For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values". This is true even though the human brain actually physically implements a person's values as multiple modules operating at the same time rather than a single central dispatch.
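The point that multiple value modules can still be re-described as one function can be made concrete with a toy sketch (hypothetical modules and made-up scores, purely for illustration, not a claim about how brains work):

```python
# Toy sketch: several "value modules" scored in parallel can always
# be re-described as a single aggregate utility function over outcomes.

def comfort(outcome):
    return outcome.get("comfort", 0)

def novelty(outcome):
    return outcome.get("novelty", 0)

def safety(outcome):
    return -outcome.get("risk", 0)

MODULES = [comfort, novelty, safety]

def utility(outcome):
    """One function summarizing all modules: the single top-level
    description, even though it is built from many parallel parts."""
    return sum(module(outcome) for module in MODULES)

outcomes = [
    {"comfort": 3, "novelty": 1, "risk": 0},  # scores 3 + 1 - 0 = 4
    {"comfort": 1, "novelty": 5, "risk": 3},  # scores 1 + 5 - 3 = 3
]
best = max(outcomes, key=utility)
```

Nothing about the modules changes; the single function is just a way of describing how their outputs get combined when an outcome must be ranked.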

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Comment author: HungryTurtle 13 April 2012 12:26:39PM 0 points [-]

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Yeah, this is exactly what I am arguing.

For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values".

Could you explain the technical reasons more, or point me to some essays where I could read about this? I am still not convinced why it is more beneficial to have a single operating system.

Comment author: Arran_Stirton 07 April 2012 05:44:46AM 0 points [-]

If you're going to use the word rationality, use its definition as given here. Defining rationality as accuracy just leads to confusion and ultimately bad karma.

As for a universal tool for some task (i.e., updating your beliefs)? Well, you really should take a look at Bayes' theorem before you claim that no such thing exists.
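For a concrete instance of the belief-updating rule being pointed to, Bayes' theorem, P(H|E) = P(E|H)·P(H) / P(E), can be computed directly (the probabilities below are made up purely for illustration):

```python
# Bayes' theorem on made-up numbers: updating a belief on new evidence.
# P(H|E) = P(E|H) * P(H) / P(E), with P(E) from the law of total probability.

p_h = 0.01             # prior: probability the hypothesis is true
p_e_given_h = 0.9      # likelihood of the evidence if it is true
p_e_given_not_h = 0.1  # likelihood of the evidence if it is false

# Marginal probability of seeing the evidence at all.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: belief in the hypothesis after seeing the evidence.
p_h_given_e = p_e_given_h * p_h / p_e

print(round(p_h_given_e, 4))  # 0.0833
```

Even strong evidence (a 9:1 likelihood ratio) moves a 1% prior only to about 8%, which is the sense in which the theorem is a general-purpose rule for weighing evidence.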

Comment author: HungryTurtle 11 April 2012 03:04:11PM 0 points [-]

I am willing to look at your definition of rationality, but don't you see how it is problematic to attempt to prescribe one static definition to a word?

As for a universal tool for some task? (i.e. updating on your belief) Well you really should take a look at Bayes' theorem before you claim that there is no such thing.

OK, so you do believe that Bayes' theorem is a universal tool?