DSimon comments on Rationally Irrational - Less Wrong

-11 Post author: HungryTurtle 07 March 2012 07:21PM




Comment author: DSimon 12 April 2012 04:42:08AM 0 points [-]

I agree, and the thing about taking your selection process meta is that you have to stop at some point. If you have more than one tool for choosing tools, how do you choose which one to pick for a given situation? You'd need a tool for choosing tool-choosing tools! Sooner or later you have to have a single top-level tool or algorithm that actually kicks things into motion.

Comment author: HungryTurtle 12 April 2012 12:51:12PM 1 point [-]

This is where we disagree. To have rationality be the only tool for choosing tools is to assume that all meaningful action is intentionally derived. I disagree with this idea, and I think modern psychology disagrees as well. It is not only possible, it is at times essential to have meaningful action that is not intentionally driven. If you accept this statement as fact, then it implies the need for a secondary system of tool choosing: more specifically, a type of emergency brake. You have rationality as the choosing system, and then a secondary system that shuts it down when further production of intentionality needs to be halted.

Comment author: DSimon 12 April 2012 08:05:06PM *  1 point [-]

[I]t is at times essential to have meaningful action that is not intentionally driven.

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly. For one thing, the cerebral approach is way too slow for circumstances that require immediate reactions. There is also an aesthetic component to consider; I kind of enjoy being surprised and shocked from time to time.

Looking at a situation from the outside, how do you determine whether intentional or automatic action is best? From another angle, if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

These evaluations themselves are part of yet another tool.

Comment author: HungryTurtle 12 April 2012 09:04:03PM 0 points [-]

If by "not intentionally driven" you mean things like instincts and intuitions, I agree strongly.

Yes, exactly.

if you could tweak your brain to make certain sorts of situations trigger certain automatic reactions that otherwise wouldn't, or vice versa, what (if anything) would you pick?

I think both intentional and unintentional action are required at different times. I have tried to devise a method of regulation, but as of now, the best I have come up with is moderating against extremes on either end. So if it seems like I have been overly intentional in recent days, weeks, etc., I try to rely more on instinct and intuition. It is rarely the case that I am relying too heavily on the latter ^_^

Comment author: DSimon 13 April 2012 01:56:35AM 1 point [-]

So if it seems like I have been overly intentional in recent days, weeks, etc, I try to rely more on instinct and intuition.

Right, this is a good idea! You might want to consider an approach that decides which situations best call for intuition and which for intentional thought, rather than aiming only to keep the two in balance (though the latter approximates the former to the degree that such situations arise with equal frequency).

Overall, what I've been getting at is this: Value systems in general have this property that you have to look at a bunch of different possible outcomes and decide which ones are the best, which ones you want to aim for. For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values". This is true even though the human brain actually physically implements a person's values as multiple modules operating at the same time rather than a single central dispatch.

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Comment author: HungryTurtle 13 April 2012 12:26:39PM 0 points [-]

In your article, you seemed to be saying that you specifically think that one shouldn't have a single "final decision" function at the top of the meta stack. That's not going to be an easily accepted argument around here, for the reasons I stated above.

Yeah, this is exactly what I am arguing.

For technical reasons, it is always possible (and also usually helpful) to describe this as a single function or algorithm, typically around here called one's "utility function" or "terminal values".

Could you explain the technical reasons more, or point me to some essays where I could read about this? I am still not convinced why it is more beneficial to have a single operating system.

Comment author: TheOtherDave 13 April 2012 02:05:49PM 1 point [-]

I'm no technical expert, but: if I want X, and I also want Y, and I also want Z, and I also want W, and I also want A1 through A22, it seems pretty clear to me that I can express those wants as "I want X and Y and Z and W and A1 through A22." Talking about whether I have one goal or 26 goals therefore seems like a distraction.
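The folding-many-wants-into-one move can be sketched concretely. A minimal illustration, assuming we represent each want as a scoring function over outcomes (all names and numbers here are made up for the example, not from the thread):

```python
# Sketch: several separate "wants" folded into a single aggregate
# utility function, here as a simple weighted sum of per-goal scores.

def combined_utility(outcome, goals, weights):
    """Score one outcome against many goals at once."""
    return sum(w * g(outcome) for g, w in zip(goals, weights))

# Two toy goals scoring an outcome described as a dict:
wants_rest = lambda o: o.get("rest", 0.0)
wants_income = lambda o: o.get("income", 0.0)

outcome = {"rest": 0.8, "income": 0.2}
score = combined_utility(outcome, [wants_rest, wants_income], [1.0, 1.0])
```

Whether you call this one goal or 26 is just bookkeeping: the aggregate function and the list of component goals contain the same information.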

Comment author: DSimon 16 April 2012 03:57:53AM *  0 points [-]

In regards to why it's possible, I'll just echo what TheOtherDave said.

The reason it's helpful to try for a single top-level utility function is that otherwise, whenever there's a conflict among the many, many things we value, we'd have no good way to resolve it consistently. If one aspect of your mind wants excitement, and another wants security, what should you do when you have to choose between the two?

Is quitting your job a good idea or not? Is going rock climbing instead of staying at home reading this weekend a good idea or not? Different parts of your mind will have different opinions on these subjects. Without a final arbiter to weigh their suggestions and consider how important excitement and security are relative to each other, how do you decide in a non-arbitrary way?
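The "final arbiter" idea can be made concrete with a small sketch. This is only an illustration under invented assumptions: the module scores and weights below are made-up numbers, not anything measured about a real mind:

```python
# Sketch: a single top-level arbiter resolving a conflict between
# two value modules (excitement vs. security) via fixed weights.

def arbiter(options, modules, weights):
    """Return the option with the highest weighted total score."""
    def total(option):
        return sum(w * m(option) for m, w in zip(modules, weights))
    return max(options, key=total)

# Toy modules scoring two weekend plans:
excitement = {"rock climbing": 0.9, "stay home": 0.2}.get
security = {"rock climbing": 0.3, "stay home": 0.9}.get

# An arbiter that weights excitement over security picks the climb:
choice = arbiter(["rock climbing", "stay home"],
                 [excitement, security], [0.7, 0.3])
```

The point is only that once the weights exist, every such conflict resolves the same way; without them, each clash between modules has to be settled ad hoc.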

So I guess it comes down to: how important is it to you that your values are self-consistent?

More discussion (and a lot of controversy on whether the whole notion actually is a good idea) here.

Comment author: TheOtherDave 16 April 2012 01:01:21PM 1 point [-]

Without a final arbiter to weigh their suggestions and consider how important comfort and security are relative to each other, how do you do decide in a non-arbitrary way?

Well, there's always the approach of letting all of me influence my actions and seeing what I do.

Comment author: HungryTurtle 18 April 2012 12:44:35PM 0 points [-]

Thanks for the link. I'll respond back when I get a chance to read it.