Nick_Tarleton comments on I'm Not Saying People Are Stupid - Less Wrong

38 Post author: Eliezer_Yudkowsky 09 October 2009 04:23PM




Comment author: Yasser_Elassal 09 October 2009 10:24:37PM 4 points

An emotion that doesn't correlate with reality is itself a bug. Sure, it may not be easy to fix (or even possible without brain-hacking), but it's a bug in the human source code nonetheless.

To extend the analogy, it's like a bug in the operating system. If that low-level bug causes a higher-level program to malfunction, you can still blame "buggy code" even if the higher-level program itself is bug-free.

Comment author: Nick_Tarleton 11 October 2009 12:23:19AM 2 points

"An emotion that doesn't correlate with reality is itself a bug."

Even if it's advantageous to the agent's goals (not evolutionary fitness)? Emotions don't have XML tags that say "this should map to reality in the following way".

Comment author: Yasser_Elassal 11 October 2009 05:28:51PM 3 points

My response was to Christian's implication that a rationality program isn't necessarily buggy for outputting irrational behaviors because it must account for human emotions. My point was that human emotions are part of the human rationality program (whether we can edit our source code or not) and that if they cause an otherwise bug-free rationality program to output irrational behaviors, then the emotions themselves are the bugs.

In your response, you asked about emotions that produce behaviors advantageous to the agent's goals. But that is rational behavior, not the irrational behavior stipulated in Christian's post.

If those emotions are part of an otherwise bug-free rationality program that outputs rational beliefs and behaviors, then there is no bug. And that's what it means for an emotion to be correlated with reality, precisely because there are no XML tags mapping certain neural spike patterns (i.e. emotions) to the state of reality.

Emotions aren't beliefs about the world that can be verified by looking at the territory. Emotions are threads within the running program that maps and traverses the territory, so the only thing it can mean for them to correlate with reality is that they don't cause the program to malfunction.

What I was trying to point out to Christian is that emotions are part of the system, not outside of it. So if the system produces irrational behavior, then the system as a whole is irrational, even if some of the subroutines are rational in isolation.

The irrationality of the emotions doesn't somehow cancel out the irrationality of the outputs to make the whole system rational.