Contra this post from the Sequences

In his sequence post "Feeling Rational", Eliezer makes the following (excellent) point:

> I can’t find any theorem of probability theory which proves that I should appear ice-cold and expressionless.

This debunks the then-widely-held view that rationality is opposed to emotion. He then goes on to claim that emotions have the same epistemic status as the beliefs they are based on:

> For my part, I label an emotion as “not rational” if it rests on mistaken beliefs, or rather, on mistake-producing epistemic conduct. “If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm.”

I think Eliezer is making a type error here. When he says "rational", he is of course talking about epistemic rationality. However, epistemic rationality is a property of beliefs, not emotions. In other words, I can't find any theorem of probability theory which proves that I should feel sad when my expected utility decreases, and happy when it increases.[1]

The only type of "rationality" that can apply to emotions is instrumental rationality, i.e. "the science of winning at life", and the most instrumentally rational emotions don't always stem from beliefs in the intuitive way that Eliezer describes in his sequence post.

Example: Being sad about a high P(doom) can make you less productive at reducing P(doom), and it can also incentivize you to deceive yourself into believing a lower P(doom).

If I see more good examples in the comments, I will add them.

  1. ^

    ...and fear when there is high probability mass on a future event that would result in a large loss of utility, and anger when another agent causes your utility to go down, and curiosity when you have an opportunity to gain useful information, etc.

5 comments:

It seems instrumental rationality is an even worse tool for classifying emotions as "irrational". Instrumental rationality is about actions, or intentions and desires, but emotions are none of those. We can decide what to do, but we can't decide what emotions to have.

Emotions can be treated as properties of the world, optimized with respect to constraints like anything else. We can't edit our emotions directly, but we can influence them.

We can "influence" them only insofar we can "influence" what we want or believe: to a very low degree.

(Off the top of my head; maybe I’ll change my mind if I think about it more or see a good point.) What can be destroyed by truth, shall be. Emotions and beliefs are entangled. If you don’t think about how high p(doom) actually is because, in the back of your mind, you don’t want to be sad, you end up working on things that don’t reduce p(doom).

As long as you know the truth, emotions matter only according to your terminal values. But many feelings feed back into what we end up believing, through motivated cognition, etc.

Eliezer decided to apply the label "rational" to emotions resulting from true beliefs. I think this is an understandable way to apply that word. I don't think you and Eliezer disagree about anything substantive except the application of that label.

That said, your point about reserving the label "rational" for things strictly related to the fundamental laws governing beliefs is a good one. I agree it might be a better way to use the word.

My reading of Eliezer's choice is this: you use the word "rational" for the laws themselves. But you also use the word "rational" for beliefs and actions that are correct according to the laws (e.g., "It's rational to believe x!"). In the same way, you can also use the word "rational" for emotions directly caused by rational beliefs, whatever those emotions might be.

About the instrumental rationality part: if you are strict about only applying the word "rational" to the laws of thinking, then you shouldn't use it to describe emotions even when you are talking about instrumental rationality, although I agree that usage seems closer to the original meaning, since there isn't the additional causal step. It's closer in the way that "rational belief" is closer to the original meaning. But note that this holds only insofar as you can control your emotions and treat them at the same level as actions. Otherwise, it would be like saying "the state of the world x that helps me achieve my goals is rational", which I haven't heard anywhere.