lessdazed comments on Communicating rationality to the public: Julia Galef's "The Straw Vulcan" - Less Wrong

21 Post author: lukeprog 25 November 2011 10:57PM

You are viewing a single comment's thread.

Comment author: lessdazed 26 November 2011 08:28:02PM *  3 points

Emotions are clearly necessary for forming the goals, rationality is simply lame without them.

What does this mean?

a) Emotions are logically necessary for forming goals; rational beings are incapacitated without emotions.
b) Emotions are logically necessary for forming goals; rational beings are incapacitated without goals.
c) Emotions are logically necessary for forming goals; rationality has no normative value to a rational being without emotions.
d) Emotions are logically necessary for forming goals; rationality has no normative value to a rational being without goals.
e) Emotions are necessary for forming goals among humans; rational humans are incapacitated without emotions.
f) Emotions are necessary for forming goals among humans; rational humans are incapacitated without goals.
g) Emotions are necessary for forming goals among humans; rationality has no normative value to humans without emotions.
h) Emotions are necessary for forming goals among humans; rationality has no normative value to humans without goals.
i) (Other.)

Comment author: Julia_Galef 26 November 2011 08:52:24PM 5 points

Good question. My intended meaning was closest to (h). (Although isn't (g) pretty much equivalent?)

Comment author: [deleted] 26 November 2011 09:08:55PM 3 points

Yay! Word of God on the issue! (Warning: TvTropes). Good to know I wasn't too far off-base.

I can see how g and h can be considered equivalent given the implication emotions -> goals. In fact, I would assume that would also make a and b pretty much equivalent, as well as c and d, e and f, etc.

Comment author: Julia_Galef 26 November 2011 10:36:15PM *  9 points

Incidentally, the filmmaker didn't capture my slide with the diagram of the revised model of rationality and emotions in ideal human* decision-making, so I've uploaded it.

The Straw Vulcan model of ideal human* decision-making: http://measureofdoubt.files.wordpress.com/2011/11/screen-shot-2011-11-26-at-3-58-00-pm.png

My revised model of ideal human* decision-making: http://measureofdoubt.files.wordpress.com/2011/11/screen-shot-2011-11-26-at-3-58-14-pm.png

*I realize now that I need this modifier, at least on Less Wrong!

Comment author: lessdazed 27 November 2011 01:58:42AM 0 points

If emotions are necessary but not sufficient for forming goals among humans, the claim might be that rationality has no normative value to humans without goals, without thereby addressing rationality's normative value to humans who have emotions but don't have goals.

If you see g and h as equivalent, this implies that you believe emotions are necessary and sufficient for forming goals among humans.

As much as this might be true for humans, it would be strange to say that after goals are formed, the loss of emotion in a person would obviate all their already-formed non-emotional goals. So it's not just that you're discussing the human case rather than the AI case; you're discussing the typical human.
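The necessity-versus-sufficiency point above can be checked mechanically. The following Python sketch (my own illustration, not from the thread) enumerates truth assignments for E ("has emotions") and G ("has goals") and tests whether claims g ("no normative value without emotions") and h ("no normative value without goals") pick out the same humans under a given constraint:

```python
from itertools import product

def claims_coincide(constraint):
    """Check whether 'humans without emotions' and 'humans without goals'
    pick out exactly the same cases, given a constraint relating E and G."""
    for E, G in product([True, False], repeat=2):
        if not constraint(E, G):
            continue  # this assignment is ruled out by the constraint
        if (not E) != (not G):
            return False  # claim g applies where h does not, or vice versa
    return True

# Emotions merely necessary for goals: G implies E.
necessary = lambda E, G: E or not G
# Emotions necessary and sufficient for goals: E iff G.
necessary_and_sufficient = lambda E, G: E == G

print(claims_coincide(necessary))                 # False: E=True, G=False is allowed
print(claims_coincide(necessary_and_sufficient))  # True: g and h coincide
```

With mere necessity, the assignment E=True, G=False survives the constraint, which is exactly lessdazed's case of a human with emotions but no goals; only necessity-and-sufficiency makes g and h equivalent.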

Comment author: [deleted] 27 November 2011 02:14:25AM 0 points

How would you rephrase Julia's sentence (with the same rough word count)?

Comment author: [deleted] 26 November 2011 08:59:36PM *  0 points

From the context of her talk I have a high confidence that the "them" at the end of her sentence refers to emotions, not goals. Therefore I would reject translations b, d, f, and h.

I would also reject a as far too sweeping for the level of her talk.

Also from the context of her talk I would say that the "normative value" translations are much more likely than the "incapacitated" translations. My confidence in this is much lower than my confidence in my first assertion though.

That leaves us with c, g, and other. I've already argued that her talk was implicitly about human rationality, leaving us with g or other.

Can't think of a better option, so my personal opinion is g.