Julia_Galef comments on Communicating rationality to the public: Julia Galef's "The Straw Vulcan" - Less Wrong

Post author: lukeprog, 25 November 2011 10:57PM


Comment author: Julia_Galef 26 November 2011 08:52:24PM 5 points

Good question. My intended meaning was closest to (h). (Although isn't (g) pretty much equivalent?)

Comment author: [deleted] 26 November 2011 09:08:55PM 3 points

Yay! Word of God on the issue! (Warning: TV Tropes.) Good to know I wasn't too far off-base.

I can see how (g) and (h) can be considered equivalent via the mapping emotions → goals. In fact, I would assume that would also make (a) and (b) pretty much equivalent, as well as (c) and (d), (e) and (f), etc.

Comment author: Julia_Galef 26 November 2011 10:36:15PM 9 points

Incidentally, the filmmaker didn't capture my slide with the diagram of the revised model of rationality and emotions in ideal human* decision-making, so I've uploaded it.

The Straw Vulcan model of ideal human* decision-making: http://measureofdoubt.files.wordpress.com/2011/11/screen-shot-2011-11-26-at-3-58-00-pm.png

My revised model of ideal human* decision-making: http://measureofdoubt.files.wordpress.com/2011/11/screen-shot-2011-11-26-at-3-58-14-pm.png

*I realize now that I need this modifier, at least on Less Wrong!

Comment author: lessdazed 27 November 2011 01:58:42AM 0 points

If emotions are necessary but not sufficient for forming goals among humans, then the claim might be only that rationality has no normative value to humans without goals, leaving unaddressed rationality's normative value to humans who have emotions but lack goals.

If you see them as equivalent, this implies that you believe emotions are necessary and sufficient for forming goals among humans.

Even granting that this is true for humans, it would be strange to say that once goals are formed, a person's loss of emotion would obviate all their already-formed non-emotional goals. So you're not just discussing the human case rather than the AI case; you're discussing the typical human.

Comment author: [deleted] 27 November 2011 02:14:25AM 0 points

How would you rephrase Julia's sentence (with the same rough word count)?