Vladimir_Nesov comments on Communicating rationality to the public: Julia Galef's "The Straw Vulcan" - Less Wrong

21 Post author: lukeprog 25 November 2011 10:57PM




Comment author: Vladimir_Nesov 26 November 2011 03:35:07PM *  4 points [-]

I don't think she implies that emotions are necessary for implementing a goal

That phrase was primarily in reply to daenerys, not Julia.

Can we trace the flow chart back to any entirely non-emotional desires/preferences? I suspect that it would quickly become a semantic issue surrounding the word "emotion."

What about laws of physics, or evolution? While true (if technically vague) explanations for actions, they are not true cognitive or decision theoretic or normative reasons for actions. See this post.

Comment author: KatieHartman 26 November 2011 03:37:56PM *  2 points [-]

Upvoted for the clarification. Thanks!

What about laws of physics, or evolution? While true (if technically vague) explanations for actions, they are not true cognitive reasons for actions.

"I don't want to die," for example, is obviously both an emotional preference and the result of the natural evolution of the brain. That the brain is an evolved organ isn't disputed here.

Comment author: [deleted] 26 November 2011 07:03:05PM *  2 points [-]

Upvoting everyone. This was a really useful conversation, and I'm pretty sure I was wrong, so I definitely learned something. The evolutionary drives example was much more useful to me than the AI example. Thanks!

(Though I'm still of the opinion that the speech was great without that info; as an introduction to the topic, I don't expect it to cover everything.)

Comment author: Vladimir_Nesov 26 November 2011 07:19:20PM *  0 points [-]

There are explanations of different kinds that hold simultaneously. An explanation of the wrong kind (for example, an evolutionary explanation) that is merely similar to the relevant explanation (of the right kind — here "goals", a normative or at least cognitive explanation) because of shared underlying reasons can still be used to get correct answers, as a heuristic (evolutionary psychology has a bit of predictive power about human behavior and even goals). This makes it even easier to confuse the two: instead of serving as a rule of thumb, a source of knowledge, the explanation of the wrong kind can end up taking a role that doesn't belong to it, becoming a definition of the thing being sought. For example, "maximizing inclusive fitness" can come to be believed an actual human goal.