Eliezer_Yudkowsky comments on What I Think, If Not Why - Less Wrong

Post author: Eliezer_Yudkowsky 11 December 2008 05:41PM


Comment author: Eliezer_Yudkowsky 12 December 2008 09:37:00PM 3 points

Wei, when you're trying to create intelligence, you're not trying to make it human; you're trying to make it rational.

When it comes to morality - well, my morality doesn't say that things are right in virtue of my brain thinking them, but as it happens, my morality is physically written down only in my brain and nowhere else in the physical universe. Likewise with all other humans.

So to get a powerful moral intelligence, you've first got to create intelligence using an implementation-independent understanding, and then direct that intelligence to acquire certain information from physical human brains (because that information exists nowhere else), whether that has to be done by directly scanning a brain via nondestructive nanotech, or can be confidently inferred just from examining the causal shadows of brains (like their written words).