MugaSofer comments on Nonperson Predicates - Less Wrong

Post author: Eliezer_Yudkowsky 27 December 2008 01:47AM

You are viewing a single comment's thread. Show more comments above.

Comment author: Luke_A_Somers 15 January 2013 04:13:16PM 0 points

Is a human mind the simplest possible mind that can be sentient? What if, in the course of trying to model its own programmers, a relatively younger AI manages to create a sentient simulation trapped within itself? How soon do you have to start worrying? Ask yourself that fundamental question, "What do I think I know, and how do I think I know it?"

I read this as positing a mind simpler than a real human mind. Since it's simpler, the abstractions it uses are going to be imperfect, and the resulting design would be artificial in some way. That's not as explicit as I made it sound, but I still think the implication is pretty strong.

Comment author: MugaSofer 21 January 2013 09:22:09AM * -2 points

"Is a human mind the simplest possible mind?"

"But if it was simpler, it wouldn't be human!"

Downvoted.

Comment author: Luke_A_Somers 21 January 2013 02:20:17PM 1 point

What? That's completely irrelevant to the question at hand.

In raising the question of whether simpler-than-human minds are possible in this context, Eliezer was clearly thinking about such minds and giving them moral weight. He doesn't need to ANSWER the question I was posing to make that much clear.

Comment author: MugaSofer 21 January 2013 03:44:35PM 0 points

Wait, what?

*Clicks "Show more comments above."*

Oops. I thought you were replying to the quoted text. Upvoted and retracted my comment.