
Bugmaster comments on Can't Unbirth a Child - Less Wrong

Post author: Eliezer_Yudkowsky, 28 December 2008 05:00PM



Comment author: Bugmaster 18 June 2012 09:03:43PM 0 points

> What exactly makes both me and EY and presumably many others think sentience is a thing and distinguish "sentient" and "non-sentient"?

Wait, is "sentient" actually a thing ? I always thought that it was just a shorthand we use for describing a wide gamut of phenomena. Humans are quite sentient, chimps less so, dogs even less so, our current AIs even less sentient than that, and rocks aren't sentient at all. Am I wrong about this ?

Comment author: [deleted] 18 June 2012 09:06:28PM 1 point

That is what I am trying to discern: is "sentient" a computational property, or is it reducible to "why does my brain make me think it"?

I agree with your statement, but I fail to see how to distinguish a "sentient" super-intelligence from a "non-sentient" one.

In general I am confused.

Comment author: Bugmaster 18 June 2012 09:23:10PM 1 point

Is "sentient" a computational property or reducible to "why does my brain make me think it."

I'm not entirely sure what "why does my brain make me think it" means, but I've just noticed that I incorrectly used the word "sentient" in its science-fictional sense; I should have said something like "sapient" instead. The word "sentient" is often incorrectly used (e.g. by me) to mean "capable of rational thought and communication", whereas the more correct definition is "capable of having subjective experiences".

As luck would have it, my previous comment applies to both meanings of the word, but still, they are distinct (though probably related). I apologize for the confusion.

Comment author: TheOtherDave 18 June 2012 10:24:29PM 1 point

> I fail to see how to distinguish a "sentient" super-intelligence from a "non-sentient" one.

Well, you could ask it whether it has subjective experience and trust its self-report. That's basically the same strategy we use for other intelligences, after all.

Comment author: [deleted] 18 June 2012 11:42:36PM 0 points

And we return to the black box of subjective experience.

Comment author: Bugmaster 18 June 2012 11:44:31PM 0 points

What do you mean by "black box"? If the AI (or alien or uplifted dolphin or whatever) tells me that it has subjective experiences, why shouldn't I take it at its word?

Comment author: [deleted] 19 June 2012 12:36:21AM 0 points

Oh, I am not denying that they exist, just saying I don't know of a solid theory of subjective experience. I think there was something about Bayesian {Predictive world model, planning engine, utility function, magic AI algorithm} AIs would not have philosophy.

Comment author: Bugmaster 19 June 2012 12:42:37AM 0 points

> I think there was something about Bayesian {Predictive world model, planning engine, utility function, magic AI algorithm} AIs would not have philosophy.

Sorry, I have trouble parsing this sentence. But in general, I don't think we need a detailed theory of subjective experiences (assuming that it even makes sense to conceive of such a theory) in order to determine whether some entity is sentient -- as long as that entity is also sapient, and capable of communication. If that's the case, then we can just ask it, and trust its word. If that's not the case, then I agree, we have a problem.