Qiaochu_Yuan comments on The flawed Turing test: language, understanding, and partial p-zombies - Less Wrong

Post author: Stuart_Armstrong 17 May 2013 02:02PM




Comment author: Qiaochu_Yuan 18 May 2013 03:43:35AM 4 points

I don't think that's dissolving far enough. The questions those questions are stand-ins for, I think, are questions like "does X deserve legal consideration?" or "does X deserve moral consideration?" and we might as well be explicit about this.

Comment author: elharo 18 May 2013 04:55:54PM 5 points

I don't think those questions are mere stand-ins. I think the answers to "does X deserve legal consideration?" or "does X deserve moral consideration?" depend heavily on "Is X conscious?" and "Does X experience pain/pleasure?" That is, if we answer "Is X conscious?" and "Does X experience pain/pleasure?" then we can answer "does X deserve legal consideration?" and "does X deserve moral consideration?"

If "Is X conscious?" and "Does X experience pain/pleasure?" simply stand-ins for "does X deserve legal consideration?" or "does X deserve moral consideration?", then if we answered the latter two we'd stop caring about the former. I don't think that's so. There are still very interesting, very deep scientific questions to be answered about just what it means when we say something is conscious.

The problem is that I, for one, don't know what the question "Is X conscious?" means and I'm not sure how to judge "Does X experience pain/pleasure?" in a non-biological context either. Nor has anyone else ever convinced me they know the answers to these questions. Still, it does seem as if neurobiology is making slow progress on these questions so they're probably not intractable or meaningless. When all is said and done, they may not mean exactly what we vaguely feel they mean today; but I suspect that "conscious" will be more like the concept of "atom" than the concept of "ether". I.e. we'll recognize a clear connection between the original use of the word and the much more refined and detailed understanding we eventually come to. On the other hand, I could be wrong about that; and consciousness could turn out to be as useless a concept as ether or phlogiston.

Comment author: TheOtherDave 18 May 2013 04:01:00AM 1 point

Yeah, I waffled about this and ultimately decided not to say that, but I'm not confident.

I'm not really clear on whether what people are really asking is (e.g.) "does X deserve moral consideration?", or whether it just happens to be true that people believe (e.g.) that it's immoral to cause pain, so if X can experience pain it's immoral to cause X pain, and therefore X deserves moral consideration.

But I agree with you that if the former turns out to be true, then that's the right question to be asking.

Admittedly, my primary reason for being reluctant to accept that is that I have no idea how to answer that question a priori, so I'd rather not ask it... which of course is more bias than evidence.

Comment author: Manfred 18 May 2013 03:55:14AM 1 point

So how do you decide whether or not X deserves moral consideration? Based on something like long-term interactions, or by looking at its code? I mean, if the real question is "how do I feel about X," we might as well be explicit about that too.

Comment author: Qiaochu_Yuan 18 May 2013 05:35:13AM 3 points

Dunno. But I'd rather admit my ignorance about the right question.