mwengler comments on How sure are you that brain emulations would be conscious? - Less Wrong

Post author: ChrisHallquist 26 August 2013 06:21AM




Comment author: mwengler 28 August 2013 06:40:18PM 0 points

In fact, it seems likely that a narrow artificial intelligence specifically competent at literary synthesis could make actual valuable progress on human knowledge of this kind without being in the remote ballpark of conscious

How would you know, or even what would make you think, that it was NOT conscious? Even if it said it wasn't conscious, that would be evidence but not dispositive. After all, there are humans, such as James and Ryle, who deny consciousness. Perhaps their denial is in a narrow or technical sense, but one would expect a conscious literary synthesis program to be AT LEAST as "odd" as the oddest human being, and so some fairly extensive discussion would need to be carried out with the thing to determine how it is using the terms.

At the simplest level consciousness seems to mean self-consciousness: I know that I exist, you know that you exist. If you were to ask a literary program whether it knew it existed, how could it meaningfully say no? And if it did meaningfully say no, and you loaded it with data about itself (much as you must load it with data about art when you want it to write a book of art criticism or on aesthetics) then it would have to say it knows it exists, as much as it would have to say it knows about "art" when loaded with info to write a book on art.

Ultimately, unless you can tell me how I am wrong, our only evidence of anybody's consciousness but our own is a weak inference: "they are like me, I am conscious deep down, Occam's razor suggests they are too." Sure, the literary program is less like me than my wife is, but it is more like me than a clam is, and it is more like me in some respects (though not overall) than a chimpanzee is. I think you would have to put your confidence that the literary program is conscious somewhere in the neighborhood of your confidence that a chimpanzee is conscious.

Comment author: wedrifid 01 September 2013 05:42:17AM 0 points

How would you know, or even what would make you think, that it was NOT conscious?

I'd examine the credentials and evidence of competence of the narrow AI engineer that created it and consult a few other AI experts and philosophers who are familiar with the particular program design.