Bugmaster comments on How sure are you that brain emulations would be conscious? - Less Wrong

Post author: ChrisHallquist 26 August 2013 06:21AM




Comment author: Bugmaster 28 August 2013 10:06:04PM 0 points

Unfortunately the video you linked to is offline, but still:

The bar to actually have an interesting conversation is likely well below that needed for whole brain emulation.

Is there a chatbot in existence right now that could participate in the conversation we are having right here?

Yes, people befriend chatbots and sticks, as well as household pets, but I am not aware of any cat or stick that could pass for a human on the internet. Furthermore, most humans (with the possible exception of some die-hard cat ladies) would readily agree that cats are nowhere near as intelligent as humans, and neither are plants.

The Turing Test requires the agent to act indistinguishably from a human; it does not merely require other humans to befriend the agent.

Comment author: JoshuaZ 29 August 2013 04:08:52AM 0 points

Sure, but there's likely a large gap between "have an interesting conversation" and "pass the Turing test". The second is likely much more difficult than the first.

Comment author: Bugmaster 29 August 2013 04:13:23AM 0 points

I was using "interesting conversation" as shorthand for "the kind of conversation that usually occurs on Less Wrong, for example the one we are having right now" (which, admittedly, may or may not be terribly interesting, depending on who's listening).

Do you believe that passing the Turing Test is much harder than fully participating in our current conversation (and doing so at least as well as we are doing right now)? If so, why?