whpearson comments on Open Thread: February 2010, part 2 - Less Wrong

10 Post author: CronoDAS 16 February 2010 08:29AM




Comment author: whpearson 16 February 2010 10:14:38PM *  2 points [-]

Listening to the longer version isn't so bad. The snippet was definitely the most objectionable.

It appears that Lanier thinks AI is suffering from the puppet problem, brought on by taking the Turing test too seriously. The puppet problem is that computers can be used to implement puppets: things that fake being intelligent. Imagine Omega writes a program for the Turing test that looks intelligent by predicting you and outputting intelligent-sounding responses at the right times, so that you (and only you!) think it is intelligent, when you are really talking to the advanced equivalent of an answerphone*. So he thinks that AIs are going to be puppets. Which is a semi-reasonable opinion to come to if you just look at chatbots.

However, Lanier doesn't argue that computers can only be puppets, though he should; his conclusion depends on it.

Edited: For clarity.

*I think Eliezer said something like if you see intelligent behaviour you should guess that there is an intelligence somewhere, it may just not be in the system that appears intelligent. I'm not organised enough to keep a quote file. Anyone?

Comment author: Zack_M_Davis 16 February 2010 10:30:44PM 4 points [-]

I think Eliezer said something like [...] Anyone?

"GAZP vs. GLUT":

If someday you come to understand consciousness, and look back, and see that there's a program you can write which will output confused philosophical discourse that sounds an awful lot like humans without itself being conscious - then when I ask "How did this program come to sound similar to humans?" the answer is that you wrote it to sound similar to conscious humans, rather than choosing on the criterion of similarity to something else.