Eliezer_Yudkowsky comments on [LINK]s: Who says Watson is only a narrow AI? - Less Wrong

Post author: shminux 21 May 2013 06:04PM




Comment author: Eliezer_Yudkowsky 21 May 2013 10:39:07PM 6 points

I say Watson is only a narrow AI.

Comment author: shminux 21 May 2013 11:02:09PM 2 points

:)

But isn't it getting too wide too quickly?

Anyway, I am guessing that by your definition the difference between a narrow and a general AI is not the number of problem-solving or reasoning tasks where it is as good as or better than humans, even if that covers the vast majority of such tasks, but whether it has a "general, flexible learning ability that would let them tackle entirely new domains", i.e. whether it is vastly better than an average single human being, who generally sucks at adapting to "new domains".

Comment author: Manfred 22 May 2013 12:24:59AM 0 points

the number of problem solving or reasoning tasks where it is as good as or better than humans

What, four?

Comment author: shminux 22 May 2013 01:32:10AM 7 points

Sigh. Charitable reading is not a strong suit of this place. Not four. How about 100? 1000? Would that be enough?

Comment author: Manfred 22 May 2013 01:35:10AM 2 points

Sorry, the silliness was too tempting. But however you want to count the number of things that go into what Watson does, it really is a small portion of the things humans can do.

Comment author: Decius 22 May 2013 04:52:13AM 3 points

What exponent is on the number of things that humans can do, generalized to the degree of "drive cars"?

Comment author: Manfred 22 May 2013 02:31:58PM 1 point

Hm, tough question. One way to get a quick lower bound might be "what's something that uses the same general skills as driving cars, but is very different, and in how many ways is it different?" So if we use the same spatial skills to do knitting, and we say it's different from driving a car in about 10 ways (where what we consider a "way" sets our scale), then there are at least 2^10 things that use the same skills as, but are different from, knitting and driving cars (among other bad assumptions, this assumes that two things being alike in some way is binary and transitive). If there are 10 domains (everything is approximately 10) like "spatial skills and spatial planning, basic motor coordination," then the lower bound would be more like 2^100.
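Manfred's back-of-the-envelope bound above can be sketched in a few lines. This is only an illustration of the arithmetic under his stated (and self-admittedly shaky) assumptions; the function name `task_lower_bound` and the counts of "ways" are hypothetical, not anything from the thread beyond the numbers he gives.

```python
# Sketch of the comment's lower-bound argument, assuming each "way" a task
# can differ is an independent binary choice, so tasks sharing the same
# underlying skills number at least 2 ** num_ways.

def task_lower_bound(num_ways: int) -> int:
    """Lower bound on distinct tasks, treating each 'way' as a binary toggle."""
    return 2 ** num_ways

# One domain differing in ~10 ways: at least 2^10 = 1024 tasks.
print(task_lower_bound(10))   # 1024

# Ten domains of ~10 ways each: the 2^100 figure from the comment.
print(task_lower_bound(100))
```

The "binary and transitive" caveat in the comment is exactly what this model bakes in: each way of differing is on/off and independent, which is why the count multiplies to a power of two.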

Comment author: Decius 23 May 2013 01:30:19AM 2 points

If all of the things that humans can do are required for knitting and driving cars, then there are two things that humans can do, generalized to that level. If an AI could learn the hard way to drive and to knit, it would be able to do everything a human could do. I estimate that controlling vehicles is about four different skills by that definition (road vehicles, fixed-wing, rotary-wing, and reaction-mass spacecraft), but knitting, crocheting, and sewing are the same skill, and there are probably only two or three different skills that cover all of athletics (an AI that could learn to play football would probably be able to learn curling, but it might not be able to learn gymnastics or swimming).

I think that our existing AIs haven't demonstrated that they can learn to do anything the hard way. I could be wrong, because I don't have any deep insight into whether existing AIs learn or are created with full knowledge.

Comment author: MugaSofer 23 May 2013 02:03:18PM 0 points

Well that's settled then.

(Incidentally, I think everyone agrees it's narrow now, but it does make "joining up" narrow AI sound more plausible than before.)