Daniel_Burfoot comments on What if AI doesn't quite go FOOM? - Less Wrong

11 Post author: Mass_Driver 20 June 2010 12:03AM




Comment author: Daniel_Burfoot 20 June 2010 02:33:06PM *  2 points [-]

we have well-scaling methods, which lack generalization power (statistical methods, neural nets, SVMs, deep belief networks)

You've got it backwards. These methods do have generalization power, especially the SVM (achieving generalization is the whole point of the VC theory on which it is based), but they don't scale well.
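The scaling point can be made concrete: a kernel SVM works with the n × n Gram matrix over the training set, so memory and compute grow at least quadratically with sample count. A minimal NumPy sketch (the function name `rbf_kernel_matrix` is my own, not from any library):

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=1.0):
    """Compute the RBF Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).

    The result is n x n for n samples, which is why kernel SVM
    training cost blows up as the dataset grows.
    """
    sq = np.sum(X ** 2, axis=1)
    # Pairwise squared distances via the expansion ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

for n in (100, 200, 400):
    K = rbf_kernel_matrix(np.random.rand(n, 5))
    # Doubling n quadruples the Gram matrix: n^2 entries to store and process.
    print(n, K.shape)
```

Doubling the training set quadruples the matrix, before even counting the (typically superquadratic) cost of the QP solver run on top of it.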

Comment author: red75 20 June 2010 04:08:01PM 0 points [-]

Yes, bad wording on my part. I meant something like the capability of representing and operating on complex objects, situations, and relations. However, that doesn't invalidate my (quite trivial) point that we don't yet have a practical theory of AGI.