DanArmak comments on A Nightmare for Eliezer - Less Wrong

Post author: Madbadger 29 November 2009 12:50AM


Comment author: DanArmak 29 November 2009 05:42:58PM 3 points

Intelligence isn't a magical single-dimensional quality. It may be generally smarter than EY, but not have the specific FAI theory that EY has developed.

Comment author: AndrewKemendo 30 November 2009 01:50:06AM 0 points

Any AGI will have all the dimensions required for human-level or greater intelligence. If it is indeed smarter, then it will be able to figure out the theory itself, if the theory is obviously correct, or find a more efficient way to get it.

Comment author: DanArmak 30 November 2009 05:14:12AM 2 points

Well, maybe the theory is inobviously correct.

The AI called EY because it's stuck while trying to grow, so it hasn't achieved its full potential yet. It should be able to comprehend any theory a human EY can comprehend; but I don't see why we should expect it to independently derive, in (small) finite time and without all the data available to that human, any theory a human could derive over a lifetime.

Comment author: Johnicholas 29 November 2009 05:45:43PM 0 points

Yay multidimensional theories of intelligence!