JonathanGossage comments on Superintelligence Reading Group - Section 1: Past Developments and Present Capabilities - Less Wrong Discussion

25 Post author: KatjaGrace 16 September 2014 01:00AM

Comments (232)

Comment author: JonathanGossage 17 September 2014 05:57:23PM 2 points [-]

Programming and debugging, although far from trivial, are the easy part of the problem. The hard part is determining what the program needs to do. I think the coding and debugging parts will not require AGI levels of intelligence; deciding what to do, however, definitely needs at least human-like capacity for most non-trivial problems.

Comment author: KatjaGrace 22 September 2014 03:20:18AM 2 points [-]

I'm not sure what you mean when you say 'determining what the program needs to do' - this sounds very general. Could you give an example?

Comment author: LeBleu 07 October 2014 08:42:03AM 0 points [-]

Most programming is not about writing the code; it is about translating a human description of the problem into a computer description of the problem. This is also why every attempt so far to build a system so simple that "non-programmers" can program it has failed. The difficult aptitude for programming is the ability to think abstractly and systematically: to recognize which parts of a human description of the problem need to be translated into code, and which unspoken parts also need to be translated into code as well.
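To illustrate the point about unspoken requirements, here is a small hypothetical sketch (the task, function name, and normalization rules are my own invention, not from the thread). The human request is just "find duplicate customer emails"; most of the code handles conditions the request never mentioned:

```python
def find_duplicate_emails(emails):
    """Report email addresses that appear more than once.

    The stated requirement was only "find duplicate emails"; the
    normalization below encodes requirements left unspoken.
    """
    counts = {}
    for raw in emails:
        if raw is None:            # unspoken: some records lack an email
            continue
        key = raw.strip().lower()  # unspoken: "Bob@x.com " == "bob@x.com"
        if key:                    # unspoken: ignore empty strings
            counts[key] = counts.get(key, 0) + 1
    return sorted(k for k, n in counts.items() if n > 1)

print(find_duplicate_emails(["Bob@x.com", "bob@x.com ", None, "ann@y.org"]))
# → ['bob@x.com']
```

Deciding that case and trailing whitespace should not distinguish two addresses, and that missing values should be skipped rather than crash the program, is exactly the translation work the comment describes.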