
KatjaGrace comments on SRG 4: Biological Cognition, BCIs, Organizations - Less Wrong Discussion

7 Post author: KatjaGrace 07 October 2014 01:00AM


Comment author: KatjaGrace 07 October 2014 03:02:52AM 4 points

'Let an ultraintelligent person be defined as a person who can far surpass all the intellectual activities of any other person however clever. Since the improvement of people is one of these intellectual activities, an ultraintelligent person could produce even better people; there would then unquestionably be an 'intelligence explosion,' and the intelligence of ordinary people would be left far behind. Thus the first ultraintelligent person is the last invention that people need ever make, provided that the person is docile enough to tell us how to keep them under control.'

Does this work?

Comment author: paulfchristiano 07 October 2014 03:18:06PM 2 points

Looks good to me, with the same set of caveats as the original claim. Though note that both arguments are bolstered if "improvement of people" or "design of machines" in the second sentence is replaced by a more exhaustive inventory. Would be good to think more about the differences.

Comment author: KatjaGrace 08 October 2014 07:31:56PM 2 points

What caveats are you thinking of?

Comment author: CarlShulman 07 October 2014 11:20:14PM 2 points

This application highlights a problem in that definition, namely gains from specialization. Say you produced humans with superhuman general intelligence as measured by IQ tests, maybe the equivalent of 3 SD above von Neumann. Such a human still could not simultaneously be an expert in every field of intellectual activity, due to time and memory constraints.

The superhuman could perhaps master any given field better than any human, given some time for study and practice, but could not master all of them at once without wildly superhuman prowess. This overkill requirement is somewhat like the way a rigorous Turing Test requires not only humanlike reasoning, but also a tremendous ability to tell a coherent fake story about biographical details, etc.

Comment author: Jeff_Alexander 24 October 2014 03:04:23AM 0 points

For me, it "works" similarly to the original, but it emphasizes (1) the underspecification of "far surpass", and (2) that creating a greater intelligence may require resources (intellectual or otherwise) beyond those of the proposed ultraintelligent person. An ultraintelligent wasp might qualify as far superior in all intellectual endeavors to a typical wasp, yet still be unable to invent and build a simple computing machine, never mind construct a greater intelligence.