timtyler comments on Intuitive supergoal uncertainty - Less Wrong

4 Post author: JustinShovelain 04 December 2009 05:21AM


Comment author: Mitchell_Porter 05 December 2009 04:28:15AM 9 points

It's a total digression from this post, but: it occurs to me that someone ought to try to figure out what the "supergoal" or utility function of C. elegans is, or what the coherent extrapolated volition of the C. elegans species might be. That organism's nervous system has been mapped down to every last neuron (not so hard, since there are only about 300 of them). If we can't make a C. elegans-Friendly AI given that information, we certainly can't do it for H. sapiens.

Comment author: timtyler 09 December 2009 03:59:01PM 0 points

In a nutshell, it's to make more copies of the C. elegans genome:

http://en.wikipedia.org/wiki/God's_utility_function