timtyler comments on Intuitive supergoal uncertainty - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (27)
It's a total digression from this post, but: it occurs to me that someone ought to try to figure out what the "supergoal" or utility function of C. elegans is, or what the coherent extrapolated volition of the C. elegans species might be. That organism's nervous system has been mapped down to every last neuron (not so hard, since there are only about 300 of them). If we can't make a C. elegans-Friendly AI given that information, we certainly can't do it for H. sapiens.
In a nutshell, it's to make more copies of the C. elegans genome:
http://en.wikipedia.org/wiki/God's_utility_function