Aleksei_Riikonen comments on The Concepts Problem - Less Wrong

9 Post author: Kaj_Sotala 16 April 2010 06:21AM




Comment author: Aleksei_Riikonen 16 April 2010 09:21:10AM *  0 points

Hardcoding a knowledge ontology that would include e.g. all concepts humans have ever thought of is theoretically possible, since those concepts contain only a finite amount of complexity. It's just that this would take so very long...

Anyway, I wouldn't rule out that a sufficient knowledge ontology for an FAI could be semi-manually constructed in a century or two, or perhaps a few millennia. It is also theoretically possible that all major players in the world come to an agreement that until then, very strong measures need to be taken to prevent anyone from building anything-that-could-go-UFAI.

I of course wouldn't claim this probability to be particularly high.

Comment author: Risto_Saarelma 16 April 2010 04:14:31PM 4 points

You might actually be able to do some back-of-the-envelope calculations on this. Humans are slow learners, and end up with reasonable ontologies in a finite number of years. By this old estimate, humans learn two bits' worth of long-term memory content per second. Assuming that people learn at this rate during 16 hours of waking time each day of their life, this would end up with something like 32 megabytes of accumulated permanent memory for a 13-year-old. 13-year-olds can have most of the basic world ontology fixed, and that's around the age where we stop treating people as children who can be expected to be confused about obvious elements of the world ontology, as opposed to subtle ones.

Hand-crafting a concept kernel that compresses down to that order of magnitude doesn't seem like an impossible task, but it's possible there's something very wrong with the memory accumulation rate estimate.
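The accumulation arithmetic can be sketched in a few lines. This is only a sanity check of the stated assumptions (2 bits per second, 16 waking hours per day, 13 years); carried through exactly, those inputs give closer to 70 MB, the same order of magnitude as the figure above, and the exact total is sensitive to the assumed rate and hours.

```python
# Back-of-the-envelope check of the memory accumulation estimate.
# Assumptions (from the comment above): 2 bits/s of long-term memory,
# 16 waking hours per day, accumulated over 13 years.
# Treat the result as an order-of-magnitude figure, not a precise one.

BITS_PER_SECOND = 2
WAKING_HOURS_PER_DAY = 16
YEARS = 13

seconds_awake = YEARS * 365 * WAKING_HOURS_PER_DAY * 3600
total_bits = BITS_PER_SECOND * seconds_awake
total_megabytes = total_bits / 8 / 1e6

print(f"accumulated long-term memory: ~{total_megabytes:.0f} MB")
```

Tens of megabytes either way, so the conclusion about hand-crafting a concept kernel of that order of magnitude is unaffected.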

Comment author: Strange7 16 April 2010 05:58:32PM 1 point

The 32 megabytes in question should be added to any pre-programmed instincts.

Comment author: Risto_Saarelma 16 April 2010 07:09:49PM 2 points

Yes. Those would go into the complexity bound for the human genome, since the genome is pretty much the only information source for human ontogeny. The original post suggested 25 MB, which apparently turned out to be too low. If you make the very conservative assumption that all of the human genome is important, I think the limit is somewhere around 500 MB. The genes needed to build and run the brain are going to be just a fraction of the total genome, but I don't know enough biology to guess at the size of the fraction.

Anyway, it looks like even in the worst case the code for an AGI that can do interesting stuff out of the box could fit on a single CD-ROM.
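The genome-side bound can be sanity-checked the same way. This sketch assumes roughly 3.2 billion base pairs encoded at 2 bits per base (four possible bases); the resulting ~800 MB is the raw upper bound, and compressing the genome's highly repetitive sequence is presumably how one gets down to the ~500 MB ballpark mentioned above.

```python
# Rough upper bound on the information content of the human genome.
# Assumptions: ~3.2e9 base pairs, 2 bits per base (A/C/G/T).
# Repetitive sequence compresses well, so the effective content is lower
# than this raw figure.

BASE_PAIRS = 3.2e9
BITS_PER_BASE = 2

raw_megabytes = BASE_PAIRS * BITS_PER_BASE / 8 / 1e6

print(f"raw genome encoding: ~{raw_megabytes:.0f} MB")
```

Either figure is within a CD-ROM's ~700 MB capacity once even mild compression is applied, consistent with the worst-case claim above.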

Comment author: NancyLebovitz 16 April 2010 03:32:32PM 2 points

Also, by that time, people might have become sufficiently more complex that hand-coding all the concepts 21st-century people can hold would be an interesting historical project, but not enough for a useful FAI.