falenas108 comments on Size of the smallest recursively self-improving AI? - Less Wrong Discussion

4 Post author: alexflint 30 March 2011 11:31PM

Comment author: falenas108 31 March 2011 12:50:52AM 3 points

In terms of natural selection, couldn't Homo sapiens be considered a FOOM?

Our first period of FOOMing would be due to social competition, which resulted in those with higher intelligence reproducing more.

Our current style of FOOMing comes from scientific knowledge, and with it we will soon surpass nature (one could even argue that we already have).

If we view nature as our "programmer", we could even be called self-recursive, as with each passing generation our knowledge as a species increases.

Comment author: alexflint 31 March 2011 07:48:34AM 1 point

Yeah, analogies with evolutionary events are interesting. In the first example it's natural selection doing the optimizing, which latches onto intelligence when that trait happens to be under selection pressure. This could certainly accelerate the growth of intelligence, but the big-brained parents are not actually using their brains to design their even-bigger-brained babies; that remains the purview of evolution no matter how big the brains get.

I agree the second example is closer to a FOOM: some scientific insights actually help us do better science. I'm thinking of the cognitive sciences in particular, rather than the more mundane case of building discoveries on discoveries: in the latter case the discoveries aren't really feeding back into the optimization process; rather, it's human reasoning playing that role no matter how many discoveries you add.

The really interesting part of FOOM is when the intelligence being produced is the optimization process, and I think we really have no prior analogy for this.

Comment author: atucker 31 March 2011 10:35:04AM 1 point

If we view nature as our "programmer", we could even be called self-recursive, as with each passing generation our knowledge as a species increases.

Kind of, but kind of not. I think self-recursing human intelligence would be parents modifying their babies to make them smarter.

The really interesting part of FOOM is when the intelligence being produced is the optimization process, and I think we really have no prior analogy for this.

Humans rapidly got smarter, but we were optimized by evolution. Computers got faster, but were optimized by humans.

When an optimization process improves itself, it makes itself even better at optimizing.

I think that's a pretty decent definition of FOOM: "When an optimization process optimizes itself, and rapidly becomes more powerful than anything else seen before it."
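The contrast drawn above — a system improved by a fixed outside optimizer versus a system whose own capability drives its improvement — can be sketched as a toy numerical model. This is my own illustration, not anything from the thread; "capability" here is just an abstract number standing in for optimization power, and the update rules are assumptions chosen for simplicity.

```python
# Toy model: externally-optimized growth vs. self-optimizing growth.
# "capability" is an abstract stand-in for intelligence / optimization power.

def external_optimization(steps, optimizer_power=1.0):
    # An outside optimizer with fixed power (evolution shaping humans,
    # humans designing computers) adds a constant improvement per
    # generation: the rate of improvement never grows.
    capability = 1.0
    for _ in range(steps):
        capability += optimizer_power
    return capability

def self_optimization(steps, feedback=0.1):
    # The system's own capability IS the optimizer: each improvement
    # raises the rate of further improvement. This is the FOOM loop.
    capability = 1.0
    for _ in range(steps):
        capability += feedback * capability  # rate scales with capability
    return capability

if __name__ == "__main__":
    for steps in (10, 50, 100):
        ext = external_optimization(steps)
        rec = self_optimization(steps)
        print(f"steps={steps:3d}  external={ext:8.1f}  self={rec:14.1f}")
```

The external process grows linearly, while the self-optimizing one compounds (here, exponentially, since each step multiplies capability by 1.1). Under these toy assumptions the gap between the two curves widens without bound, which is one way to picture "rapidly becomes more powerful than anything seen before it."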