falenas108 comments on Size of the smallest recursively self-improving AI? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (47)
In terms of natural selection, couldn't Homo sapiens be considered a FOOM?
Our first period of FOOMing would be due to social competition, which resulted in those with higher intelligence reproducing more.
Our current style of FOOMing comes from scientific knowledge, and with it we will soon surpass nature (one could even argue that we already have).
If we view nature as our "programmer", we could even be called recursively self-improving, as with each passing generation our knowledge as a species increases.
Yeah, analogies with evolutionary events are interesting. In the first example it's natural selection doing the optimizing, which latches onto intelligence when that trait happens to be under selection pressure. This could certainly accelerate the growth of intelligence, but the big-brained parents are not actually using their brains to design their even-bigger-brained babies; that remains the purview of evolution no matter how big the brains get.
I agree the second example is closer to a FOOM: some scientific insights actually help us do better science. I'm thinking of the cognitive sciences in particular, rather than the more mundane case of building discoveries on discoveries: in the latter case the discoveries aren't really feeding back into the optimization process; rather, it's human reasoning playing that role no matter how many discoveries you add.
The really interesting part of FOOM is when the intelligence being produced is the optimization process, and I think we really have no prior analogy for this.
Kind of, but kind of not. I think self-recursing human intelligence would be parents modifying their babies to make them smarter.
Humans rapidly got smarter, but we were optimized by evolution. Computers got faster, but were optimized by humans.
When an optimization process improves itself, it makes itself even better at optimizing.
I think that's a pretty decent definition of FOOM: "When an optimization process optimizes itself, and rapidly becomes more powerful than anything else seen before it."