Thanks for a thoughtful article. Intuitively, LLMs seem similar to our own internal verbalization. We often turn to verbalizing to handle problems we can't keep track of by other means. But verbalization clearly covers only a subset of problems; many others can't be tackled this way. For those, we lean on intuition, a far more abstract and less understood process that generates outcomes from even more compressed knowledge. It feels like the same is true for LLMs. Without fully understanding intuition and the kinds of data transformation and compression it involves, reaching true AGI may be impossible.
I had been thinking hard about this topic for the past few weeks before stumbling across this post and your comment, and I appreciate both.
Ultimately, I agree with your conclusion. What’s more, I think this (becoming a pure reproductive consequentialist) is also inevitable from an evolutionary standpoint.
It’s already clear that purely hedonistic societies (“shards of desire” et al.) are in massive decline. The collective West, with a US fertility rate of something like 1.6 children per woman, is going to die off quickly.
But the gap will be filled, and it will...