All of Mervinkel's Comments + Replies

I had been thinking hard about this topic for the past few weeks before stumbling across this post and your comment, and I appreciate both.

Ultimately, I agree with your conclusion. What's more, I think this (becoming a pure reproductive consequentialist) is also inevitable from an evolutionary standpoint.

It's already clear that purely hedonistic societies ("shards of desire" et al.) are in massive decline. The collective West, with a US total fertility rate of roughly 1.6 births per woman, is going to die off quickly.

But the gap will be filled, and it will...

Mervinkel

Thanks for a thoughtful article. Intuitively, LLMs resemble our own internal verbalization. We often turn to verbalizing to work through problems when we can't hold our train of thought by other means. But verbalization clearly covers only a subset of problems; many others can't be tackled this way. Instead, we lean on intuition, a far more abstract and less understood process that generates outcomes from even more compressed knowledge. It feels like the same is true for LLMs: without fully understanding intuition and the kinds of data transformations and compressions it involves, reaching true AGI could be impossible.

eggsyntax
That's an interesting view, but it's not clear to me what the evidence for it is. Is it based on introspection into your own thinking? A recent paper reviewing the evidence on language and thought may shed at least a bit of light on the topic: 'Language is primarily a tool for communication rather than thought'.