This is basically a version of Tyler Cowen's argument (summarized here: https://marginalrevolution.com/marginalrevolution/2025/02/why-i-think-ai-take-off-is-relatively-slow.html) for why AGI won't change things as quickly as we think: once intelligence is no longer a bottleneck, the other constraints become much more binding. Once programmers are no longer the bottleneck, we'll be bound by the friction that exists elsewhere within companies.