I think it's known simply as extrapolation bias. It's not necessarily about first derivatives; more generally, it's hasty generalization. None of the examples in the article shows a discernible higher-level trend anywhere.
Assuming that no better technology than transistors is ever developed, we might have to show a little creativity. If we don't want to do that, there are always neuromorphic chips.
Ruthless Extrapolation
Article Summary: One of humanity's key adaptations is the ability to see trends, which lets us anticipate and preemptively adapt to future conditions. But this ability has limits: we are very good at spotting first derivatives and terrible at spotting higher-order trends, which leaves us vulnerable when those first-derivative trends unexpectedly change. The example used is energy resources, where our adaptation to continually increasing energy usage leaves us vulnerable if we no longer have access to ever-increasing energy supplies.
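The failure mode the summary describes can be sketched numerically. Below is a hypothetical illustration (all numbers invented, not from the article): resource usage follows a logistic S-curve, which looks exponential early on. Measuring the growth rate from two early points and projecting it forward as a constant, first-derivative trend wildly overshoots once the curve bends.

```python
import math

# Hypothetical resource-usage curve: logistic (S-curve) growth that
# looks exponential in its early phase.  All parameters here are
# illustrative assumptions, not data from the article.
def logistic(t, cap=1000.0, rate=0.1, midpoint=80.0):
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

# "Ruthless extrapolation": estimate the growth factor from two early
# points and project it forward as if it held forever.
t0, t1 = 10, 20
growth = logistic(t1) / logistic(t0)  # growth factor per 10 time units

def extrapolated(t):
    return logistic(t0) * growth ** ((t - t0) / (t1 - t0))

for t in (20, 60, 100, 140):
    print(f"t={t:3d}  actual={logistic(t):8.1f}  "
          f"extrapolated={extrapolated(t):10.1f}")
```

Early on the two curves agree closely, but past the midpoint the extrapolation keeps compounding while the real curve saturates at its cap, so the projection diverges by orders of magnitude.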
I have two questions about the linked article. First, is there an established name for this cognitive bias? The author uses "Ruthless Extrapolation", which I find quite fetching, but the phenomenon seems well known enough to have a name already. Second, which of our assumptions could be described as ruthless extrapolation? It seems to me that many in the Singularity Studies community simply assume that CPU transistor densities will keep increasing indefinitely, which certainly looks like a case of ruthless extrapolation. What would happen to whole-brain emulation if we woke up tomorrow and found out that the most powerful CPU possible had a transistor density only two or four times higher than an Ivy Bridge Core i7?
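A quick back-of-envelope sketch of what that ceiling would mean, assuming the classic Moore's-law cadence of density doubling roughly every two years (an assumption on my part, not a claim from the article):

```python
import math

# Assumption: transistor density doubles every ~2 years (the
# traditional Moore's-law cadence).  If a hard physical ceiling sits
# only 2x-4x above today's chips, count the remaining doublings.
DOUBLING_PERIOD_YEARS = 2.0

for ceiling in (2.0, 4.0):
    doublings = math.log2(ceiling)
    years = doublings * DOUBLING_PERIOD_YEARS
    print(f"{ceiling:.0f}x ceiling -> {doublings:.0f} doubling(s), "
          f"~{years:.0f} more years of density scaling")
```

Under those assumptions, a 2x ceiling is one more doubling (about two years of scaling) and a 4x ceiling is two (about four years), after which any further progress toward whole-brain emulation would have to come from architecture, parallelism, or algorithms rather than denser transistors.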