I was reading a post on the economy from the political statistics blog FiveThirtyEight, and the following graph shocked me:
This, according to Nate Silver, is a log-scaled graph of the GDP of the United States since the Civil War, adjusted for inflation. What amazes me is how nearly perfect the linear approximation is (representing exponential growth of approximately 3.5% per year), despite all the technological and geopolitical changes of the past 134 years. (The Great Depression knocks it off pace, but WWII and the postwar recovery set it neatly back on track.) I would have expected a much more meandering rate of growth.
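(A small sketch of why the straight line corresponds to a fixed growth rate: a quantity compounding at a constant 3.5% per year is exactly linear in log space, with slope log(1.035) per year. The starting value of 100 and the 135-year span are arbitrary choices for illustration.)

```python
import math

# A quantity growing at a constant 3.5% per year, starting from an
# arbitrary base of 100.
years = range(135)
gdp = [100 * 1.035 ** t for t in years]

# On a log scale the series is a straight line: successive differences
# of log(gdp) are all the same constant, namely log(1.035).
log_gdp = [math.log(g) for g in gdp]
slopes = [log_gdp[t + 1] - log_gdp[t] for t in years[:-1]]

print(min(slopes), max(slopes), math.log(1.035))
```

So a wandering growth rate would show up directly as a changing slope, which is exactly what these graphs mostly fail to exhibit.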
It reminds me of Moore's Law, which would be amazing enough as a predicted exponential lower bound of technological advance, but is staggering as an actual approximation:
I don't want to sound like Kurzweil here, but something demands explanation: is there a good reason why processes like these, with so many changing exogenous variables, seem to keep right on a particular pace of exponential growth, as opposed to wandering between phases with different exponents?
EDIT: As I commented below, not all graphs of exponentially growing quantities exhibit this phenomenon; there still seems to be something rather special about these two graphs.
Why would they care about it that much? If they're spending more money than it's worth to keep up the pace, or if they're intentionally slowing down their R&D, they're leaving a vast gap for a competitor to kick them out of the market.
You have successfully predicted the past ;-) With the Pentium 4, Intel deliberately went for the highest possible clock rate, because that looked good in marketing, at the expense of actual performance. This gave their main competitor, AMD, a massive opening for their then-current processor lines, which did better in performance per clock, per watt, and per dollar. Intel only recovered by going back to the P-III and developing the P-M and Core line from there.