I sometimes think of "Limits to Growth" and "Unfriendly AI" as rivals for the trophy of worst existential risk, with the folks who are most concerned about one of the risks viewing the folks concerned about the other with deep suspicion. ("Huh... your pet disaster will never happen; we should be worrying about mine instead!").
It's certainly useful to identify the common ground between the camps. Both scenarios involve taking existing exponential trends very seriously. In both cases, society as a whole has not been taking the trends seriously (or is even actively engaged in dismissal or ridicule of the future concerns). In both cases, the lack of preparedness makes good outcomes unlikely.
Can you elaborate on what type of societal collapse was (conditionally) predicted? What kinds of things were they saying would happen in this scenario?
The marginal cost of reducing pollution grows astronomically as the fraction that you want to reduce it by tends toward zero,
Minor point: I had to re-read this a few times to get what you meant. The use of the preposition "by" made me think you meant that the marginal cost of reducing pollution by a small amount would be higher than the cost of reducing it by a large amount.
The marginal cost of reducing pollution grows astronomically as pollution tends to zero
In the book, does this cost refer to the cost of a technological solution to pollution, or does it also/instead refer to the cost of society coordinating to reduce pollution?
There's not a clear divide between the two things (e.g. it could be that what's needed is a technology that facilitates coordination). I don't remember the exact wording in the book.
I ask because as the costs of pollution rise, so might the benefits.
People don't pollute randomly; they pollute as a necessary by-product of other activities. There aren't, as far as I know, any benefits to pollution per se, but there are benefits to the sorts of activities that produce pollution. The costs of pollution may rise, but that doesn't imply that those activities will ever stop being worth their cost (except possibly on the margin, due to the externality).
Ok, I understand what you're saying now. The matter under discussion is the negative externalities of pollution (perhaps to future generations). I don't know whether there's enough uranium or plutonium for this to be a realistic hypothetical, but one could imagine a world in which nuclear reactor waste accumulated to such a degree that it substantially reduced the amount of habitable land.
Thermodynamic inefficiency will always produce at least some heat pollution, and I think it's safe to predict that achieving 99.99% efficiency is at least an order of magnitude more expensive than achieving 99.9% efficiency.
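One way to make both points concrete: suppose, purely as an illustrative assumption (this specific cost model isn't from the book or the thread above), that the cost of abatement scales inversely with the residual inefficiency. Then the cost diverges as the residual tends to zero, and each additional "nine" of efficiency costs an order of magnitude more:

```python
# Illustrative cost model (an assumption for this sketch, not a claim
# from the book): the cost of achieving efficiency eta scales like
# 1 / (1 - eta), i.e. inversely with the residual inefficiency.

def relative_cost(eta):
    return 1.0 / (1.0 - eta)

for eta in (0.9, 0.99, 0.999, 0.9999):
    print(f"{eta:.2%} efficient -> relative cost {relative_cost(eta):,.0f}")

# 90.00% efficient -> relative cost 10
# 99.00% efficient -> relative cost 100
# 99.90% efficient -> relative cost 1,000
# 99.99% efficient -> relative cost 10,000
```

Under this assumed model, the same arithmetic applies to pollution reduction: pushing the residual toward zero is exactly what drives the cost without bound.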
In 1972, Donella Meadows, Dennis Meadows, Jørgen Randers, and William Behrens III published The Limits to Growth, a book about the consequences of unchecked population growth and economic growth. The book was very popular at the time, selling 12 million copies. As part of my work on the project "Can we know what to do about AI?", I did a preliminary investigation of the claims in the book, whether they've been borne out, and what the book's impact has been.
The book uses the framework of "system dynamics," which was pioneered by Jay Forrester. Paul Krugman criticized Forrester as having been unaware of prior overlapping work by economists, so it's possible that the ideas in The Limits to Growth are less novel than they appear. I haven't investigated the extent to which this is the case, but may do so later. This blog post focuses on The Limits to Growth.
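For readers unfamiliar with the framework: system dynamics models a system as stocks (accumulated quantities) and flows (rates of change), then numerically integrates the resulting feedback loops. Here is a minimal sketch of that style in Python; the one-stock model and its rates are my own illustrative choices, not anything from the book or from Forrester's work:

```python
# A minimal stock-and-flow simulation in the system dynamics style.
# One stock (population) with two flows (births and deaths), integrated
# with a simple Euler step. The rates are illustrative assumptions.

def simulate(population=1.0, birth_rate=0.03, death_rate=0.01,
             dt=1.0, years=100):
    history = [population]
    for _ in range(years):
        births = birth_rate * population   # inflow to the stock
        deaths = death_rate * population   # outflow from the stock
        population += (births - deaths) * dt
        history.append(population)
    return history

# A constant 2% net rate compounds: after 100 years the stock is
# about 7.2x its initial value, i.e. it grows exponentially.
print(f"{simulate()[-1]:.1f}")  # 7.2
```

The models in the book are far richer (many interacting stocks, nonlinear feedbacks), but this is the basic mechanical idea.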
My initial impression of the book, based on what others had written about it, was very different from the impression I formed upon reading it. The book has been misrepresented (whether denotatively or connotatively) both by critics and by sympathizers. In the first section below, I discuss how the book has been misrepresented, and in the second section I discuss what the book says.
Misrepresentations of The Limits to Growth
In his 2008 paper A comparison of The Limits to Growth with 30 years of reality, Graham Turner describes how the book has been misrepresented by critics:
An article in The Nation seems to suggest that the book's reputation has been tarnished by association with others who made outlandish claims:
What The Limits to Growth actually says
Based on my reading of the book:
They argue that
(a) Even if we had unlimited resources, exponential growth of resource use would still result in exponential growth of pollution, unless pollution is curbed.
(b) The marginal cost of reducing pollution grows astronomically as pollution tends to zero, so that it might not be possible to quell the increase in pollution coming from an exponential increase in resource usage.
(c) A sufficiently large increase in pollution could lead to societal collapse.
so that there's a need to limit resource usage.
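To make the shape of this argument concrete, here is a toy calculation (my own illustrative numbers and cost model, not the book's World3 runs). With resource use, and hence gross pollution, growing at a constant percentage rate, holding net pollution at its initial level requires abating an ever-larger fraction of it, and under a divergent cost model the abatement cost grows without bound:

```python
# Toy version of the (a)-(c) argument; the 3% growth rate and the
# cost model f / (1 - f) are illustrative assumptions, not the book's.

growth_rate = 0.03  # annual growth of resource use (assumption)

for year in (0, 50, 100, 150):
    gross = (1 + growth_rate) ** year  # gross pollution, in year-0 units
    f = 1 - 1 / gross                  # fraction to abate to hold net pollution at 1
    cost = f / (1 - f)                 # assumed abatement cost, diverges as f -> 1
    print(f"year {year}: gross {gross:.1f}x, abate {f:.1%}, cost {cost:.1f}")

# year 0: gross 1.0x, abate 0.0%, cost 0.0
# year 50: gross 4.4x, abate 77.2%, cost 3.4
# year 100: gross 19.2x, abate 94.8%, cost 18.2
# year 150: gross 84.3x, abate 98.8%, cost 83.3
```

The abatement fraction creeps toward 100% while the cost climbs roughly in proportion to gross pollution itself; under these assumptions, no fixed abatement budget keeps up indefinitely.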
Policy recommendations
The authors discuss potential ways to mitigate the problems that they identify, but only at a theoretical level. They recognize the complexity of the implementation details, and don't make policy recommendations.
There may have been examples of people trying to implement policies based on the book and surrounding literature. I plan on investigating this further. I'd appreciate any references from readers.
Subsequent work by the authors
I have not reviewed this work. It may not be highly relevant to the project at hand, because not enough time may have passed to check the veracity of its substantive predictions. I may investigate further.
Graham Turner's 2008 paper A comparison of The Limits to Growth with 30 years of reality argues that the trajectory of civilization since the publication of The Limits to Growth is in line with a scenario in the book that leads to societal collapse. The significance of this is unclear, because past historical trends need not predict future historical trends.
The book has relevance to addressing AI risk that extends beyond the project at hand.
Like MIRI, the authors were interested in coping with a threat that the world had never faced before (on such a large scale), and one that could arrive with little notice. The authors make the point that people tend to think in terms of linear growth and decay rather than exponential growth and decay, and that unchecked exponential growth of resource usage could result in a very sudden problem (by human standards) of great scarcity and/or pollution.
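A standard way to see the suddenness (my own illustrative numbers, not the authors'): under steady exponential growth, each doubling takes the same amount of time, so the last doubling consumes as much headroom as all of history before it:

```python
import math

# At a steady 4% annual growth rate (an illustrative assumption),
# a quantity doubles roughly every 18 years.
growth_rate = 0.04
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"doubling time: {doubling_time:.1f} years")          # 17.7

# Even a resource ceiling 1000x current usage is reached in under
# two centuries...
years_to_ceiling = math.log(1000) / math.log(1 + growth_rate)
print(f"years to a 1000x ceiling: {years_to_ceiling:.0f}")  # 176

# ...and the last half of all that headroom disappears in a single
# doubling time, which is why the end feels abrupt.
print(f"half ceiling to ceiling: {doubling_time:.1f} years")  # 17.7
```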
As such, it may be fruitful to take a closer look at the authors' remarks on these points.