In 1972, Donella Meadows, Dennis Meadows, Jørgen Randers, and William Behrens III published The Limits to Growth, a book about the consequences of unchecked population growth and economic growth. The book was very popular at the time, selling 12 million copies. As a part of my work on the project "Can we know what to do about AI?" I did a preliminary investigation of the claims in the book, whether they've been borne out, and what the book's impact has been.
The book uses the framework of "systems dynamics," which was pioneered by Jay Forrester. Paul Krugman criticized Forrester as having been unaware of prior overlapping work by economists. So it's possible that the ideas in The Limits to Growth are less novel than they appear at first glance. I haven't investigated the extent to which this is the case, but may do so later. This blog post focuses on The Limits to Growth.
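To give a flavor of what "systems dynamics" means in practice: the book's conclusions come from the World3 model, which tracks stocks (population, capital, resources, pollution) connected by flows with feedback loops. The sketch below is not World3 and uses made-up parameter values; it's only a minimal illustration of the stock-and-flow style of modeling, stepped forward numerically.

```python
# A minimal stock-and-flow sketch in the style of systems dynamics.
# This is NOT the World3 model; all parameters are hypothetical,
# chosen only to illustrate the overshoot-and-decline pattern.

def simulate(steps=200, dt=1.0):
    resource = 1000.0   # hypothetical nonrenewable resource stock
    capital = 1.0       # hypothetical industrial capital stock
    history = []
    for _ in range(steps):
        extraction = min(resource, 0.5 * capital)   # flow out of resource
        investment = 0.5 * extraction               # flow into capital
        depreciation = 0.02 * capital               # flow out of capital
        resource -= extraction * dt
        capital += (investment - depreciation) * dt
        history.append((resource, capital))
    return history

history = simulate()
# Capital grows exponentially while the resource lasts, then declines
# once extraction is constrained by resource depletion.
```

The point of the exercise is qualitative: with these feedback structures, the system produces growth followed by a relatively abrupt reversal, rather than a smooth leveling-off.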
My initial impression of the book based on what people have written about it was very different from the impression that I formed upon reading the book. The book has been misrepresented (whether denotatively or connotatively) both by critics and by sympathizers. In the first section below, I discuss how the book has been misrepresented, and in the second section I discuss what the book says.
Misrepresentations of The Limits to Growth
In his 2008 paper A comparison of limits to growth with 30 years of reality, Graham Turner describes how the book has been misrepresented by critics.
An article in The Nation seems to suggest that the book's reputation has been tarnished by association with others who made outlandish claims.
What The Limits to Growth actually says
Based on my reading of the book:
They argue that
(a) Even if we had unlimited resources, exponential growth of resource use would still result in exponential growth of pollution, unless pollution is curbed.
(b) The marginal cost of reducing pollution grows astronomically as pollution tends to zero, so that it might not be possible to quell the increase in pollution coming from an exponential increase in resource usage.
(c) A sufficiently large increase in pollution could lead to societal collapse.
so that there's a need to limit resource usage.
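The shape of claims (a) and (b) can be made concrete with a toy model. None of the numbers below come from the book; they're hypothetical, chosen only to illustrate the structure of the argument: emissions track exponentially growing resource use regardless of how plentiful the resource is, and the cost of abatement blows up as the remaining pollution fraction approaches zero.

```python
# Toy illustration of claims (a) and (b); all parameters are hypothetical.
import math

def pollution(year, growth_rate=0.04, emissions_per_unit=1.0):
    """Emissions from resource use growing exponentially -- claim (a)."""
    return emissions_per_unit * math.exp(growth_rate * year)

def abatement_cost(remaining_fraction):
    """Cost of cutting emissions down to a given remaining fraction;
    it grows without bound as that fraction tends to zero -- claim (b)."""
    return 1.0 / remaining_fraction - 1.0

# At 4% annual growth, emissions roughly double every 17-18 years,
# no matter how abundant the underlying resource is.
doubling_factor = pollution(18) / pollution(0)

# Cutting emissions to 1% of baseline costs far more than cutting to 50%.
cost_ratio = abatement_cost(0.01) / abatement_cost(0.5)
```

Under this (assumed) cost curve, abatement can slow pollution growth but cannot cheaply cancel an exponential increase in resource use, which is what motivates the authors' conclusion that resource usage itself must be limited.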
Policy recommendations
The authors discuss potential ways to mitigate the problems that they identify, but only at a theoretical level. They recognize the complexity of the implementation details, and don't make policy recommendations.
There may have been examples of people trying to implement policies based on the book and surrounding literature. I plan on investigating this further. I'd appreciate any references from readers.
Subsequent work by the authors
I have not reviewed this work. It may not have high relevance to the project at hand, because it could be that not enough time has passed for it to be possible to check the veracity of the substantive predictions. I may investigate further.
Graham Turner's 2008 paper A comparison of limits to growth with 30 years of reality argues that the trajectory of civilization since the publication of The Limits to Growth is in line with a scenario in the book that leads to societal collapse. The significance of this is unclear, because past historical trends need not predict future historical trends.
The book has relevance to addressing AI risk that extends beyond the project at hand.
Like MIRI, the authors were interested in coping with a threat that the world had never faced before (on such a large scale), and one that could arrive with little notice. The authors make the point that people tend to think in terms of linear growth and decay rather than exponential growth and decay, and that unchecked exponential growth of resource usage could result in a very sudden problem (by human standards) of great scarcity and/or pollution.
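The suddenness point can be illustrated with a standard doubling example (the numbers here are mine, not the book's): a quantity that doubles every period covers the entire second half of the distance to its limit in the single final period.

```python
# Why exponential growth produces "sudden" problems by human standards:
# a doubling quantity goes from half its limit to its full limit in
# one period. Parameters below are hypothetical.

def periods_until(limit, start=1.0, doubling=2.0):
    """Count periods until a doubling quantity reaches `limit`."""
    x, t = start, 0
    while x < limit:
        x *= doubling
        t += 1
    return t

limit = 1024.0
full = periods_until(limit)        # periods to reach the limit
half = periods_until(limit / 2)    # periods to reach half the limit
# full - half == 1: the last doubling covers half the total distance.
```

Someone reasoning linearly at the "half full" point would expect as much time remaining as had already elapsed; under exponential growth, only one period remains.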
As such, it may be fruitful to take a closer look at the authors' remarks on these points.