drnickbone comments on Notes on "The Limits to Growth" and surrounding material - Less Wrong

Post author: JonahSinick 21 July 2013 10:46PM


Comment author: drnickbone 22 July 2013 12:02:34PM 5 points

Like MIRI, the authors were interested in coping with a threat that the world had never faced before (on such a large scale), and one that could arrive with little notice. The authors make the point that people tend to think in terms of linear growth and decay rather than exponential growth and decay.

I sometimes think of "Limits to Growth" and "Unfriendly AI" as rivals for the trophy of worst existential risk, with the folks who are most concerned about one of the risks viewing the folks concerned about the other with deep suspicion. ("Huh... your pet disaster will never happen; we should be worrying about mine instead!").

It's certainly useful to identify the common ground between the camps. Both scenarios involve taking existing exponential trends very seriously. In both cases, society as a whole has not been taking the trends seriously (or is even actively engaged in dismissing or ridiculing the concerns). In both cases, the lack of preparedness makes good outcomes unlikely.
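To make the linear-versus-exponential point concrete, here is a minimal sketch (an illustration added here, not part of the original post or comment) of how a straight-line projection matched to today's growth rate falls progressively behind a trend that compounds at a fixed rate. The starting level and the 5% annual growth rate are arbitrary assumptions chosen only for illustration.

```python
# Minimal sketch: linear vs. exponential extrapolation of the same trend.
# Starting level and growth rate are illustrative assumptions, not data
# from "Limits to Growth" or from the post being discussed.

def linear_projection(x0, slope, t):
    """Extrapolate assuming constant absolute growth per year."""
    return x0 + slope * t

def exponential_projection(x0, rate, t):
    """Extrapolate assuming constant fractional growth per year."""
    return x0 * (1 + rate) ** t

x0 = 100.0          # starting level (arbitrary units)
rate = 0.05         # assumed 5% growth per year
slope = x0 * rate   # linear fit matched to the initial growth rate

for years in (10, 30, 50):
    lin = linear_projection(x0, slope, years)
    exp_ = exponential_projection(x0, rate, years)
    print(f"after {years:2d} years: linear {lin:7.1f}, exponential {exp_:7.1f}")
```

At 10 years the two projections are still close, but by 50 years the compounding trend is roughly three times the linear estimate, which is the kind of gap that makes a linearly-extrapolating planner feel there is far more slack than there actually is.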