
Climate change: existential risk?

Post author: katydee 06 May 2011 06:19AM 6 points

What does the community here think when it comes to climate change as a potential existential risk? While strategies for combating climate change are fairly straightforward, the apparent lack of political capital behind meaningful climate reform and legislation suggests that the problem will get substantially worse before it gets better, and the potential consequences of ignoring the issue look quite severe indeed!

Should the rationality/x-risks community be spending more effort on evaluating this idea and exploring potential solutions? It certainly seems like a big problem, and the current trajectory is quite worrisome. On the other hand, the issue is a political minefield and could risk entangling the community in political squabbling, potentially jeopardizing its ability to act on other threats. What do you guys think?

Comments (25)

Comment author: Mitchell_Porter 06 May 2011 06:38:11AM 16 points [-]

An "existential risk" is an extinction risk. Climate change is not an extinction risk - not for the human race, anyway. It would just mean suffering. Pick any horrible historical event you can think of - plague, war, whatever - as lethal as you like - and clearly it wasn't enough to end the species, because the present generations are here. Things can get incredibly bad without actually killing us off. 6 billion people could die and the other 1 billion would continue the struggle amidst the ruins.

I am convinced that in any case, climate change is too slow to matter, when compared to the development of technology. We are going to have our AI/nanotech crisis long before we even have 2 more degrees, let alone 4. If we survive AI/nanotech, then we can solve the problem, and if we don't survive AI/nanotech, then the future is out of our hands anyway.

Comment author: katydee 06 May 2011 06:47:34AM *  10 points [-]

I'm aware of what an existential risk is. While I don't think that climate change is likely to destroy mankind on its own, I consider the potential for runaway climate change to provoke massive instability to be truly worrisome on an existential level, especially in the long run.

I hope that you're right that climate change is too slow to matter, but I also think that hope is not exactly a reliable strategy.

Comment author: rwallace 06 May 2011 12:59:50PM *  5 points [-]

It is the way of extinction that what kills the last individual commonly has nothing to do with the underlying factors that doomed the species. Could climate change by itself kill everyone on earth? No. Could it be a significant contributing factor to a downward spiral that ends in extinction? I hope not, but I don't really know. Leaving aside the purely fictional versions of AI and nanotech to which you refer, could real-life versions of those technologies help us develop sustainable energy sources? Yes. Will they do so fast enough? I hope so, but I don't really know.

As for whether there's anything we here can usefully do about it, I don't think it would be useful for us to get bogged down in the sort of bickering about politics that all too often goes with this kind of territory, but LW does have a good track record of avoiding that; and perhaps it would be useful for us to explore potential solutions.

Comment author: NancyLebovitz 06 May 2011 01:55:19PM 3 points [-]

Any ideas about how capable computer programs would need to be to give significant help to researchers with hypothesis generation and with whether research programs make sense? With seeing whether abstracts match experimental results?

It seems to me that some of this could be done without even having full natural language.

Comment author: rwallace 06 May 2011 04:18:19PM 2 points [-]

As long as we're talking about, as you say, significant help rather than solving the whole problem, about what can be done without having full natural language - then I think this is one of the more promising areas of AI research for the next couple of decades.

I talked a few months ago to somebody who's doing biomedical research - one of the smartest guys I know - asking what AI might be able to do to make his job easier, and his answer was that the one thing likely to be feasible in the near future that would really help would be better text mining, something that could do better than just keyword matching for e.g. flagging papers likely to be relevant to a particular problem.

Comment author: DanielLC 06 May 2011 07:22:07PM 6 points [-]

We don't need help finding sustainable resources. We already have nuclear power. We just need to convince everyone that nuclear isn't bad.

Comment author: [deleted] 08 May 2011 01:10:50PM -3 points [-]

Nuclear's hardly sustainable long-term though. It's a temporary patch that might help for fifty years or so.

Comment author: DanielLC 08 May 2011 06:32:07PM 7 points [-]

At $130/kg, there's enough for 80 years at current consumption (about 10 years if we use it for all our electricity), but if we're willing to use ore with a tenth as much uranium, there's 300 times as much. Also, there's ways of using uranium 238, which is about 140 times as abundant. It's still a temporary patch, in the sense that we can't just keep using it until the sun goes out, but it will last long enough for fusion power to become economically feasible.
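A quick back-of-envelope check of these figures as a Python sketch. The reserve, consumption, and electricity-share numbers are assumptions drawn from commonly cited circa-2011 order-of-magnitude estimates, not authoritative data:

```python
# Rough check of the reserve arithmetic above. All inputs are assumed,
# order-of-magnitude values, not authoritative figures.
reserves_tonnes = 5_400_000   # uranium recoverable at <= $130/kg (assumed)
consumption_tpy = 68_000      # world reactor consumption, tonnes/year (assumed)
nuclear_share = 0.13          # nuclear's share of world electricity (assumed)

years_at_current_use = reserves_tonnes / consumption_tpy
# If nuclear supplied *all* electricity, consumption scales by 1/nuclear_share:
years_if_all_electricity = years_at_current_use * nuclear_share

print(round(years_at_current_use))       # on the order of 80
print(round(years_if_all_electricity))   # on the order of 10
```

The comment's other multipliers (300x from ore a tenth as rich, ~140x from using uranium 238) would scale `years_at_current_use` accordingly.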

Comment author: BenAlbahari 08 May 2011 11:15:36PM 5 points [-]

Also, there's ways of using uranium 238

And thorium.

Comment author: wedrifid 06 May 2011 01:05:36PM 1 point [-]

Could climate change by itself kill everyone on earth? No.

How much climate change are we talking? :)

Comment author: rwallace 06 May 2011 01:14:55PM 2 points [-]

Let's say the amount realistically liable to occur in the next few centuries :) If we have to worry about it on megayear timescales, we'll already have failed.

Comment author: STL 06 May 2011 07:37:04AM 15 points [-]

Trying to predict the future is hazardous, not only because of the conjunction fallacy, but because there are so many factors involved. Even if you're careful to avoid the mistake of thinking that A, B, and C all happening is more likely than just A and C, it's not easy to estimate what will happen when you've got factors A through Z involved, and complicated chains of relationships like "if D and E, but not F, and G is stronger than expected, then H". Misjudging the likelihood of a factor, misunderstanding a relationship, or omitting factors or relationships can make an apparently solid set of predictions completely worthless.

That's not to say that it's impossible, of course. If you restrict yourself to asking what will happen if we push on a physical system, and throw the world's best scientists at the problem for decades with millions of dollars and powerful supercomputers at their disposal, then they can actually model what inputs will cause what outputs with probabilities attached. The existence of anthropogenic global warming is a fact, just as the existence of evolution and atoms is a fact, and it's clear that all else being equal (note 1), basically everyone would prefer for the Earth's climate to be original recipe instead of extra crispy. The problem is, we don't know what the inputs will be, and it's trying to guess what the inputs will be that's especially hazardous. It's a lot harder to model what one Congress will do, let alone many. (Modeling the mind-killer is a headache.)

When faced with this sort of problem, I find it useful to instead think about possible end states, which are typically easier to envision and enumerate, and ask how likely it is we'll end up there, through any path whatsoever. And Stein's Law is usually helpful: "If something cannot go on forever, it will stop."

Anthropogenic global warming can't go on forever, so it'll stop. How will it stop? I can think of several ways - this is a non-exhaustive list:

  • We discover a novel feedback loop, Earth's atmosphere becomes like Venus's, and everyone dies. It's pretty clear that this isn't a possibility, we hope, although it should be considered before being rejected (note 2).

  • We collectively come to our senses, and do all the right things right now to keep the problem from getting any worse, and to fix as much as physically possible of the damage that's already been done. It's also pretty clear that this will not happen.

  • A technological breakthrough substantially solves the problem for us. For example, we solve a bunch of engineering problems, and leapfrog from ITER to cheap and plentiful commercial nuclear fusion in just a couple of decades, without having thrown 100 billion dollars at the problem (as that would be shading into the "come to our senses" scenario). The probability of this one is hard to judge - we get stuck by some problems for a while before eventually solving them - but hope is not really a plan.

  • The nasty consequences of global warming keep getting worse and worse, until advanced civilizations are wrecked back into more primitive states, where they're unable to keep dumping carbon into the atmosphere. It looks to me like this one is unlikely too - advanced nations will be able to cope at significant cost. It's just poor nations that are boned.

  • We run out of coal (note 3), oil, and natural gas to burn. They're finite, so this is guaranteed to happen - the question is whether it happens before anything else. A more precise question is, when will our rates of production stop increasing - combined with inelastic demand, this will cause significant price increases that force us to consider previously more expensive (or ionizing), but non-carbon-emitting, sources of energy. This is the scenario that I judge as most likely. Unfortunately, it looks like the result will be extra crispy at a minimum.

  • Something else - increasing food/water/resource scarcity leads to increasing conflict, and eventually to global thermonuclear war - we know that one is perfectly capable of wrecking technological civilization. Hopefully unlikely (there's that word again).

My conclusion is that because many people are already working on both sides of this issue, this community's time would be better spent elsewhere.

Note 1: The "all else being equal" part is key. The ultimate problem isn't that some people want to seriously modify the Earth's climate in and of itself, or for the lulz, or because they're supervillains. It's because fucking money is at stake, and like Mafia bosses in movies, people want their fucking money and they want it now. This wouldn't even be a problem, except that carbon is an unpriced negative externality.

Note 2: Just as igniting the Earth's atmosphere was considered and rejected before the Trinity test. Note that many popular accounts of how this possibility was considered are completely wrong. The worry was never that a nuclear weapon could ignite a global chemical fire in the atmosphere - it was that it could ignite a global nuclear fire. (Follow Wikipedia's citation.) Fortunately for us, the physics don't work out that way.

Note 3: Fucking coal.

Comment author: Eugine_Nier 07 May 2011 02:19:38AM *  6 points [-]

The nasty consequences of global warming keep getting worse and worse, until advanced civilizations are wrecked back into more primitive states, where they're unable to keep dumping carbon into the atmosphere. It looks to me like this one is unlikely too - advanced nations will be able to cope at significant cost. It's just poor nations that are boned.

One thing I noticed about predictions of nasty consequences of global warming is that they're always about 5 to 10 years from the present, with the date always being updated. See here for a discussion of a recent example.

It all started back in October 2005 when the U.N. flatly stated, “by 2010 the world will need to cope with as many as 50 million people escaping the effects of creeping environmental deterioration.” They forecast “this new category of ‘refugee.’” In 2008 Srgjan Kerim, president of the U.N. General Assembly, upped the doomsday prediction, saying there would be “between 50 million and 200 million environmental migrants by 2010.” Environmental activist Norman Myers, a professor at Oxford University, predicted that climate change could force up to 200 million people to become climate refugees.

The U.N. specifically identified Pacific and Caribbean populations that would be ravaged by climate change. Gavin Atkins, writing for Asiancorrespondent.com, reports “a very cursory look at the first available evidence seems to show that the places identified by the UNEP as most at risk of having climate refugees are not only not losing people, they are actually among the fastest growing regions in the world.” Atkins reports that all of China’s “threatened” cities – Shenzhen, Dongguan, Foshan, Zhuhai, Puning and Jinjiang – are among the fastest growing cities in the world.

Atkins also looks at other endangered locations: the Bahamas, St. Lucia, the Seychelles and the Solomon Islands. None have refugees and all have enjoyed healthy population growth.

This is by no means the only example of a global warming doomsday prediction failing to come true and being quietly forgotten.

Comment author: RobFisher 31 March 2012 10:25:33AM *  2 points [-]

I'm surprised to find statements here such as "the existence of anthropogenic global warming is a fact". I'm new here, haven't read all the sequences, and this may seem obnoxious. But I'm testing my beliefs and willing to change my mind.

Let's start with the article linked to by the OP. It says that 4 degrees of warming is likely and bad. I'll concentrate on likely.

The argument for anthropogenic global warming goes something like this:

  • 1) carbon dioxide levels have increased since pre-industrial times
  • 2) increased carbon dioxide levels in the atmosphere will cause a small amount of warming
  • 3) we have measured a large amount of warming (since, say, 1880)
  • 4) there may be feedback effects that mean that the small amount of warming caused by CO2 could lead to a large amount of warming
  • 5) if we build a computer model of the atmosphere including supposed feedbacks, and tweak it until it predicts past events correctly, then it predicts a large amount of warming in the future

1 and 2 are uncontroversial.

3 is difficult to measure. For example, land-based thermometers are only accurate to, say, 1°C, while we are extracting a signal that varies by tenths of a degree. Or we are using proxies like tree rings that are difficult or impossible to calibrate. The signal we do extract is not linear, to say the least. For example, warming stops at times even as CO2 increases, so we know there are large variations not accounted for by CO2, which makes it hard to determine the influence of CO2 alone. The signal looks very different depending on what timescale one looks at, hence arguments about natural variation, decadal oscillations, the medieval warm period and so on.

The feedback effects in (4) are not well understood, which means that the models in (5) do not necessarily reflect how the real climate system works.

And if a model does not include important parts of the system, then even if it correctly predicts past events, that does not mean it can predict future events. I could build pseudo-random-number generators until I find one that happens to match closely the observed past temperature signal, but it will not predict future temperatures.
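That overfitting point can be illustrated with a toy sketch (all the numbers here are made up for illustration, not climate data): search many seeded random walks for the one that happens to match a synthetic "past" record best, then check that same walk against the "future" continuation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "observed" record: a small trend plus noise (made-up numbers).
past = 0.01 * np.arange(50) + rng.normal(0, 0.1, 50)
future = 0.01 * np.arange(50, 100) + rng.normal(0, 0.1, 50)

def candidate(seed, n):
    """A 'model' that is just a seeded random walk -- no physics in it."""
    g = np.random.default_rng(seed)
    return np.cumsum(g.normal(0, 0.05, n))

# Pick the random walk that happens to match the past record best...
errs = [np.mean((candidate(s, 50) - past) ** 2) for s in range(2000)]
best = int(np.argmin(errs))
in_sample = errs[best]
# ...then extend that same walk and compare it to the future.
out_sample = float(np.mean((candidate(best, 100)[50:] - future) ** 2))

print(in_sample, out_sample)  # the fit collapses out of sample
```

Having fit the past well tells you nothing here, because the "model" contains none of the mechanisms generating the data; its out-of-sample error is far worse than its in-sample error.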

All this does not add up to a high level of certainty that additional CO2 will lead to any particular amount of warming. The sensitivity of climate to doubling of CO2 is not known with any degree of certainty. In short, climate is not as well understood as evolution and atoms.

The OP speaks of "meaningful climate reform and legislation" which means redirecting lots of resources to change the amount of CO2 emitted. Resources that could generate more utility elsewhere.

So what is going wrong? Possibilities:

  • a) I am wrong about 1-5 -- I haven't linked to any sources for them partly because I have built this picture by reading around a lot and filtering everything through whatever untamed cognitive biases I have. I think I will explore this more.
  • b) I am not being rational
  • c) I or other commenters are making poor certainty estimates outside their area of expertise
  • d) other commenters are assuming that the majority is right
  • e) other commenters have discounted climate skepticism after having seen poor climate skeptic arguments without having seen the good ones; or I have made the opposite error
  • f) something else…
Comment author: JohnH 06 May 2011 08:28:15PM *  2 points [-]

run out of coal (note 3), oil, and natural gas to burn. They're finite, so this is guaranteed to happen - the question is whether it happens before anything else

Running out is not likely to happen any time soon at expected usage growth rates. The cost of extracting coal, oil, and natural gas will (possibly) increase over time, but new technologies may depress the price of each (see the current state of natural gas). The estimates for potentially extractable reserves give a figure such that, if ways are found to extract those reserves, oil and coal will remain in continual use (at present growth rates) for over another hundred years.

Further, there are ways of producing oil substitutes that become feasible as oil hits certain price points. As these are implemented at large scale, it is likely that economies of scale and incremental improvements will kick in, so that even though oil will still be produced, most of the ways we currently use oil will be switched over to the new technologies. Oil just happens to be the lowest-cost alternative currently, but as demand increases and costs of production rise, the other alternatives will be used.

Comment author: Desrtopa 06 May 2011 03:53:23PM 9 points [-]

I'm graduating with a major in Environmental Science this week, and my take on it is that realistically, we're probably going to face large scale environmental disaster, which social and legislative action will be considerably too little, too late to prevent, but which sophisticated technological interventions may significantly mitigate, and which will probably constitute a fairly major mass extinction event. The impacts on human society will be significant, but it's not a major existential risk for us, per se.

Comment author: mstevens 06 May 2011 02:03:14PM 6 points [-]

It's something I'm increasingly worried about recently.

Some articles I read recently:

http://www.monbiot.com/2011/05/05/our-crushing-dilemmas/
http://thearchdruidreport.blogspot.com/2011/05/downside-of-dependence.html

that seem to relate.

My basic worries:

  • An environmental crisis looks likely
  • Environmentalism is massively internally contradictory, generally doesn't understand numbers, and doesn't appear to offer any solutions.
  • We look likely to end up with the default "solution" of many people dying.
Comment author: Vladimir_Nesov 06 May 2011 10:05:24PM 5 points [-]

What does the community here think when it comes to climate change as a potential existential risk? [...] the potential consequences of ignoring this issue look to be quite severe indeed!

There is a world of difference between "severe" consequences, even world-changing and civilization-crippling severe, and existential risk. Existential risk means no future, astronomical waste. Even 99% of population dying doesn't come close to that.

Comment author: ciphergoth 06 May 2011 08:03:20AM 5 points [-]

James Hansen argues there's a risk of a "Venus syndrome" in which runaway climate change makes the planet hostile to all life, which certainly puts it in the existential risk column. However, I think this is markedly less likely than UFAI doing us all in, and even if it wasn't, there are so many people working on global warming that the marginal contribution I can make is much less.

Comment author: fburnaby 06 May 2011 04:55:23PM *  2 points [-]

I think climate change, coupled with various war-over-resources scenarios, implies non-negligible existential risk. The most compelling argument I'm aware of for switching away from environmental science (or military tech/strategy/intelligence, since those also bear on the existential risk associated with climate change) and toward FAI research is the marginal utility argument -- FAI research is so much smaller a field, making the relative contribution of one person so much larger (assuming you believe the law of diminishing returns applies to both AI research and climate change).

Comment author: Mercy 09 May 2011 04:28:08PM 1 point [-]

This is a big issue; there's a lot of bright young people doing the equivalent of cleaning oil off seabirds because they feel they gotta do something about environmental risk. On the other hand, there's a lot of greens arguing against nuclear power and for growing tomatoes in greenhouses to save on aviation fuel, so maybe there isn't enough brainpower being dedicated to the topic...

Anyway the flipside of marginal utility here is comparative advantage: what skills do you have that can help with AI research?

Comment author: Goobahman 06 May 2011 06:27:13AM -3 points [-]

I have very little faith in the general population to take this threat seriously. I believe that if salvation comes, it will be from that small minority who always seek to move society forward and cover for the rest of society's mistakes. I'm guessing some type of technological invention or discovery, with a few good powerful people manipulating the rest to embrace it. Even then, though, the ramifications are more far-reaching and complicated than I can get my head around, so while I fear the consequences, I still have very few strong opinions in the area. I will share one, though: the people sticking their heads in the sand are morons.

Comment author: Nic_Smith 08 May 2011 03:59:37AM 1 point [-]

There's a related claim over at PredictionBook:

By 2100 it’s obvious to the average human that (1) Global warming is true and (2) it’s actually a good thing: resulting in a better climate than we have had in recorded history. A stable Wet-Sahara/Warm-Siberia/No Rainforest scenario.

Out of four people that have expressed an opinion, I'm currently the only person there against this claim.

Comment author: sirswindon 17 January 2014 01:16:32AM -1 points [-]

Not to be the only one expressing the opinion that it is more than just a "wild card scenario" -- but I could construct a scenario where the majority of the population of the earth is exterminated by extreme climate change, i.e. global warming.

Comment author: asr 17 January 2014 01:55:35AM 0 points [-]

There's already billions of dollars per year being spent on reducing carbon impacts, lower-CO2 electrical power generation, etc etc.

My guess would be that, if you wanted to reduce the risks of climate change, the right place to put resources is biotechnology. Bioengineered green plants seem like the most promising way to get CO2 out of the atmosphere (or equivalently, to produce carbon-neutral chemical fuels).

Here again, though, there's already a lot of smart people and billion dollar research budgets devoted to relevant problems. So I don't know if the marginal value of more work on the topic is high.