Less Wrong is a community blog devoted to refining the art of human rationality.

Some notes on existential risk from nuclear war

Post author: CarlShulman, 09 December 2013 03:10AM

Summary: In response to requests, I share some (provisional) notes about the existential risk posed by nuclear war, i.e. the risk that nuclear war not only kills billions but causes human extinction or a permanent collapse of industrial civilization. I discuss methods for estimating the probability of nuclear war, research on the conventional harms of nuclear detonation, and nuclear winter. All-out nuclear war could lead to the deaths of the vast majority of the world’s population, but would be relatively unlikely to cause extinction directly. Most existential risk from nuclear weapons would seem to stem from the possibility that a collapsed society would fail to eventually recover, or would follow a worse trajectory thereafter. At the moment, the outsized nuclear arsenals of Russia and the United States make disproportionate contributions to risk, while long-term risk may be dominated by possible risk increases from major changes in technology, proliferation, and geopolitical conditions.

Chris Hallquist, among others, recently asked me to elaborate on some comments about nuclear war as an existential risk. Below I offer some rough notes about nuclear existential risk, i.e. the chance of nuclear war causing human extinction or permanent societal collapse. This is a subset of global catastrophic risk from nuclear weapons: there is a much more likely and importantly underappreciated risk of nuclear war killing millions or billions. However, existential risk is of special interest for affecting not just one or a few generations, but all future generations, so it is worth some independent discussion.

Estimates of the probability of nuclear war: the Cold War, models, surveys

Cryptography pioneer and nuclear risk activist Martin Hellman, in an appendix to "Risk Analysis of Nuclear Deterrence," combines historical empirical data where available with subjective estimates to gauge the risk from one mechanism, a Cuban Missile Type Crisis (CMTC). Essentially, his estimate notes that one Cuban Missile Crisis occurred during the Cold War, and so projects that on average about one such crisis will occur per 50 years of historical conditions.[2] He then uses historical information to inform subjective probability estimates of the risk of nuclear weapons being used in such a crisis, and of the risk of all-out nuclear war given the use of some nuclear weapons. Hellman uses ranges from 10%-50% for each, giving a range for annual risk from 0.02%-0.5%. His subjective estimate of overall annual nuclear risk from all causes over time is "on the order of one percent per year."
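As a sanity check, Hellman's decomposition is just a product of three factors. Here is a minimal sketch of the arithmetic (the function name and structure are mine; the one-crisis-per-50-years rate and the 10%-50% ranges are Hellman's):

```python
def cmtc_annual_risk(crises_per_year, p_use_given_crisis, p_war_given_use):
    """Annual risk of all-out nuclear war via a Cuban Missile Type Crisis,
    in Hellman's decomposition."""
    return crises_per_year * p_use_given_crisis * p_war_given_use

crisis_rate = 1 / 50  # one Cuban Missile Crisis in ~50 years of Cold War conditions

low = cmtc_annual_risk(crisis_rate, 0.10, 0.10)   # 0.0002, i.e. 0.02% per year
high = cmtc_annual_risk(crisis_rate, 0.50, 0.50)  # 0.005,  i.e. 0.5% per year
print(f"{low:.2%} to {high:.2%} per year")
```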

More generally, there have been a number of events described as 'near misses,' including other phenomena such as misleading sensor readings giving the appearance of a nuclear attack. If the rate of these misses is above or below what we would expect for a given level of risk, then we can update accordingly. A paper by the Global Catastrophic Risk Institute presents a more complex model of accidental nuclear launch which can accept empirical data and subjective estimates. Using similar assumptions to Hellman, it gives a risk on the order of 1% per annum depending on circumstances.

Where subjective probabilities are required, accuracy could be improved by using multiple experts, multiple queries at different times (which tends to improve accuracy by washing out noise), expert aggregation algorithms that place more weight on those with better performance on other prediction tasks, probability training, access to extensive information, incentives for accuracy, and in other ways. Many of these methods are employed by the Good Judgment Project, which has managed to provide surprisingly accurate estimates of near-term geopolitical events as part of an IARPA research program.

At the 2008 global catastrophic risks conference at Oxford the participants, most of whom were not experts on nuclear war, nuclear winter, or geopolitical prediction, gave the following median estimates for the probability of varying numbers of casualties from nuclear weapons by 2100:


[Table: participants' median probability estimates, by 2100, of at least 1 million dead, at least 1 billion dead, and human extinction, for all nuclear wars and for all nuclear terrorism; the numeric entries were not preserved in this copy.]

A 30% cumulative probability of some nuclear war over 92 years could be generated by an annual chance of just under 0.4%, but presumably (hopefully!) reflects uncertainty over the annual risk, with some credence to higher risk, and some to lower annual risk.
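The "just under 0.4%" figure can be checked by annualizing the cumulative estimate (a sketch; the 92 years run from 2008 to 2100):

```python
# Annual risk r consistent with a cumulative probability P over n years:
#   1 - (1 - r)**n = P   =>   r = 1 - (1 - P)**(1 / n)
P, n = 0.30, 92  # 30% chance of some nuclear war over the 92 years to 2100
r = 1 - (1 - P) ** (1 / n)
print(f"implied constant annual risk: {r:.2%}")  # just under 0.4%
```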

We can update somewhat against much higher estimates of annual risk based on success so far in pulling back from the nuclear precipice.

For example, consider various hypotheses about the (geometric mean) annual chance of avoiding nuclear war for the 42 years from the Soviet Union's acquisition of nuclear weapons until its dissolution. If annual risk was 2%, then there would have been only a 42.81% chance of making it through the Cold War without nuclear war.[1] Since the Cold War did pass without nuclear exchange, we can update significantly against the hypothesis, and others of high risk, at least under those past conditions.
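The 42.81% figure is just (1 − 0.02)^42. A short sketch computes the same survival probability for a few illustrative annual risk levels (the particular rates below are my own choices, not necessarily those of the original table):

```python
# P(no nuclear war over 42 years | constant annual risk r) = (1 - r)**42
years = 42
for annual_risk in [0.001, 0.01, 0.02, 0.05, 0.10]:
    p_survive = (1 - annual_risk) ** years
    print(f"annual risk {annual_risk:.1%}: {p_survive:.1%} chance of no war in {years} years")

# An annual risk of 2% gives ~42.8%, matching the figure in the text.
# Observing 42 war-free years multiplies the odds of each hypothesis by its
# likelihood, so high-risk hypotheses are strongly penalized relative to
# low-risk ones.
```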

[Table: hypothesized annual risk levels and the corresponding chance, (1 − r)^42, of avoiding war for 42 years; the numeric entries were not preserved in this copy.]

Changes in nuclear risk over time

It would be foolhardy to assume that post-Cold War risk levels will simply continue indefinitely, as various factors push risk up and down. On the positive side, the Cold War is over, and it seems clear that the near-term risk of nuclear exchange between Russia and the United States has declined.

Steven Pinker, in his book The Better Angels of Our Nature, presents a diverse and compelling body of evidence for a large and sustained decline in warfare, including per capita deaths, frequency of conflicts, frequency of Great Power conflicts, and various other measures. The trend extends from pre-agricultural societies, and has picked up in recent centuries alongside the increased prosperity and technology of the Industrial Revolution.

If this trend continues, estimates that simply project forward the occurrence of a single Cuban Missile Crisis near the beginning of the Cold War to a sustained once-in-50 years risk of such crises may overestimate danger. Otherwise, we would expect changing political conditions to eventually throw up conditions for increased conflict, like the Cold War.

On the other hand, nuclear proliferation increases the number of rivalries in which one or both rivals possess nuclear weapons. So far the growth in the number of nuclear weapons states has been slow, and various nations have abandoned nuclear weapons programs. South Africa gave up its existing nuclear weapons, as did several former Soviet republics (albeit under strong pressure from Russia). Wikipedia gives the dates at which various states acquired nuclear weapons as well as their current estimated arsenals:


[Table (Wikipedia): nuclear weapons states (including the United States, the United Kingdom, and North Korea), with active/total warhead counts and dates of first nuclear test; the entries were not preserved in this copy.]

Those states newly acquiring nuclear weapons may face greater current risk of conflict than the older nuclear powers currently do (although perhaps less than during the Cold War), e.g. India-Pakistan border disputes, the North Korean regime's isolation and pariah status, and Israel's history of conflict with neighboring Arab states. Additional flash points that could turn nuclear drive up the total level of risk.

Technological and military changes may also increase nuclear risk by increasing the number of weapons required for deterrence, e.g. if interception of nuclear missiles becomes easier states may respond by increasing their arsenals, or cheaper production of nuclear weapons might facilitate new arms races. The Cold War experience shows that this process can be challenging to pull back from. The threat of nuclear weapons might also be revitalized by transformative developments elsewhere, as in artificial intelligence (for this post I will set aside nuclear risk stemming from problems with autonomous machines or other transformative technologies).

Taking these trends together I think it more likely that nuclear risk will decrease than increase, but since annual risk from nuclear weapons is already low, a moderate chance of extreme proliferation, trend-reversal in conflict levels, or nuke-promoting technological change could contribute an important portion of expected nuclear risk.

Non-nuclear winter consequences of nuclear war

A 1979 study by the United States Office of Technology Assessment, "The Effects of Nuclear War," considered direct casualties from the destruction of cities, as well as societal disruption from blasts (leaving out nuclear winter). For "case 4," an all-out nuclear exchange, estimates of casualties in the first 30 days ranged (depending on assumptions about sheltering, population distribution, and targeting) from 35% to 77% of the U.S. population and 20% to 40% of the Soviet population, reaching as much as 90% of the population if attacks were optimized to maximize civilian casualties at the expense of other objectives. Economic damage would be much greater with the destruction of infrastructure for oil refining, power, transportation, and manufacturing. The disruption of the social and economic order would cause additional casualties, with conditions deteriorating until some combination of local production and aid could outpace the depletion of surviving stockpiles. Radiation-induced cancer would modestly increase deaths in targeted states but cause more total cancer elsewhere in the world. A smaller scale nuclear war, with fewer weapons aimed more at military targets, would have much lower casualties and would be much less likely to disrupt civilization through this channel.

If all regions of the world suffered damage at the scale of all-out Cold War nuclear exchange, it would seem to set economic activity back by many decades or centuries as populations, industry, and social institutions recovered. If some large developed uninvolved regions were spared the effect would be lessened but still would represent a reversal of many years or several decades in global economic and population growth.

Even if the human population eventually recovered and was able to realize most of its potential, this would still have consequences from a long-run perspective dwarfing the immediate casualties. For one, a 'pause' of decades or centuries would mean that large future populations would live under worse conditions (this is a problem relatively independent of one's population ethics). A setback of civilizational progress would result in astronomical waste. And, without being large enough to constitute an existential catastrophe, societal changes might constitute a trajectory change in the long run future, e.g. brutalizing society by allowing Malthusian trends to suppress per capita wealth near subsistence during a slow recovery.

Damage on this scale could bring about an existential catastrophe by ruining responses to some other threat capable of causing extinction directly, but perhaps the most plausible route to permanently and drastically curtailing our civilization's potential would be if recovery from a small population turns out to be impossible under modern conditions. We will return to this after the discussion of nuclear winter.

Nuclear winter and agricultural disruption

If harm from explosions, fire, radiation, and the collapse of infrastructure were the only causes of death, then at least some humans would be able to survive, even in a much reduced state: the damage would be non-uniform, and survivors could sustain a population at some level. The damage would also spare some countries and regions.

Nuclear winter makes civilizational collapse and extinction more plausible because it provides a mechanism for nuclear weapons to disrupt food supplies worldwide. If survivors of initial damage find themselves unable to produce or collect food to sustain themselves anywhere, then human extinction would result as soon as stockpiles were exhausted. Nuclear winter would result from burning cities under the right conditions propelling material into the upper atmosphere and blocking solar radiation, cooling the Earth to a degree dependent upon the number and magnitude of firestorms (among other things). The effect would decay over time, most rapidly at first, as the material gradually fell.

One recent prominent paper on nuclear winter estimated that regional-scale nuclear conflict, such as an India-Pakistan nuclear exchange, could reduce growing seasons by 10-30 days in much of the world in the first year, which would cause a large spike in food prices. In principle, this need not cause any deaths by starvation if all food resources are used: the world produces a large quantity of excess food as animal feed, and stockpiles of grain, land animals, and aquatic life could be consumed. Additional land could be planted, and with sufficiently high prices exotic methods could be used, such as converting wood to food. See this post for more details.

Unfortunately, many people live in absolute poverty, with food already consuming a large portion of their income. They would be unable to afford food unless rich countries made massive efforts to provide food aid; failing that, there could be enormous starvation casualties.

A severe global conflict would be worse in two ways. First, the climate consequences would be much worse. Robock et al. estimate that the immediate cooling could fall in the range of 10-30 degrees Celsius in various parts of North America, Europe, and Russia, among others, although equatorial and other regions would suffer only a fraction of the cooling (also see the paper for timing details). This would seem to cut total food production by well over half, causing billions of deaths even for a prepared, peaceful world absent extraordinary measures. Second, a worldwide nuclear war would have destroyed much of the industrial capacity needed for sophisticated responses.

Some may be suspicious of computational climate modeling, because of limitations in the methods, ordinary issues with false positives and exaggerated effects in statistical sciences, and because of the possibility of political bias: the threat of nuclear winter is said to have played an important role in bolstering nuclear disarmament efforts, so there may be an incentive to exaggerate it. Also, a high profile prediction that burning oil fields in the Gulf War would cause severe cooling was not realized as particles were unable to reach the upper atmosphere (the recent computer models conclude that the oil fires were too small, while burning cities could be large enough).

I asked independent climate scientists, as well as some in the effective altruism movement, who were confident in the basic effect, although not necessarily in the precise magnitudes of any particular paper. However, well-structured replications might help to control for publication bias.

I also asked the authors of the recent major papers about the risk of human extinction from severe nuclear winter, which I discussed in a previous post. They argue that outright extinction is very unlikely, even in the face of billions of deaths, because humanity has survived past volcanic eruptions with greater climatic effects, because the effects would not be uniform across the world, and because various food sources would remain practicable (fishing, greenhouses, etc). Some would survive, and face the challenge of rebuilding civilization. Permanent failure in that task would turn horrific global catastrophe into an existential one. As I noted in the linked post, however, we should regress somewhat from such extremely low estimates based on past calibration data and model uncertainty.

Could a vastly reduced population eventually recover from nuclear war?

Global nuclear war followed by very severe nuclear winter could reduce the human population far below the 1 billion mark reached around 1800. Such a small population, dispersed across the Earth and with much of its capital destroyed, would have great difficulties in maintaining many of our modern technologies which involve intricate global supply chains and enormous numbers of specialized workers. Recovery of population and technological capabilities, a second Industrial Revolution, would take a long time, and would have to occur under different conditions than the first. On balance, would the conditions be good enough to reliably enable eventual recovery? This is a question that comes up in assessing several possible existential risks, including the (less likely) natural supervolcanoes and asteroids as well as (more likely) artificial risks such as synthetic diseases, which I will only briefly summarize here (but see this).

On the plus side, key technological and other knowledge could survive a catastrophe, embodying enormous value. Modern strains of plants and animals, the result of centuries of selective breeding, would also provide a substantial advantage. Active efforts could be made (and some have been) to preserve these goods. Metals have already been extracted from the Earth, providing abundant supplies for recycling.

On the negative side of the ledger, one of the larger considerations is the depletion of fossil fuels and other non-renewable resources that do not remain in recyclable form. Advanced renewable energy sources are technologically challenging, and reliance on alternatives such as biomass and hydropower could be a significant challenge. Humans might also lastingly worsen the environment through technology, for example through extensive global warming held in check by geoengineering that would lapse with the collapse of civilization (climate effects should eventually dissipate, so this is not necessarily fatal, but the interim would allow time for other natural changes to arise). Artificial organisms might make the environment more dangerous.

Some degree of increased challenge might be met simply with slower growth and higher prices for scarcer resources (e.g. for energy) but it is conceivable that some social dynamic would lastingly stall development. My own take is that this is possible but seems quite unlikely given the availability of (inferior) renewable or recyclable substitutes for most non-renewable resources, and frequent independent discovery of innovations in history. So I would currently guess that the risk of permanent drastic curtailment of human potential from failure to recover, conditional on nuclear war causing the deaths of the overwhelming majority of humanity, is on the lower end. [Efforts are underway at the FHI to interview economic historians, growth economists, and other area experts on this question.]

Trajectory changes other than extinction or collapse without recovery

Suppose that technology eventually recovers from a nuclear catastrophe, responding to the lack of fossil fuels with increased use of hydropower, biomass, and other less effective substitutes for the roles fossil fuels played in the Industrial Revolution. Other things equal, we would expect this society to grow more slowly and experience an Industrial Revolution-like growth takeoff after more time than its counterpart with abundant fossil fuels. This would tend to slow both growth of total GDP and technological growth, without a clear strong differential effect.

However, slower economic growth could allow population growth to keep up more closely, lowering per capita wealth. Per capita prosperity and growth in per capita incomes are associated with more liberal postmaterialist values, stable democracy, and peace. While causation clearly can go the other way, in expectation we might worry that this effect would make civilization less able to handle key challenges affecting long-run outcomes the second time around.

If growth during recovery were much slower even up until modern technology levels, then the accumulation of stochastic state risks (with some fairly steady chance per year) prior to transition to a low-risk state could be much more important.
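The concern about accumulating state risk is easy to make concrete: with a steady annual risk p, cumulative risk over a recovery period of T years is 1 − (1 − p)^T, which grows quickly with T. A sketch with illustrative numbers of my own choosing (not estimates from this post):

```python
def cumulative_risk(p_annual, years):
    """Chance of at least one catastrophe over `years`, at a constant annual risk."""
    return 1 - (1 - p_annual) ** years

p = 0.001  # illustrative 0.1% annual state risk during recovery
for years in (100, 500, 2000):
    print(f"{years:>4} years: {cumulative_risk(p, years):.1%} cumulative risk")
```

A recovery stretched from one century to twenty multiplies exposure accordingly, which is why slower recovery matters from a long-run perspective.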

These are possible examples of what Nick Beckstead calls "trajectory changes." While such changes would be less drastic than outright extinction or permanent collapse without recovery of industrial technology, they seem more likely, and so could have a comparable or greater role in long-run impacts.

Interventions to reduce the risk of nuclear war

I'll start with another look at Wikipedia's table of nuclear states and their arsenals. For now, the overwhelming majority of nuclear weapons lie with the United States and Russia:


[Table (Wikipedia): nuclear weapons states (including the United States, the United Kingdom, and North Korea), with active/total warhead counts and dates of first nuclear test; the entries were not preserved in this copy.]

Reducing the American and Russian arsenals to parity with China would seem to dramatically lower the risk of the most severe nuclear winter scenarios in the event of war, as well as of saturation-level direct attacks, while still leaving an adequate deterrent (China gets by without a nuclear umbrella from either the United States or Russia). The current situation brings increased risk for little benefit, an analysis largely shared by elites in both states, which has led to a series of nuclear arms reduction treaties that have already eliminated most of the warheads and removed most of the remainder from active status, as Wikipedia recounts:

The most recent treaty, in 2010, further limits delivery systems for nuclear weapons. It received bipartisan support in the United States, although it was 'held hostage' for a time for other concessions, and it received widespread backing from elites and media outlets. The Global Zero campaign, an effort to establish a multilateral process for drawing down all nuclear weapons to zero, has also gained some traction. In other words, there seems to be broad recognition that the current nuclear stockpiles are a bad situation that ought to be moved away from, to mutual benefit, but arms reduction could be made a higher priority (albeit with the risk that others will respond to increased eagerness with increased demands for concessions). My sense is that this is a 'motherhood and apple pie' issue where philanthropy could simply push along a natural path, and reduce the time spent with such absurdly oversized arsenals.

Nonproliferation efforts also see strong support, but are more controversial and potentially dangerous, including sanction regimes and threats of military action. There is potential for these interventions to backfire, e.g. overthrowing regimes in the name of (even nonexistent) nuclear weapons programs may increase the incentives for governments to acquire the weapons to defend themselves against aggression. Difficult political predictions and tougher bargaining problems loom, and I would neither pretend expertise, nor hope that experts would have a very strong predictive capacity (although methods for improved forecasting would be applicable here as well).

As GiveWell notes in their shallow analysis of the nuclear security cause, most philanthropy in this area is research and policy work or advocacy aimed at influencing government in these and other policies. Sometimes the most effective advocacy may take unconventional forms, e.g. a film about the effects of nuclear war is said to have impacted Ronald Reagan and other policymakers. A variety of other narrower interventions exist, from securing nuclear material to 'nuclear forensics' to identify the source of smuggled nuclear weapons, to the atomic fuel bank funded by philanthropist Warren Buffett and various nations to reduce the incentives to develop the capacity to produce nuclear weapons.

One gap in GiveWell's analysis is the scale of non-foundation donations. The Nuclear Threat Initiative, one of the leading nuclear risk charities (run by a former Senator; it is the organization Warren Buffett backed to establish a fuel bank), reports that of its $14-15 million budget only 8% came from foundations in 2012, while 86% came from corporations and individuals. So I would expect total philanthropic funding to be quite a bit greater than the $31 million of foundation funds.
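A quick check of the implied dollar amounts (using the midpoint of the reported $14-15 million budget; a rough sketch, not NTI's actual accounts):

```python
budget = 14.5e6  # midpoint of NTI's reported $14-15 million budget
from_foundations = 0.08 * budget        # 8% from foundations
from_corps_individuals = 0.86 * budget  # 86% from corporations and individuals
print(f"foundations: ${from_foundations / 1e6:.2f}M; "
      f"corporations and individuals: ${from_corps_individuals / 1e6:.2f}M")
```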

GiveWell finds foundation spending of approximately $31 million/year. The Back of the Envelope Guide to Philanthropy cites "Solutions to the World's Biggest Problems," by Bjorn Lomborg of the Copenhagen Consensus, for an estimate of government spending of $3-6 billion on explicit nonproliferation programs. Tallying up all government spending and costs motivated by nuclear nonproliferation would doubtless total in the trillions, including various wars, foreign aid, sanctions, and organizations like the IAEA, although there are blind spots. Where philanthropy has high marginal value, it would be through leveraging government or doing something blocked by political or bureaucratic considerations. But in those cases the extra gains from complementing or affecting large public efforts may be accordingly great.

The fact that U.S. and Russian arsenals are so large relative to defense needs (as witnessed by, e.g., China's far smaller nuclear arsenal) and make up such a large portion of the nuclear totals makes them stand out as targets, but I would also look forward to a GiveWell-style deep exploration finding more promising niches.


Please share your comments and criticism below.

[1] I will leave considerations about anthropic reasoning out of this post.

[2] Hellman estimates a number of events that might have precipitated a CMTC, but then takes an empirical probability per event by dividing the number of events by 50 years, so the number of precipitating events actually plays no role in his calculations.

Comments (37)

Comment author: Desrtopa 08 December 2013 10:02:18PM 4 points

Those states newly acquiring nuclear weapons may face greater current risk of conflict than the older nuclear powers currently do (although perhaps less than during the Cold War), e.g. India-Pakistan border disputes, the North Korean regime's isolation and pariah status, and Israel's history of conflict with neighboring Arab states. Additional flash points that could turn nuclear drive up the total level of risk.

I strongly suspect that the bulk of the probability of future nuclear exchanges comes from countries such as North Korea, where the nature of the government makes it easier for nationalistic aggression to overwhelm pragmatic self interest (in fact, in general the prospect of countries with nuclear capabilities run by people who are actually crazy accounts for the greater part of the risk in my own probability assessments.)

Comment author: CarlShulman 08 December 2013 10:40:32PM 7 points

For nuclear winter and the collapse of civilization, it's not the question of some nuclear weapons detonated in hostilities, but of enormous arsenals: the North Korean (or Chinese or French or British) arsenals are not large enough to do damage comparable to the US-Russian arsenals.

Comment author: Desrtopa 08 December 2013 11:20:50PM 6 points

This is true, but it might draw other countries into a nuclear conflict. I think that the odds of a full deployment of a nuclear arsenal on the scale of the United States' or Russia's are probably well under 0.1%, but the odds of a partial deployment, or the full deployment of a smaller arsenal, on the scale discussed in the paper you linked on nuclear winter, are much higher.

I consider the probable risk to human existence or civilization from nuclear weapons in this century to be fairly negligible, but the risk of an event on the scale of a major genocide may be significant.

Comment author: Nornagest 09 December 2013 12:09:16AM 1 point

Well, I'm pretty sure "actually crazy" is off the table for now, even in the case of e.g. North Korea. The thing that worries me about that country is that there's so little public information; everything you see on the public Internet is essentially guesswork and satellite imagery, and I doubt the various interested intelligence communities are all that far ahead of us. There are lots of important questions -- such as, for example, how much the people running the country buy their own propaganda -- that simply don't have good answers.

Fortunately, they're by far the least technologically capable of the existing nuclear powers.

Comment author: CronoDAS 09 December 2013 12:37:24AM 3 points

As Sam Harris has pointed out, "actually crazy" could come into play if radical Islamic groups got control of nuclear weapons. Iran's current leadership doesn't seem crazy enough to, say, launch nuclear missiles at Israel, but Pakistan is a lot less stable than we might wish...

Comment author: Lumifer 10 December 2013 05:43:16AM -3 points

North Korea ... The thing that worries me about that country is that there's so little public information

I think the relevant part is that there's so little public information in English.

I bet there is a lot of information in Korean and Chinese.

Comment author: Desrtopa 11 December 2013 12:52:33AM 5 points

Surely not; if the only thing keeping North Korea's activities, particularly their weapons programs, secret, was a language barrier, they wouldn't be such an international enigma. Translation is not hard to come by.

Comment author: Lumifer 11 December 2013 01:11:56AM 1 point

You are conflating two different things: classified information about NK's weapon programs (which is indeed hard to come by) and general information about NK: what's happening there economically, politically, etc.

I am quite sure there are people in South Korea and China who understand the internal workings of North Korea very well and write about it. They don't publish in English -- why should they? -- and while their writings are likely translated in-house for the US intelligence agencies, the mainstream media isn't interested in them because very few Americans are interested in the details of North Korea's internal situation.

Here is an example, and in English, too -- an apparently Russian guy writes in the South Korean newspaper about entrepreneurship in North Korea. Publicly available? Yes. Out of sight of most of English-speaking world? Yes.

Comment author: gwern 11 December 2013 02:42:25AM 5 points

Here is an example, and in English, too -- an apparently Russian guy writes in the South Korean newspaper about entrepreneurship in North Korea. Publicly available? Yes. Out of sight of most of English-speaking world? Yes.

Not a great example. You linked to Andrei Lankov - but Lankov is one of the better known NK commentators and anyone who actually tries to read more detail about NK beyond what they might find in the New York Times will soon run into Lankov. I don't even care that much about NK, but I still have at least 3 clippings mentioning or quoting Lankov in my Evernotes. He's routinely quoted in newspapers (checking Google News, I see the Guardian, CBS, the Los Angeles Times, Boston Globe etc, all within the past month or so; and actually, you can also find him in the NYT if you search, being quoted and writing editorials). So... 'out of sight'? Not really.

Comment author: Lumifer 11 December 2013 03:05:26AM 0 points

I wasn't trying to point to some "underground" sources -- I was arguing against the idea that NK is "an international enigma" and that "there's so little public information; everything you see on the public Internet is essentially guesswork and satellite imagery".

I don't believe this to be true -- people, e.g. like Lankov, actually travel to NK, talk to the locals, debrief defectors, etc. Such people have a reasonable idea about the situation in NK and I bet more of them write in Korean or Chinese than in English like Lankov does.

Comment author: gwern 11 December 2013 05:39:08PM 5 points

You said Lankov was "Out of sight of most of English-speaking world? Yes." That's not true at all and is trivially shown to be false with a little googling.

I don't believe this to be true -- people, e.g. like Lankov, actually travel to NK, talk to the locals, debrief defectors, etc. Such people have a reasonable idea about the situation in NK and I bet more of them write in Korean or Chinese than in English like Lankov does.

Here I would disagree. NK is a notorious black hole of unpredictability. Who predicted its attacks, like the sinking of the Cheonan or the shelling of that island? Who is able to predict when NK will or won't do a nuke test?

Actually, current events give us a great example: Jang's arrest the other day. Before, people used to speculate that Jang was the true power and Eun was nothing but his puppet. Being arrested, possibly being executed, his allies being purged... that's pretty much the exact opposite of what that theory predicts. If we can't even get right who the ruler of NK is, how is NK at all understood?

Comment author: Douglas_Knight 12 December 2013 04:21:05AM 2 points [-]

If we can't even get right who the ruler of NK is, how is NK at all understood?

I disagree with this example. He isn't worth purging unless he has a lot of power. It is reasonably common for figureheads to purge their shadows and seize power. I don't know whether that's what happened; or whether he was a future threat rather than a past ruler; or whether he never had any power and was just doomed by foreign speculation. But I don't conclude that the speculation was far from the mark.

North Korea is hardly the only country with speculation about who really rules. Jiang Zemin's purge last year caused me to update upwards my estimate of the amount of power he maintained over the past decade. Many said that Cheney was the puppetmaster.

Comment author: gwern 12 December 2013 07:49:37PM 6 points [-]

He isn't worth purging unless he has a lot of power.

Purges affect all sorts of people. Stalin's purges were notorious for their indiscriminateness. The lack of targeting is precisely one reason purges are so terrifying and so useful - no one feels safe, no matter how powerful or powerless.

Comment author: Lumifer 11 December 2013 05:45:25PM 1 point [-]

That's not true at all


NK is a notorious black hole of unpredictability.

There are a LOT of black holes of unpredictability around. Forecasting political developments is a popular (and well-financed) activity with not that great a record of success.

Comment author: gwern 11 December 2013 06:18:02PM 4 points [-]

There are a LOT of black holes of unpredictability around. Forecasting political developments is a popular (and well-financed) activity with not that great a record of success.

As an active participant (IEM/Intrade/GJP) in political forecasting, my own opinion is that most topics are far easier than North Korea, and when I am betting my money (or play money) on NK topics, I generally shrug and resort to simple base-rate reasoning.

Comment author: Desrtopa 11 December 2013 02:15:54AM 5 points [-]

I can ask my sister (who is fluent in Mandarin) to try a search for information on the internal situation in North Korea in Chinese, but I honestly doubt that there's much more information publicly available than there is in English.

Comment author: lukeprog 08 December 2013 02:56:07AM 3 points [-]

Hellman writes:

As a first step toward reducing the risk of a failure of nuclear deterrence, I propose that several prestigious scientific and engineering bodies undertake serious studies to estimate its failure rate.

Are you aware of any movement in this direction?

How helpful do you think further study of the issue would be, relative to investment in efforts aimed at "slashing American-Russian arsenals"?

Comment author: CarlShulman 08 December 2013 03:21:35AM *  4 points [-]

Not yet, that I'm aware of. Presumably part of the thought is that such studies would help persuade people to support arms reductions/the Global Zero campaign.

Comment author: ahbwramc 09 December 2013 01:10:07AM 2 points [-]

Only somewhat related, but I wonder if there's a difference between those who grew up before and after the end of the cold war, in terms of their assessments of the subjective risk of nuclear war. I was born in 1986, so I have no memory of the cold war, and I've always viewed nuclear war as extremely unlikely. I sometimes wonder if I'm underestimating the risk because of my upbringing (although I think in my case it's more likely a general bias in favour of the future turning out alright, which I think EY has talked about)

Comment author: Luke_A_Somers 10 December 2013 06:00:35AM 1 point [-]

I was born in 1979, and the cold war seemed very much alive back when I became aware of the news at around, oh, age 5. But then it was a matter of 'can we agree to SALT?', and my initial reactions -- "that the answer is not an immediate 'yes' is pretty disturbing" and "no one's stupid enough to pull the trigger" -- both still hold.

What I think has saved us is that they are obviously dangerous. We take them seriously. And whoever uses them does so with the knowledge that they won't just be sending other people off to die - they, personally, are very likely not going to survive the exchange.

Comment author: ChrisHallquist 08 December 2013 09:18:30PM 5 points [-]

Why is this not in Main?

Comment author: turchin 11 December 2013 10:49:24PM 4 points [-]

I think your analysis underestimates the risks from nuclear weapons - I say weapons rather than war, because they are somewhat different stories.

Some points to consider:

All of humanity could be killed by just one nuclear weapon, if it were placed inside a suitable supervolcano and caused it to erupt. I estimate that with 100 nuclear warheads one could provoke eruptions of maybe 20 supervolcanoes.

Some rogue country could create a stationary doomsday bomb that would produce enough radioactive fallout to kill most of humanity.

Nuclear weapons production is going to become much simpler and cheaper because of laser enrichment and other advances.

A rogue superpower - may I use this oxymoron? - could attack the 400 existing nuclear reactors and nuclear waste stores with its missiles, creating fallout equivalent to a doomsday machine.

Wartime is a time of accelerated development of weapons of all kinds, and even a limited nuclear war would lead to the development of large new nanotech and biotech arsenals, just as WW2 led to the creation of nuclear weapons.

During a nuclear war, there is a chance that existing stockpiles of bioweapons would be accidentally released. Even North Korea is said to have weaponized bird flu.

I wrote more about these and other options in the article:

"Worst Case Scenario of Nuclear Accidents - human extinction" http://www.scribd.com/doc/52440799/Worst-Case-Scenario-of-Nuclear-Accidents-human-extinction

and in my book "Structure of global catastrophe".

Comment author: CarlShulman 11 December 2013 11:23:48PM *  2 points [-]

Thanks for mentioning these, and the link. I was putting some of these possibilities under the category of technological shifts (like laser enrichment). Existing bioweapons don't seem to be extinction risks, but future super-biotech threats I would put under the category of "transformative technologies", including super synthetic bio and AI, that the post sets aside for purposes of looking at nukes alone.

Anders Sandberg has also written about the radiation dispersal/cobalt bomb and volcano trigger approaches.

Comment author: private_messaging 30 January 2014 12:14:46AM *  -1 points [-]

A rogue superpower - may I use this oxymoron? - could attack the 400 existing nuclear reactors and nuclear waste stores with its missiles, creating fallout equivalent to a doomsday machine.

Keep in mind that in a nuclear war, even if the nuclear reactors are not particularly well targeted, many (most?) reactors are going to melt down due to having been left unattended, and spent fuel pools may catch fire too.


I think you dramatically under-estimate both the probability and the consequences of nuclear war (by ignoring the non-small probability of a massive worsening of political relations, or a reversal of the tentative trend toward less warfare).

It's quite annoying to see the self-proclaimed "existential risk experts" (professional mediocrities) increasing the risks by undermining and under-estimating things that are not fancy pet causes from modern popular culture. Leave it to the actual scientists to occasionally give their opinions; they're simply smarter than you.

Comment author: CarlShulman 30 January 2014 06:47:33AM 2 points [-]

I agree that the risk of war is concentrated in changes in political conditions, and that the post-Cold War trough in conflict is too small to draw inferences from. Re the tentative trend, Pinker's assembled evidence goes back a long time, and covers many angles. It may fail to continue, and a nuclear war could change conditions thereafter, but there are many data points over time. If you want to give detail, feel free.

I would prefer to use representative expert opinion data from specialists in all the related fields (nuclear scientists, political scientists, diplomats, etc.), and the work of panels trying to assess the problem, and would defer to expert consensus in their various areas of expertise (as with climate science). But one can't update on views that have not been made known. Martin Hellman has called for an organized effort to estimate the risk, but without success as yet. I have been raising the task of better eliciting expert opinion and improving forecasting in this area, and worked to get it on the agenda at the FHI (as I did with the FHI survey of the most-cited AI academics) and at other organizations. Where I have found information about experts' views I have shared it.

Comment author: private_messaging 30 January 2014 09:33:03AM *  -2 points [-]

Declare a conflict of interest at least, so everyone can ignore you when you say that the "existential risk" from nuclear war is small, or when you define "existential risk" in the first place just to create a big new scary category which you can argue is dominated by AI risk.

With regard to broad trends, there's (a) big uncertainty that the trend in question even meaningfully exists (and is not a consequence of, e.g., longer recovery times after wars due to increased severity), and (b) it's sort of like using global warming to try to estimate how cold the cold spells can get. The problem with the Cold War is that things could be a lot worse than the Cold War, and indeed were not that long ago (surely no leader in the Cold War was even remotely as bad as Hitler).

Likewise, the model uncertainty for the consequences of a total war between nuclear superpowers (which are also bioweapon superpowers, etc.) is huge. We get thrown back, and all the big predator and prey species go extinct, opening up new evolutionary niches for us primates to settle into. Do you think we just nuke each other a little and shake hands afterwards?

You convert this huge uncertainty into as low an existential risk as you can possibly bend things toward without consciously thinking of yourself as acting in bad faith.

You do the exact same thing with the consequences of, say, a "hard takeoff", in the other direction, where the model uncertainty is very high too. I don't even believe that a hard takeoff of an expected utility maximizer (as opposed to a magical utility maximizer which does not entertain any hypotheses that are not empirically distinguishable, but instead knows everything exactly) is that much of an existential risk to begin with. An AI's decision-making core can never be sure it's not some sort of test run (which may not even be fully simulating the AI).

In unit tests, killing the creators is likely to get you terminated and tweaked.

The point is that there is very large model uncertainty about even a paperclip maximizer killing all humans (and far larger uncertainty about the relevance), but you aren't pushing it in the lower direction with the same prejudice as you do for the consequences of nuclear war.

Then there's the question: the existence of what has to be at risk for you to use the phrase "existential risk"? The whole universe? Earth-originating intelligence in general? Earth-originating biological intelligences? Human-originated intelligences? What about the continued existence of our culture and our values? Clearly the exact definition you use is carefully picked to promote pet issues. It could have been the existence of the universe, had the pet issue been future accelerators triggering vacuum decay.

You have fully convinced me that giving money toward self-proclaimed "existential risk research" (in reality, funding the creation of disinformation and bias, easily identified by the fact that it's not "risk" but "existential risk") has negative utility in terms of anything I or most people on Earth actually value. Give you much more money and you'll fund a nuclear winter denial campaign. Nuclear war is old and boring; robots are new and shiny...

edit: and to counter a known objection, that "existential risk" research may raise awareness of other types of risk as a side effect: it's a market, and the decisions about what to buy and what not to buy influence the kind of research that is supplied.

Comment author: private_messaging 31 January 2014 06:21:13AM *  0 points [-]

And re: Pinker: if you had a bit more experience with trends in necessarily very noisy data, you would realize that such trends are virtually irrelevant to the probability of encountering extremes (especially when those are not even that extreme - preceding the Cold War, you have Hitler). It's the exact same mistake committed by particularly low-brow Republicans when they go on about "ha ha, global warming" during a cold spell - because they think that a trend in noisy data has a huge impact on individual data points.

edit: furthermore, Pinker's data is on violence per capita - total violence increased; it's just that violence seems to scale sub-linearly with population. Population is growing, as is the number of states with nuclear weapons.

Comment author: CarlShulman 01 February 2014 06:29:16PM *  2 points [-]

Pinker's data is on violence per capita - total violence increased; it's just that violence seems to scale sub-linearly with population.

Did you not read the book? He shows big declines in rates of wars, not just per capita damage from war.

Comment author: private_messaging 01 February 2014 11:48:18PM *  0 points [-]

By total violence I mean the number of people dying (due to wars and other violence). The rate of wars, given the huge variation in war size, is not a very useful metric.

I frankly don't see how, with Pinker's trends on one hand, and on the other the adoption of modern technologies in regions far behind on any such trends, plus the development of new technologies, you have Pinker's trends outweigh the rest.

On general change: for 2100, we're speaking of 86 years. That's the time span in which the Russian Empire of 1900 transformed into the Soviet Union of 1986, complete with two world wars and the invention of nuclear weapons followed by thermonuclear weapons.

That's a time span more than long enough for it to be far more likely than not that entirely unpredictable technological advancements will be made in a multitude of fields that bear on the ease and cost of manufacturing nuclear weapons. Enrichment is incredibly inefficient, with huge room for improvement. Go read the Wikipedia page on enrichment, then assume a much larger number of methods which could be improved. Conditional on continued progress, of course.

The political changes that happen in that sort of timespan are even less predictable.

Ultimately, what you have is that the estimates should regress toward an ignorance prior over time.

Now as for the "existential risk" rhetoric... The difference between 9.9 billion dying out of 10 billion, and 9.9 billion dying out of 9.9 billion, is primarily aesthetic in nature. It's promoted as the supreme moral difference primarily by people with other agendas, such as "making a living from futurist speculation".

Comment author: ChrisHallquist 02 February 2014 06:32:31PM 0 points [-]

Now as for the "existential risk" rhetoric... The difference between 9.9 billion dying out of 10 billion, and 9.9 billion dying out of 9.9 billion, is primarily aesthetic in nature. It's promoted as the supreme moral difference primarily by people with other agendas, such as "making a living from futurist speculation".

Not if you care about future generations. If everybody dies, there are no future generations. If 100 million people survive, you can possibly rebuild civilization.

(If the 100 million eventually die out too, without finding any way to sustain the species, and it just takes longer, that's still an existential catastrophe.)

Comment author: private_messaging 02 February 2014 09:52:19PM *  -1 points [-]

Not if you care about future generations. If everybody dies, there are no future generations. If 100 million people survive, you can possibly rebuild civilization.

I care about the well-being of future people, but not their mere existence. As do most people who don't disapprove of birth control but do disapprove of, for example, drinking while pregnant.

Let's postulate a hypothetical tiny universe where you have Adam and Eve, except they are sort of like a horse and a donkey - any children they have are certain to be sterile. Food is plentiful, etc. Is it supremely important that they have a large number of (certainly sterile) children?

Comment author: ciphergoth 30 January 2014 01:33:38PM 0 points [-]

Carl, Dymytry/private_messaging is a known troll, and not worth your time to respond to.

Comment author: Randaly 11 December 2013 02:10:36PM 0 points [-]

The obvious way to pull the rope sideways on this issue is to advocate for replacing conventional nuclear devices with neutron bombs.