Last month I was involved in a conversation thread about what the impact of a hypothetical nuclear war would be on existential risk.
There are many potential nuclear war scenarios, each with a different impact on existential risk. It's difficult to know where to start in understanding the long-term consequences of nuclear proliferation.
For concreteness, consider the case of an India-Pakistan nuclear war.
According to Local Nuclear War, Global Suffering by Robock and Toon,
India and Pakistan, long at odds, have more than 50 nuclear warheads apiece; if each country dropped that many bombs on cities and industrial areas, the smoke from fires would stunt agriculture worldwide for 10 years.
[...]
1 billion people worldwide with marginal food supplies today could die of starvation because of ensuing agricultural collapse
Note that this would presumably cause some degree of chaos in the developed world.
I have not yet investigated the credibility of the paper's claims. However,
Suppose that an all-out nuclear war between India and Pakistan were to occur and resulted in climate change that killed 1 billion people. Would the probability of a positive singularity increase or decrease, and why?
This question seems very difficult to answer; maybe altogether too difficult for humans to answer. I welcome responses that raise relevant considerations even in the absence of a good way to weigh those considerations against one another. Please read the linked conversation thread before commenting.
The question is not the right question to ask. Large-scale war, whether nuclear or not and regardless of the countries involved, increases existential risk in all forms. The more resources taken up dealing with such situations, the less is spent on preventing existential risks such as large asteroids, superbugs, and very bad AI. The increased stress on societies will also encourage risk-taking, making it more likely that people will try to develop major new technologies without adequate safeguards. Nanotech and AI both fall into this category. (To some degree this is the worst-case scenario. If technological progress is halted completely, this won't be a problem. The really bad case is where technological research continues but without safeguards.)
The question as phrased also emphasizes climate change over other issues. Such a nuclear war would have many other negative consequences: India is a major economy at this point, and the war would cause large-scale economic problems worldwide.
A somewhat larger-scale problem is that of total societal collapse, or human extinction. Both of these look unlikely in the Pakistan-India case but are worth discussing (although at this point they seem very unlikely for any plausible nuclear war scenario). One serious problem with recovering from societal collapse that is often neglected is resource management. Nick Bostrom has pointed out that, to get to our current tech level, we had to bootstrap up using non-renewable fossil fuels and other non-renewable resources. If the tech level is sufficiently reduced, it isn't obvious that such a bootstrapping could occur again. As more and more resources are consumed, this problem becomes more severe. (This is, in my view, an argument for conservation of fossil fuels that is too often neglected: we need them in reserve in case we need to climb back up the tech ladder again.) But again, this situation doesn't seem that likely.
Overall, nuclear war is one example among many of situations that would increase existential risk across the board. In that regard it isn't that different from a smallish asteroid impact (say 2-3 km) in a major country, a Yellowstone supereruption, or a massive disease outbreak, among other scenarios. Nuclear war probably seems more salient because it involves human intent, much as terrorism is a lot scarier to most people than car crashes.
Agree with most of what you say here.
No, if technological progress is halted completely then we'll never be able to become transhumans. From a certain perspective this is almost as bad as going extinct.