
passive_fist comments on Global catastrophic risks connected with nuclear weapons and nuclear energy - Less Wrong

Post author: turchin, 21 December 2015 12:20PM




Comment author: passive_fist 28 December 2015 02:30:17AM
  1. Strictly speaking, the only major barrier to the development of fission weapons (once the possibility of prompt criticality was realized) was enrichment. Even a simple gun-type bomb design suffices if you want to build a fission weapon, but you have to get the nuclear material first, and that's where the bulk of the scientific and technological effort in the Manhattan Project was focused. Even today, enrichment is still the only major barrier to aspiring nuclear states/groups. Once it was identified that this was the problem that needed to be solved, the scientists quickly came up with a plan for tackling it. But there is no plan or pathway to pure fusion weapons. As far as we know, they could be physically impossible. I'm not discounting the possibility of some incredibly secret pure fusion weapon, but if such a weapon existed it would be exceedingly silly to spend billions of dollars on facilities like NIF or the Z machine - and keep in mind that these projects were funded by, and do research for, the government agencies responsible for nuclear weapons development. What's the point? (Also, cold fusion does not exist.)

  2. Wrong. A country with a sizeable stockpile of nuclear ICBMs can target and kill anyone it wishes. It's not restricted to just bombing the other superpower.

Comment author: turchin 28 December 2015 10:31:29AM
  1. This is a map of possible risks, not a map of claims. All it says is that if pure fusion (or other simple) nukes are ever created, it will make the proliferation situation much more difficult. For example, laser enrichment is much simpler than traditional methods and has been recognised as a proliferation risk. We can't say how, but technological progress is making nukes cheaper and simpler, and that is a problem. https://en.wikipedia.org/wiki/Separation_of_isotopes_by_laser_excitation
  2. It can kill anyone, but not everyone. The world has around 5 million villages and small towns, and you would need at least one bomb for each to kill everyone. At the peak of the Cold War the world had fewer than 100,000 bombs. If you really wanted to kill everyone, you would have to try something special, like an artificial nuclear winter or summer.
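A quick back-of-the-envelope check of the arithmetic above. Both figures are the commenter's own rough estimates (5 million settlements, under 100,000 warheads at the Cold War peak), not verified data:

```python
# Sanity check: could a peak Cold War arsenal hit every settlement?
# Both numbers are the commenter's estimates, used here only for the ratio.
settlements = 5_000_000      # villages and small towns worldwide (estimate)
peak_stockpile = 100_000     # warheads at the Cold War peak (upper bound)

# Even at one warhead per settlement, the arsenal falls short
# by roughly a factor of fifty.
shortfall = settlements / peak_stockpile
print(shortfall)  # -> 50.0
```

So on these numbers, direct bombing covers at most about 2% of settlements, which is the core of the "anyone but not everyone" point.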
Comment author: passive_fist 28 December 2015 11:52:51AM *

I realize that it's a map of risks; I'm just saying the possibilities don't fall into even remotely comparable levels of risk. "Death from nuclear ICBM" is quite imaginable and possible. Not only that, there was a time when it almost seemed imminent and inevitable, and it could easily become that way again. Whereas "death from cold fusion" is of essentially zero meaningful concern.

Maybe it would be useful if you could attach some kind of crude probabilities to your estimates. I can fill a pdf with items like "death from massive leprechaun attack" but it wouldn't be a very useful guide.

Comment author: turchin 28 December 2015 12:37:25PM *

While I do not appreciate the wording "death from cold fusion" when we are speaking about proliferation risks connected with new technologies, I have already added a rough probability estimation to the map and painted the boxes in one of three colors. Instead of probability, though, I used "importance of risks", which is more clearly connected with what we should do to prevent them.

"Importance (or urgency) of risks is subjectively estimated based on their probability, timing, magnitude of expected effect, and the scientific basis for the risk. Importance here means how much attention and effort we should put into controlling the risk.

Green – just keep it in mind, do nothing
Yellow – pay attention, make reasonable efforts to prevent
Red – pay immediate attention to prevent"

The pdf is here: http://immortality-roadmap.com/nukerisk2.pdf

In it, only two risks are red: nuclear war and nuclear-biological war.

The risk of large-scale proliferation connected with new technologies is yellow, and the risk of Jupiter detonation is green.