Just a reminder that some of the old threats are still around (and hence that AI is not only something that could go hideously badly, but also something that could help us with the other existential risks):
EDIT: as should have been made clear in that post (but wasn't!), the existential risk doesn't come from a full-fledged nuclear winter directly, but from the collapse of human society and the fragmentation of the species into small, vulnerable subgroups, with no guarantee that they'd survive or ever climb back to a technological society.
The Americans currently have about 10,000 nuclear weapons, and so do the Russians. Other arsenals are negligible in this context.
Say the average bomb yields 1 MT. That gives about 8*10^19 J of energy in total, which is a big overestimate, but for the sake of the discussion, would you accept this number first?
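As a sanity check on that figure, here is a minimal sketch of the arithmetic, assuming the roughly 20,000 warheads (10,000 each side) and 1 MT average yield stated above, and the standard convention that 1 megaton of TNT equals about 4.184*10^15 J:

```python
MT_IN_JOULES = 4.184e15  # energy of 1 megaton of TNT in joules (standard convention)

warheads = 20_000        # ~10,000 US + ~10,000 Russian, per the estimate above
avg_yield_mt = 1.0       # assumed average yield per warhead, in megatons

total_energy_j = warheads * avg_yield_mt * MT_IN_JOULES
print(f"{total_energy_j:.2e} J")  # ~8.4e19 J, consistent with the 8*10^19 figure
```

So the 8*10^19 J number is in the right ballpark for the stated assumptions, even before arguing about whether the assumptions themselves overestimate the arsenal.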
I don't accept the idea that the climate effect of the fires is in any way comparable to that of the nukes themselves in the first place, because fire doesn't loft smoke high into the atmosphere. I think it's a very screwed-up assumption. I've only been criticizing the numbers because the point is that people don't think straight about existential risks. Humans don't reason about risks; they evaluate them rapidly with some feeling and a particular, really simple strategy they picked up, then rationalize verbosely.