Due to their tremendous power, nuclear weapons were a subject of intense secrecy and taboo in the US government following WW2. After their first uses, President Truman came to consider atomic weapons a terror weapon of last resort and generally avoided discussion of their use.1 As a consequence, top-level decision makers in his administration outside the Atomic Energy Commission (AEC), himself included, remained unaware of the size of the US nuclear stockpile as it grew,2 and relatively few means of achieving arms control were considered at a high level, given Stalin’s resistance to opening up the Soviet Union to inspections.3 Frustrated by failed negotiations and communist geopolitical expansion, the US made its atomic weapons ready for use again in 1947,4 and no arms control talks beyond the propaganda level were attempted for the next ten years.5

The consequences of a lack of internal communication, however, did not end with Truman. Somewhat insulated from Eisenhower’s stance against nuclear first use,6 Curtis LeMay’s plans for Strategic Air Command (SAC) leaned increasingly toward it.7 Growing from a 1,000-weapon air force to an 18,000-weapon triad8 with potentially up to 27,000 megatons of explosive power by 1960,9 the arsenal went far beyond what Eisenhower’s massive retaliation strategy required. If the warheads were equally sized and evenly distributed, an airburst attack with that many weapons would have been capable of scorching more than half of the Soviet wilderness or shattering windows over more than 90% of the USSR, China, and North Korea.10 One might argue that such a degree of force was rational as a hedge for reliability, or to retain significant retaliatory capability after a Soviet first strike, but LeMay’s plans presumed using nearly all the weapons in a preemptive strike,11 since US bombers were vulnerable on open runways. Generally speaking, the nuclear war planners of the 1950s were uncoordinated, did not account for many of the known effects of nuclear weapons, and were given no war objectives to plan toward from which limits could be derived.12 Though Eisenhower learned that SAC’s plans were excessive even for counterforce, he never commissioned a study on the lasting environmental effects of such a massive use of force.13

With secrecy and taboo around nuclear weapons, each party involved in planning their use and policy ended up with a great deal of autonomy, in which they could make mistakes that would be impossible in a higher-feedback environment. With a lack of idea spread between different parts of government, the Strategic Air Command could pursue plans and procurements that were not necessarily efficient or aligned with US political interests. Though analysts at the RAND Corporation had determined that counterforce strategies were problematic,14 SAC did not share its intelligence capabilities and so could feel overly confident in dismissing such analysis.15

Since the US’s open society was likely more vulnerable to Soviet spies than vice versa, it makes sense that the US was concerned about leaks that could give the USSR an advantage or create public pressure forcing the president into non-strategic actions. In general, compartmentalizing information is an effective way to reduce the risk of spies leaking a decisive amount of information, but cutting off information flow at the highest levels of government directly harms the government’s ability to make decisions on the basis of the classified information it possesses. If high-level planners cannot be trusted with access to the information needed to make rational plans, they should not have been planners in the first place. If too many people are required to make good plans securely, then compartmentalization should be organized around accomplishing smaller objectives and creating models from which plans can be derived. Instead, compartmentalization happened by military service branch and agency, resulting in duplicated planning effort and too many inexperienced people planning without criticism.

To characterize the disunity of intelligence and communication during the Cold War, some in the defense community tell this joke:16

US Air Force: “The Russians are here!”

Defense Intelligence Agency: “The Russians are not here yet, but they are coming.”

CIA: “The Russians are trying, but they won’t make it.”

Intelligence and Research (INR, in the Department of State): “The Russians? They aren’t even trying.”


For states and research organizations to avoid falling prey to these sorts of mistakes in situations with risks from harmful information spread (information hazards), a few principles seem derivable from this history.

1: It is extremely important for people with good ideas to press them within their secure communities in order to improve decisions. Ideas kept secret and immune from criticism are likely to be ill-conceived, but if there is risk from public engagement (e.g., spreading knowledge of a bioweapon that may have become easier to manufacture), then such conversations should still be had within secure communities to better flesh out the real risks and create strategies for mitigating them.

2: The bounds of secrecy should be well defined. Stifling conversation across the board shuts down both useful and harmful information flow, so drawing good delineations can produce a net reduction in risk. Secrecy can also be abused to gain information advantages over competing interest groups, to increase one’s social status, or to maintain corruption. For this reason, cultures of openness and transparency can sometimes develop a net advantage over more secretive ones, since they better align the incentives of those involved.

 


Footnotes:

  1. Rosenberg, D. A. (1983). The Origins of Overkill: Nuclear Weapons and American Strategy, 1945-1960. International Security, 7(4), 11.
  2. Ibid., 11.
  3. McGeorge Bundy, Danger and Survival (New York: Random House, 1988), 130.
  4. Ibid., 339.
  5. Ibid., 130.
  6. Ibid., 252.
  7. Ibid., 322.
  8. Ibid., 319.
  9. Ibid., 320.
  10. Calculation made using land area estimates from Google Maps and Alex Wellerstein’s NUKEMAP app, assuming 1.5-megaton airburst weapons. Note that since all weapons would not be uniform, the actual maximum damage area would be lower.
  11. Bundy, 322.
  12. Morse, John H. (14 February 1957). “Letter from Captain John H. Morse, Special Assistant to the Chairman, Atomic Energy Commission, to Lewis Strauss, Chairman, Atomic Energy Commission”. In Burr, William. “It Is Certain There Will be Many Firestorms”: New Evidence on the Origins of Overkill (PDF). Electronic Briefing Book No. 108 (Report). George Washington University National Security Archive. Dwight D. Eisenhower Library, Records of Special Assistant for National Security Affairs, NSC Series, Briefing Notes Subseries, box 17, Target Systems (1957–1961).
  13. Bundy, 324-325.
  14. Andrew David May, ‘The RAND Corporation and the Dynamics of American Strategic Thought, 1946–1962’, PhD Dissertation, Emory Univ. 1998, 235, 291.
  15. Austin Long & Brendan Rittenhouse Green (2015) Stalking the Secure Second Strike: Intelligence, Counterforce, and Nuclear Strategy, Journal of Strategic Studies, 38:1-2, 44, DOI: 10.1080/01402390.2014.958150
  16. Johnson, L. K. (2008). Glimpses into the Gems of American Intelligence: The President’s Daily Brief and the National Intelligence Estimate. Intelligence and National Security, 23(3), 333-370. doi:10.1080/02684520802121257
Comments:

I think the question of "how good are governments and large institutions in general at aggregating information and handling dangerous technologies?" is a really key question for dealing with potential catastrophic risks from technology. In trying to answer that question, I've referenced this post a few times.

Thanks! As habryka noted elsewhere, it seems like LessWrong could use more history.

Minor note: I'm pretty sure I've parsed your thesis, but some of the phrasings were a bit confusing:

With a lack of idea spread between different parts of government, the Strategic Air Command could pursue plans and procurements that were not necessarily efficient or aligned with US political interests

I'm assuming 'lack of idea spread' is just referring to lack of shared information?

Shared information is the primary thing I was getting at. I used 'idea spread' since different government bureaucracies had different ideas of how nuclear weapons should be used in the event of war, and if SAC's plans had been well known at the highest levels of government when they were first made, they likely would not have persisted as long and would have been replaced with better ideas.

The book I cite (Danger and Survival), which may be hard to find online, goes through many policy ideas that weren't considered by high-level policymakers but which may have been beneficial in retrospect, and which were sometimes thought of by others but not put forward to policymakers.

This is great. I was so pleased to see all those footnote citations.

Write a review!

Along with Chatham House rules, this post has played a role in my thinking about secrecy/privacy/confidentiality, although in a somewhat different fashion. (I also endorse habryka's take)

Reviewing it now, I feel like it also has some connection to the Explicit/Implicit Information post, although I'm a bit hazy on whether that's real or just a case of "I'm currently seeing connections and patterns everywhere".

For people interested in nuclear war history, the 2003 documentary The Fog of War is wonderful. Alex Wellerstein's blog is also great: http://blog.nuclearsecrecy.com/2014/08/08/kyoto-misconception/

This also reminds me of how companies may benefit from transparency, like Buffer's commitment to it: https://open.buffer.com/transparency-timeline/

Unfortunately, with an entity like the US government and military, this seems like a hard inadequate equilibrium to overcome, even if certain actors knew that better communication was more effective overall. Coming up with the optimal organization of groups (branches of the military, intelligence services, government entities), or lack thereof, and the hierarchy and responsibilities within and between them seems like one of the major blockages.

Edit: I found an interesting prescriptive article on potential ways to overcome the problems I discussed: http://blog.nuclearsecrecy.com/2016/03/18/conversation-with-a-super-spook/

Excellent work!

I thought I would mention that the footnote links do not link to the citation, but rather back to the head. I am using Chrome Version 63.0.3239.132 (Official Build) (64-bit) on a Linux machine, if that is relevant.

Should be fixed. The pasted HTML didn't have a referent for the "#bottom" anchor. I added the id "bottom" to the footnotes header.

I confirm that it is fixed! I appreciate your work; thank you.