Reid Kirby, Radiological Weapons: America’s Cold War Experience (2020)
When the United Nations crafted the first definition of weapons of mass destruction in 1948, it included “atomic explosive weapons, radio-active material weapons, lethal chemical and biological weapons, and any weapons developed in the future which have characteristics comparable in destructive effect to those of the atomic bomb or other weapons mentioned above.” This definition always puzzled me, because I had never heard of any nation developing radioactive material weapons. In fact, my readings on arms control confirmed that the United States and Soviet Union were never worried enough about that class of unconventional weapons to try to regulate them through any bilateral or global agreement. I should have known better than to assume that no one in the US military had at least thought about the issue, and Reid Kirby has now written a short book confirming that suspicion.
This book, Radiological Weapons: America’s Cold War Experience, describes how the US Army researched, developed, and tested radiological devices to support military operations. Most scholars in the WMD community recognize the possibility that substate groups could obtain radioactive material and disperse it in a public space—the so-called dirty bomb or, more formally, a radiological dispersal device. While dirty bombs are generally understood to have little utility beyond causing panic and economic disruption, a military radiological weapon would be expected to have some degree of utility on the battlefield. At the least, one could emplace a radiological hazard that would either create a barrier to unprotected military forces or incapacitate them through radiological exposure. Kirby’s book illustrates that the US Army did in fact have an organized effort to develop such weapons, a program that evolved over time and was heavily influenced by the Air Force’s development of nuclear munitions.
The concept of using radioactive isotopes in a munition originated in World War II, as research conducted within the Manhattan Project demonstrated their hazardous potential. Given the possibility that German scientists might also have recognized radioactive isotopes as a weapon of war, the Army developed the first field radiac meters as a contingency. While neither side employed radiological weapons in World War II, the Army Chemical Corps continued its research, developing dust generators and air-delivered bomblets as area-denial devices. These prototype munitions were tested at Dugway Proving Ground but not produced in quantity, given the military judgment that they would be of limited operational value.
As development of atomic and thermonuclear weapons advanced, interest in radiological weapons waned and the Army terminated its research projects in favor of developing tactical nuclear weapons. In addition to artillery-delivered nuclear projectiles, the Army became very interested in low-yield atomic demolition munitions, or ADMs. While these devices had an obvious use in destroying bridges, power plants, and dams, they could also throw up radioactive dirt and debris thousands of yards forward to form an atomic barrier against enemy forces. Although the West German government objected to the concept, hundreds were developed and deployed to Germany and Italy.
As the US government continued to develop new nuclear weapons, one concept for increasing their radiological fallout was to “salt” the munitions. The 1954 Castle Bravo detonation demonstrated how large an area radioactive fallout could contaminate, far exceeding the reach of the blast effects. The Air Force was focused on increasing megatonnage rather than radiological fallout, but the Army remained interested in using nuclear weapons for their radiological effects rather than their blast effects. As atmospheric nuclear testing drove steadily rising levels of radioactive fallout, President John F. Kennedy joined the Soviet Union in a partial test ban halting atmospheric and underwater nuclear tests.
While the Air Force was focused on killing cities, the Army’s interest in nuclear weapons remained focused on targeting enemy soldiers and armored vehicles, as well as defensive fortifications. The preferred employment was low-altitude airbursts that would incapacitate or bury enemy personnel. Because troops might be inside fortifications, tanks, or armored personnel carriers, the primary effect the Army sought from its tactical nuclear systems was the radiological hazard rather than the blast. One way to maximize effects on these protected troops would be to develop a fusion device producing a much higher neutron burst than fission devices, capable of penetrating heavy armor or walls.
Pure fusion devices were not practical, but the US nuclear enterprise did develop warheads with increased radiological output and reduced blast effects. These were called “neutron bombs,” and while the US military developed and deployed a number of these “enhanced radiation” nuclear warheads, public opposition in Europe (in no small part agitated by the Soviet Union) led to their ultimate recall and disposal. In 1991, President George H.W. Bush withdrew the Army’s tactical nuclear weapons from operational service, ending the Army’s quest for offensive radiological weapons.
By telling this story, Kirby’s concise book fills a fascinating gap in the Army’s history. While the Air Force steals most of the glory in nuclear modernization efforts, the Army did have an active program that fully investigated how radiological hazards might support the tactics of maneuver forces. Certainly other nations investigated this concept as well. The Cold War featured some very interesting research programs, and while some might recoil at the idea of using radioactivity as a weapon system, it was not that different from the dozens of nuclear munition designs developed and produced over the years. Kirby has illuminated this dark corner of history, and I would recommend this book to anyone interested in the technical achievements of the Cold Warriors who invested their time in this arcane area.
Al Mauroni is the director of the US Air Force Center for Strategic Deterrence Studies and author of Countering Weapons of Mass Destruction: Assessing the U.S. Government’s Policy.
The views expressed are those of the author and do not reflect the official position of the United States Military Academy, Department of the Army, Air University, Department of the Air Force, or Department of Defense.
Image credit: Catalania Catalino (adapted by MWI)