Russia’s invasion of Ukraine not only reminded the world of all the usual horrors of modern warfare, but also stirred the long-slumbering spectre of nuclear catastrophe, both in the form of nuclear war à la “Dr. Strangelove” and of civilian disaster à la Chernobyl. When Russian forces occupied the Chernobyl nuclear plant and held its workers hostage, some worried about a new nuclear disaster in the making if the plant was damaged or if decommissioning operations were severely disrupted. Other nuclear plants in Ukraine, including the Zaporizhzhia nuclear power station — the largest nuclear power plant in Europe, with six reactors — were threatened by invading forces. The dangers were severe enough that the International Atomic Energy Agency sent safety staff and continues to monitor the unfolding situation to ensure that things don’t get out of control.
At the moment, Ukraine’s nuclear plants seem to be safe, but fear and anxiety persist. As Serhii Plokhy details in “Atoms and Ashes: A Global History of Nuclear Disasters,” the memories of past catastrophes continue to haunt the idea of nuclear power, including any plans or hopes for a nuclear power renaissance in a world of worsening climate change.
Each of the book’s six chapters focuses on an individual nuclear accident, some famous, others more obscure, including relevant background information and historical context. “The debate on the safety of nuclear energy can be advanced by taking a fresh look at the history of nuclear accidents and trying to understand why they happened, how bad they were, what we can learn from them, and whether they can happen again,” writes Plokhy, a Ukrainian history professor at Harvard University and director of the university’s Ukraine Research Institute. Yet although the details of the accidents he describes are indeed harrowing, they do not adequately support his ultimate conclusion that nuclear energy isn’t a safe choice for powering our future.
Plokhy’s previous book, “Nuclear Folly: A History of the Cuban Missile Crisis,” dealt with a nuclear catastrophe that never happened, namely global thermonuclear war between the U.S. and the USSR in 1962. As Plokhy’s multifaceted perspective on that crisis showed, a horrific outcome was averted both by sheer luck and human effort and intention — paradoxically, the same qualities that created the crisis in the first place. In “Atoms and Ashes,” he demonstrates that while the accidents he describes may have initially arisen from the inherent dangers, deficiencies, and risks of nuclear technology, they were made much worse by human error, arrogance, stupidity, corruption, improper training, political and ideological attitudes, choice of technology, and (especially in the early days) sheer ignorance.
For many readers, the most illuminating revelations of “Atoms and Ashes” will likely be the three older and more obscure incidents: the 1954 Castle Bravo U.S. nuclear test on the Marshall Islands, the 1957 Kyshtym nuclear disaster in the Soviet Union, and the 1957 Windscale reactor fire in the United Kingdom. Of the accidents Plokhy describes, Castle Bravo was the only actual nuclear explosion, a test of a new thermonuclear weapon design in the Pacific that proved far more powerful than its creators anticipated. The blast spread radioactive fallout over a wide area, with high radiation levels temporarily trapping test personnel in a bunker on one of the test islands and also causing an international incident after dousing the crew of a Japanese fishing boat with fallout, later killing one crewman.
Both of the 1957 disasters were also related to nuclear weapons, specifically the production of weapons-grade material: The Kyshtym episode in the Soviet Union resulted from the explosion of a poorly maintained underground nuclear waste tank, and the Windscale incident involved a British reactor intended to produce fuel for nuclear weapons.
The more familiar Three Mile Island, Chernobyl, and Fukushima accidents all involved civilian power plants, but even these bore military pedigrees, tracing their ancestry back to the Manhattan Project and the subsequent weapons programs of the Cold War.
Plokhy’s detailed narratives on each episode demonstrate that, as with most technologically based disasters, there’s never really a single, unambiguous cause. Such catastrophes are the result of multiple factors over time — sometimes years, sometimes days, sometimes minutes and seconds. They’re not the result of one failure but a string of them, a perfect storm of technical shortcomings and the failure of people to notice problems, take early preventive measures, and forestall an ever-worsening chain of catastrophe. As analysts including Scott Sagan and Diane Vaughan have pointed out, the more complex the technological system, the more inevitable such failures become, in what sociologist Charles Perrow has deemed “normal accidents.” The devil is indeed in the details.
In the early years of nuclear technology, engineers, designers, operators, and managers of nuclear energy “were all entering truly uncharted waters,” Plokhy writes. They “were dealing with new science and technology that was not fully understood and tested, especially in the early decades, and was bound to prove risky and unpredictable in emergencies. They all took huge risks that made accidents well-nigh inevitable.” At Chernobyl, for example, the combination of an inherently dangerous and obsolete reactor design and pressure to meet arbitrary production goals by cutting corners on safe procedures led to catastrophe. As Plokhy explains, “While nuclear scientists and engineers in all countries shared a ‘can-do’ attitude, only in the Soviet Union did managers and engineers consciously violate safety instructions and regulations to achieve their goals, doing so with the tacit blessing of the authorities.”
Fortunately, Plokhy notes, “a learning process took place in all the countries that suffered from accidents.” Corrections were made, procedures were improved, new precautions were put into place, scapegoats were identified. Such measures were not in vain: “The fact that all the major accidents involved designs and technologies developed in the 1950s and 1960s offers some hope that the initial period of major errors accompanying the birth of any new technology and industry is behind us.” Plokhy also notes that the end of the Cold War resulted in more sharing of information and greater international cooperation and regulation. These factors, combined with technological developments and better safety standards, “did a great deal to ensure that no major nuclear accident occurred for 25 years after Chernobyl.”
Then, of course, came Fukushima in 2011, which triggered a fresh wave of worldwide sentiment against nuclear power, much of it fueled by sensationalistic and inaccurate media coverage, a problem that arguably worsened the impact and cost of previous accidents like Chernobyl and Three Mile Island. Germany reacted soon after by voting to either shut down or phase out its nuclear power plants by 2022. The move increased the country’s dependence on Russian energy and has caused considerable difficulty as international pressure mounted to issue sanctions against Russia due to the Ukraine war.
Citing not only the potential for disaster but also nuclear power’s great economic costs, Plokhy discounts the hopes of those experts who champion nuclear power as a partial alternative to fossil fuels in a warming world, downplaying the prospects of improved reactor and plant designs. “Many of the political, economic, social, and cultural factors that led to the accidents of the past are still with us today, making the nuclear industry vulnerable to repeating old mistakes in new and unexpected ways,” he writes. A new accident would threaten any further development of the industry “for at least another 20 years,” Plokhy adds. “This makes the nuclear industry not only risky to operate but impossible to count on as a long-term solution to an overwhelming problem” — climate change.
But while Plokhy’s tales of disaster are well-told and gripping, the overall effect, perhaps unintentionally, is to make clear that whatever the inherent risks of the technology, the common denominator behind all the accidents is human — a reality that is not only unavoidable but also the most difficult factor to control and predict. “The big accidents uncover existing problems that go beyond simple mistakes or technical malfunctions,” Plokhy observes. “They bring to the fore factors of broader political, social, and cultural significance that contribute to a given disaster indirectly or not always obviously, but most profoundly nevertheless.” Those factors aren’t a function of reactor design or control system operations or radiation physics, but the imponderables of human nature — and they ultimately indict not the technology, but the imperfect humans operating it.