Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the odds that were given for the 2011 tsunami–the flood at Oconee would trigger failures at all three reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.
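
In schematic terms, the linear no-threshold assumption is simply that excess risk scales in direct proportion to dose, with no cutoff below which the risk vanishes. The sketch below is purely illustrative; the 5-percent-per-sievert coefficient is a commonly cited round figure for lifetime excess cancer risk, not a number presented at the symposium.

```python
# Minimal illustration of the linear no-threshold (LNT) idea: modeled excess
# risk is proportional to dose all the way down to zero, with no "safe" cutoff.
# The coefficient below is an illustrative, commonly cited round figure.

RISK_PER_SIEVERT = 0.05  # assumed lifetime excess-risk coefficient, for illustration only

def lnt_excess_risk(dose_sv: float) -> float:
    """Excess lifetime risk under a linear no-threshold assumption."""
    return dose_sv * RISK_PER_SIEVERT

for dose in (0.001, 0.01, 0.1, 1.0):  # doses in sieverts
    print(f"{dose:>6} Sv -> modeled excess risk ~{lnt_excess_risk(dose):.4%}")
# Halving the dose halves the modeled risk, but never drives it to zero.
```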

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously increased what it called acceptable exposure quite soon after the start of the Fukushima crisis, and, because the US tends to define its limits in terms of exposure above average annual background radiation, it is feared that background levels elevated by the disaster will ratchet up what is considered “safe” in the United States as well. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant and radiation biologist Ian Fairlie observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is lots of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what has been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137, or, to put it another way, these estimates have Fukushima already releasing somewhere between roughly 700 and 1,300 times as much radioactive cesium as the Hiroshima bomb.
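
A quick back-of-the-envelope check of those unit conversions, using only the figures quoted above (the underlying release estimates themselves remain contested):

```python
# Back-of-the-envelope check of the cesium-137 figures quoted above.
# All inputs are the estimates cited in this article, not new data.

PBQ = 1e15  # becquerels per petabecquerel
TBQ = 1e12  # becquerels per terabecquerel

fukushima_low = 64 * PBQ    # low-end estimate, first months (air + ocean)
fukushima_high = 114 * PBQ  # high-end estimate
chernobyl = 84 * PBQ        # commonly cited Chernobyl Cs-137 release
hiroshima = 89 * TBQ        # Cs-137 attributed to the Hiroshima bomb

print(f"Fukushima vs. Hiroshima: {fukushima_low / hiroshima:,.0f}x "
      f"to {fukushima_high / hiroshima:,.0f}x")        # roughly 719x to 1,281x
print(f"Fukushima vs. Chernobyl: {fukushima_low / chernobyl:.2f}x "
      f"to {fukushima_high / chernobyl:.2f}x")          # roughly 0.76x to 1.36x

# The ongoing 240 million Bq/day leak is tiny by comparison:
print(f"Ongoing release: ~{240e6 * 365 / TBQ:.2f} TBq per year")  # ~0.09 TBq
```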

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at the Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” and that every question should start from the assumed necessity of atomic energy rather than from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it isn’t a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without even a real interim storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear power does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Fukushima Plus Two: Still the Beginning?

An IAEA inspector examines the remains of reactor 3 at Fukushima Daiichi (5/27/11) (photo: Greg Webb/IAEA imagebank)

I was up working in what were, in my part of the world, the early morning hours of March 11, 2011, when I heard over the radio that a massive earthquake had struck northeastern Japan. I turned on the TV just in time to see the earliest pictures of the tsunami that followed what became known as the Tohoku quake. The devastation was instantly apparent, and reports of high numbers of casualties seemed inevitable, but it wasn’t until a few hours later, when news of the destruction and loss of power at the Fukushima Daiichi nuclear plant hit the English-language airwaves, that I was gripped by a real sense of despair.

I was far from a nuclear expert at the time, but I knew enough to know that without intact cooling systems, or the power to keep them running, and with the added threat of a containment breach, some amount of environmental contamination was certain, and the potential for something truly terrifying was high.

What started as a weekend of watching newswires and live streams, virtually around the clock, and posting basic tech and health questions on email lists, expanded as the Fukushima crisis itself grew. Two years later, I have written tens of thousands of words, and read hundreds of thousands more. I have learned much, but I think I have only scratched the surface.

We all might be a little closer to understanding what happened in those first days and weeks after the earthquake, but what has happened since is still, sadly, a story much of which remains to be written. What the Daiichi plant workers really went through in those early days is just now coming to light, and the tales of intrigue and cover-up, of corruption and captured government, grow more complex and more sinister with each revelation. But what has happened to the environment, not just in the government-cordoned evacuation zone, but also throughout Japan, across the Pacific, and around the world, will likely prove the most chilling narrative.

Radiation levels in the quarantined parts of Japan are still far too high to permit any kind of human resettlement, but exposure rates in areas far outside that radius are also well above what would have been considered acceptable before this disaster. And water, used to cool the molten cores and damaged spent fuel pools at Fukushima Daiichi, now dangerously radioactive itself, continues to leak into the ground and into the ocean at unprecedented rates.

Alas, the efforts of the Japanese government seem more focused on limiting information, quieting dissent, and sharing the pain (by shipping radioactive detritus across the country for disposal and incineration) than on stopping the leaks, cleaning up the contamination, and eliminating future risks. And though the previous government pledged to quickly turn away from all nuclear power, a change of government in Japan has revived the incestuous relationship between the nuclear industry and the bureaucrats and politicians who are supposed to police it.

Across the Pacific, the United States has not exactly bathed itself in glory, either. Within days of the news of the explosions at Fukushima, President Barack Obama was the rare world leader who made a point of publicly assuring the nuclear industry that America’s commitment to this dangerous energy source was still strong. Just months after the start of the crisis, information on airborne radiation samples from across the country became less accessible to the public. And while industrialized countries like Germany work to phase out their nuclear plants, the US Nuclear Regulatory Commission actually approved construction of new reactors, and the federal government is poised to backstop the baldly risky investment to the tune of $8.3 billion.

But most disturbing of all, of course, will be the stories of the people. First, the stories we will hear from the families in Japan exposed to the toxic fallout in the immediate aftermath of the initial containment breaches and explosions–stories we are already hearing of children with severe thyroid abnormalities. But soon, and likely for decades to come, the stories of cancers and immune disorders, of birth defects and health challenges, elevated not only in northern Japan, but perhaps across the northern hemisphere.

Two years after the earthquake and tsunami, it is not the beginning of the end of this disaster, and, with apologies to Winston Churchill, it may not even be the end of the beginning. The spent fuel pool at Daiichi reactor 4 remains in precarious shape, and the state of the three molten cores is still shrouded in mystery. Radioactive dust and grime blanket large parts of Japan with no serious plan to remove it, and the waters off the northeast coast continue to absorb irradiated runoff, putting an entire aquatic food chain in peril.

On this second anniversary of the start of the Fukushima crisis, let us honor those who have suffered so far, review what we have learned to date, and endeavor to understand what is likely to come. But, most of all, let us renew our commitment to breaking with this dirty, dangerous and expensive technology.

* * *

To this end, on March 11 and 12, I will be attending a symposium at the New York Academy of Medicine, “The Medical and Ecological Consequences of the Fukushima Nuclear Accident,” sponsored by the Helen Caldicott Foundation and Physicians for Social Responsibility. If you are in the New York area, there is still space available; if you want to watch online, the organizers have promised a live stream. More information can be found on the Caldicott Foundation website.

The Long, Long Con: Seventy Years of Nuclear Fission; Thousands of Centuries of Nuclear Waste

From here to eternity: a small plaque on the campus of the University of Chicago commemorates the site of Fermi’s first atomic pile–and the start of the world’s nuclear waste problem. (Photo: Nathan Guy via Flickr)

On December 2, 1942, a small group of physicists under the direction of Enrico Fermi gathered on an old squash court beneath the stands of Stagg Field on the campus of the University of Chicago to make and witness history. Uranium pellets and graphite blocks had been stacked around cadmium-coated rods as part of an experiment crucial to the Manhattan Project–the program tasked with building an atom bomb for the Allied forces in WWII. The experiment was successful, and for 28 minutes, the scientists and dignitaries present observed the world’s first manmade, self-sustaining nuclear fission reaction. They called it an atomic pile–Chicago Pile 1 (CP-1), to be exact–but what Fermi and his team had actually done was build the world’s first nuclear reactor.

The Manhattan Project’s goal was a bomb, but soon after the end of the war, scientists, politicians, the military and private industry looked for ways to harness the power of the atom for civilian use, or, perhaps more to the point, for commercial profit. Fifteen years to the day after CP-1 achieved criticality, President Dwight Eisenhower threw a ceremonial switch to start the reactor at Shippingport, PA, which was billed as the first full-scale nuclear power plant built expressly for civilian electrical generation.

Shippingport was, in reality, little more than a submarine engine on blocks, but the nuclear industry and its acolytes will say that it was the beginning of billions of kilowatts of power, promoted (without a hint of irony) as “clean, safe, and too cheap to meter.” It was also, however, the beginning of what is now a, shall we say, weightier legacy: 72,000 tons of nuclear waste.

Atoms for peace, problems forever

News of Fermi’s initial success was communicated by physicist Arthur Compton to the head of the National Defense Research Committee, James Conant, with artistically coded flair:

Compton: The Italian navigator has landed in the New World.
Conant: How were the natives?
Compton: Very friendly.

But soon after that initial success, CP-1 was disassembled and reassembled a short drive away, in Red Gate Woods. The optimism of the physicists notwithstanding, it was thought best to continue the experiments with better radiation shielding–and slightly removed from the center of a heavily populated campus. The move was perhaps the first necessitated by the uneasy relationship between fissile material and the health and safety of those around it, but if it was understood as a broader cautionary tale, no one let that get in the way of “progress.”

A stamp of approval: the US Postal Service commemorated Eisenhower’s initiative in 1955.

By the time the Shippingport reactor went critical, North America already had a nuclear waste problem. The detritus from manufacturing atomic weapons was poisoning surrounding communities at several sites around the continent (not that most civilians knew it at the time). Meltdowns at Chalk River in Canada and the Experimental Breeder Reactor in Idaho had required fevered cleanups, the former of which included the help of a young Navy officer named Jimmy Carter. And the dangers of errant radioisotopes were increasing with the acceleration of above-ground atomic weapons testing. But as President Eisenhower extolled “Atoms for Peace,” and the US Atomic Energy Commission promoted civilian nuclear power at home and abroad, a plan to deal with the “spent fuel” (as used nuclear fuel rods are termed) and other highly radioactive leftovers was not part of the program (beyond, of course, extracting some of the plutonium produced by the fission reaction for bomb production, and the promise that the waste generated by US-built reactors overseas could at some point be marked “return to sender” and repatriated to the United States for disposal).

Attempts at what was called “reprocessing”–the re-refining of used uranium into new reactor fuel–quickly proved expensive, inefficient and dangerous, and created as much radioactive waste as they hoped to reuse. It also provided an obvious avenue for nuclear weapons proliferation because of the resulting production of plutonium. The threat of proliferation (made flesh by India’s test of an atomic bomb in 1974) led President Jimmy Carter to cancel the US reprocessing program in 1977. Attempts by the Department of Energy to push mixed-oxide (MOX) fuel fabrication (combining uranium and plutonium) over the last dozen years have not produced any results, either, despite over $5 billion in government investments.

In fact, there was no official federal policy for the management of used but still highly radioactive nuclear fuel until passage of The Nuclear Waste Policy Act of 1982. And while that law acknowledged the problem of thousands of tons of spent fuel accumulating at US nuclear plants, it didn’t exactly solve it. Instead, the NWPA started a generation of political horse trading, with goals and standards defined more by market exigencies than by science, that leaves America today with what amounts to over five-dozen nominally temporary repositories for high-level radioactive waste–and no defined plan to change that situation anytime soon.

When you assume…

When a US Court of Appeals ruled in June that the Nuclear Regulatory Commission acted improperly when it failed to consider all the risks of storing spent radioactive fuel onsite at the nation’s nuclear power facilities, it made specific reference to the lack of any real answers to the generations-old question of waste storage:

[The Nuclear Regulatory Commission] apparently has no long-term plan other than hoping for a geologic repository. . . . If the government continues to fail in its quest to establish one, then SNF (spent nuclear fuel) will seemingly be stored on site at nuclear plants on a permanent basis. The Commission can and must assess the potential environmental effects of such a failure.

The court concluded the current situation–where spent fuel is stored across the country in what were supposed to be temporary configurations–“poses a dangerous long-term health and environmental risk.”

The decision also harshly criticized regulators for evaluating plant relicensing with the assumption that spent nuclear fuel would be moved to a central long-term waste repository.

A mountain of risks

The Nuclear Waste Policy Act set in motion an elaborate process that was supposed to give the US a number of possible waste sites, but, in the end, the only option seriously explored was the Yucca Mountain site in Nevada. After years of preliminary construction and billions of dollars spent, Yucca was determined to be a bad choice for the waste:

[Yucca Mountain’s] volcanic formation is more porous and less isolated than originally believed–there is evidence that water can seep in, there are seismic concerns, worries about the possibility of new volcanic activity, and a disturbing proximity to underground aquifers. In addition, Yucca mountain has deep spiritual significance for the Shoshone and Paiute peoples.

Every major Nevada politician on both sides of the aisle has opposed the Yucca repository since its inception. Senate Majority Leader Harry Reid has worked most of his political life to block the facility. And with the previous NRC head, Gregory Jaczko (and now his replacement, Allison Macfarlane, as well), recommending against it, the Obama administration’s Department of Energy moved to end the project.

Even if it were an active option, Yucca Mountain would still be many years and maybe as much as $100 billion away from completion. And yet, the nuclear industry (through recipients of its largesse in Congress) has challenged the administration to spend any remaining money in a desperate attempt to keep alive the fantasy of a solution to their waste crisis.

Such fevered dreams, however, do not qualify as an actual plan, according to the courts.

The judges also chastised the NRC for its generic assessment of spent fuel pools, currently packed well beyond their projected capacity at nuclear plants across the United States. Rather than examine each facility and the potential risks specific to its particular storage situation, the NRC had only evaluated the safety risks of onsite storage by looking at a composite of past events. The court ruled that the NRC must appraise each plant individually and account for potential future dangers. Those dangers include leaks, loss of coolant, and failures in the cooling systems, any of which might result in contamination of surrounding areas, overheating and melting of stored rods, and the potential of burning radioactive fuel–risks heightened by the large amounts of fuel now densely packed in the storage pools and underscored by the ongoing disaster at Japan’s Fukushima Daiichi plant.

Indeed, plants were neither designed nor built to house nuclear waste long-term. The design life of most reactors in the US was originally 40 years. Discussions of the spent fuel pools usually gave them a 60-year lifespan. That limit seemed to double almost magically as nuclear operators fought to postpone the expense of moving cooler fuel to dry casks and of the final decommissioning of retired reactors.

Everyone out of the pool

As disasters as far afield as the 2011 Tohoku earthquake and last October’s Hurricane Sandy have demonstrated, the storage of spent nuclear fuel in pools requires steady supplies of power and cool water. Any problem that prevents the active circulation of liquid through the spent fuel pools–be it a loss of electricity, the failure of a back-up pump, the clogging of a valve or a leak in the system–means the temperature in the pools will start to rise. If the cooling circuit is out long enough, the water in the pools will start to boil. If the water level dips (due to boiling or a leak) enough to expose hot fuel rods to the air, the metal cladding on the rods will start to burn, in turn heating the fuel even more, resulting in plumes of smoke carrying radioactive isotopes into the atmosphere.
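
To put rough, purely illustrative numbers on that sequence: assuming a generic pool holding on the order of 1.5 million kilograms of water and fuel giving off about 2 megawatts of decay heat (round figures assumed for the sketch, not measurements from any specific plant), the timeline after a total loss of cooling looks something like this:

```python
# Illustrative timeline for an uncooled spent fuel pool.
# The pool size and decay heat below are assumed round numbers for the
# sketch; real pools and fuel loads vary widely.

water_mass = 1.5e6    # kg of water in the pool (assumed)
decay_heat = 2.0e6    # W of decay heat from the stored assemblies (assumed)
c_water = 4186        # J/(kg*K), specific heat of water
latent_heat = 2.26e6  # J/kg, heat of vaporization of water

# Time to heat the pool from 30 C to boiling with all cooling lost:
heatup_s = water_mass * c_water * (100 - 30) / decay_heat
print(f"Time to reach boiling: ~{heatup_s / 3600:.0f} hours")  # ~61 hours

# Once boiling, time to boil off half the inventory and begin
# uncovering the tops of the fuel assemblies:
boiloff_s = (water_mass / 2) * latent_heat / decay_heat
print(f"Time to boil off half the water: ~{boiloff_s / 86400:.1f} days")  # ~10 days
```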

And because these spent fuel pools are so full–containing as much as five times more fuel than they were originally designed to hold, and at densities that come close to those in reactor cores–they both heat stagnant water more quickly and reach volatile temperatures faster when exposed to air.

A spent fuel pool and dry casks. (Both photos courtesy of the US Nuclear Regulatory Commission)

After spent uranium has been in a pool for at least five years (considerably longer than most fuel is productive as an energy source inside the reactor), fuel rods are deemed cool enough to be moved to dry casks. Dry casks are sealed steel cylinders filled with spent fuel and inert gas, which are themselves encased in another layer of steel and concrete. These massive fuel “coffins” are then placed outside, spaced on concrete pads, so that air can circulate and continue to disperse heat.

While the long-term safety of dry casks is still in question, the fact that they require no active cooling system gives them an advantage, in the eyes of many experts, over pool storage. As if to highlight that difference, spent fuel pools at Fukushima Daiichi have posed some of the greatest challenges since the March 2011 earthquake and tsunami, whereas, to date, no quake- or flood-related problems have been reported with any of Japan’s dry casks. The disparity was so obvious that the NRC’s own staff review actually added a proposal to the post-Fukushima taskforce report, recommending that US plants take more fuel out of spent fuel pools and move it to dry casks. (A year-and-a-half later, however, there is still no regulation–or even a draft–requiring such a move.)

But current dry cask storage poses its own set of problems. Moving fuel rods from pools to casks is slow and costly–about $1.5 million per cask, or roughly $7 billion to move all of the nation’s spent fuel (a process, it is estimated, that would take no less than five to ten years). That is expensive enough to have many nuclear plant operators lobbying overtime to avoid doing it.
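
Taken at face value, the per-cask and total figures above imply a rough cask count and transfer pace (a simple sketch using only the numbers quoted in this piece):

```python
# Rough implications of the dry-cask transfer figures quoted above.
cost_per_cask = 1.5e6     # dollars per loaded cask (figure cited above)
total_cost = 7e9          # dollars to move all current US spent fuel
spent_fuel_tons = 72_000  # tons of spent fuel cited earlier in this piece

casks = total_cost / cost_per_cask
print(f"Implied number of casks: ~{casks:,.0f}")                     # ~4,667
print(f"Implied fuel per cask: ~{spent_fuel_tons / casks:.1f} tons")  # ~15 tons

# Spread over the estimated five-to-ten-year campaign:
for years in (5, 10):
    print(f"Over {years} years: ~{casks / years:,.0f} casks loaded per year")
```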

Further, though not as seemingly vulnerable as fuel pools, dry casks are not impervious to natural disaster. In 2011, a moderate earthquake centered about 20 miles from the North Anna, Virginia, nuclear plant caused most of its vertical dry casks–each weighing 115 tons–to shift, some by more than four inches. The facility’s horizontal casks didn’t move, but some showed what was termed “cosmetic damage.”

Dry casks at Michigan’s Palisades plant sit on a pad atop a sand dune just 100 yards from Lake Michigan. An earthquake there could plunge the casks into the water. And the casks at Palisades are so poorly designed and maintained, submersion could result in water contacting the fuel, contaminating the lake and possibly triggering a nuclear chain reaction.

And though each cask contains far less fissile material than one spent fuel pool, casks are still considered possible targets for terrorism. A TOW anti-tank missile would breach even the best dry cask (PDF), and with 25 percent of the nation’s spent fuel now stored in hundreds of casks across the country, all above ground, the result is a rich target environment.

Confidence game

Two months after the Appeals Court found fault with the Nuclear Regulatory Commission’s imaginary waste mitigation scenario, the NRC announced it would suspend the issuing of new reactor operating licenses, license renewals and construction licenses until the agency could craft a new plan for dealing with the nation’s growing spent nuclear fuel crisis. In drafting its new nuclear “Waste Confidence Decision” (NWCD)–the methodology used to assess the hazards of nuclear waste storage–the Commission said it would evaluate all possible options for resolving the issue.

At first, the NRC said this could include both generic and site-specific actions (remember, the court criticized the NRC’s generic appraisals of pool safety), but as the prescribed process now progresses, it appears any new rule will be designed to give the agency, and so, the industry, as much wiggle room as possible. At a public hearing in November, and later at a pair of web conferences in early December, the regulator’s Waste Confidence Directorate (yes, that’s what it is called) outlined three scenarios (PDF) for any future rulemaking:

  • Storage until a repository becomes available at the middle of the century
  • Storage until a repository becomes available at the end of the century
  • Continued storage in the event a repository is not available

And while, given the current state of affairs, the first option seems optimistic, the fact that their best scenario now projects a repository to be ready by about 2050 is a story in itself.

When the Nuclear Waste Policy Act was signed into law by President Reagan early in 1983, it was expected that the process it set in motion would present at least one (and preferably a second) long-term repository by the late 1990s. But by the time the “Screw Nevada Bill” (as it is affectionately known in the Silver State) locked in Yucca Mountain as the only option for permanent nuclear waste storage, the projected opening was pushed back to 2007.

But Yucca encountered problems from its earliest days, so a mid-’90s revision of the timeline postponed the official start, this time to 2010. By 2006, the Department of Energy was pegging Yucca’s opening at 2017. And, when the waste confidence decision was again revised in 2010–after Yucca was deemed a non-option–it conveniently avoided setting a date for the opening of a national long-term waste repository altogether.

It was that 2010 revision that was thrown out by the courts in June.

“Interim storage” and “likely reactors”

So, the waste panel now has three scenarios–but what are the underlying assumptions for those scenarios? Not, obviously, any particular site for a centralized, permanent home for the nation’s nuclear garbage–no new site has been chosen, and it can’t even be said there is an active process at work that will choose one.

There are the recommendations of a Blue Ribbon Commission (BRC) convened by the president after Yucca Mountain was off the table. Most notable there was a recommendation for interim waste storage, consolidated at a handful of locations across the country. But consolidated intermediate waste storage has its own difficulties, not the least of which is that no sites have yet been chosen for any such endeavor. (In fact, plans for the Skull Valley repository, thought to be the interim facility closest to approval, were abandoned by their sponsors just days before Christmas.)

Just-retired New Mexico Senator Jeff Bingaman (D), the last chair of the Energy and Natural Resources Committee, tried to turn the BRC recommendations into law. When he introduced his bill in August, however, he had to do so without any cosponsors. Hearings on the Nuclear Waste Administration Act of 2012 were held in September, but the gavel came down on the 112th Congress without any further action.

In spite of the underdeveloped state of intermediate storage, however, when the waste confidence panel was questioned on the possibility, interim waste repositories seemed to emerge, almost on the fly, as an integral part of any revised waste policy rule.

“Will any of your scenarios include interim centralized above-ground storage?” we asked during the last public session. Paul Michalak, who heads the Environmental Impact Statement branch of the Waste Confidence Directorate, first said temporary sites would be considered in the second and third options. Then, after a short pause, Mr. Michalak added (PDF p40), “First one, too. All right. Right. That’s right. So we’re considering an interim consolidated storage facility [in] all three scenarios.”

The lack of certainty on any site or sites is, however, not the only fuzzy part of the picture. As mentioned earlier, the amount of high-level radioactive waste currently on hand in the US and in need of a final resting place is upwards of 70,000 tons–already at the amount that was set as the initial limit for the Yucca Mountain repository. Given that there are still over 100 domestic commercial nuclear reactors more or less in operation, producing something like an additional 2,000 tons of spent fuel every year, what happens to the Waste Confidence Directorate’s scenarios as the years and waste pile up? How much waste were regulators projecting they would have to deal with–how much spent fuel would a waste confidence decision assume the system could confidently handle?

There was initial confusion on what amount of waste–and at what point in time–was informing the process. Pressed for clarification on the last day of hearings, NRC officials finally posited that it was assumed there would be 150,000 metric tons of spent fuel–all deriving from the commercial reactor fleet–by 2050. By the end of the century, the NRC expects to face a mountain of waste weighing 270,000 metric tons (PDF pp38-41) (though this figure was perplexingly termed both a “conservative number” and an “overestimate”).
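
Those projections are at least roughly consistent with the inventory and accumulation figures cited a few paragraphs up (a back-of-the-envelope check, assuming the roughly 2,000-ton annual addition simply continues):

```python
# Sanity check of the NRC's projected spent fuel totals against the
# ~70,000 tons on hand and ~2,000 tons/year cited earlier in this piece.
current_inventory = 70_000  # metric tons on hand (approximate, 2013)
annual_addition = 2_000     # metric tons added per year by the current fleet

for target_year, nrc_figure in ((2050, 150_000), (2100, 270_000)):
    extrapolated = current_inventory + annual_addition * (target_year - 2013)
    print(f"{target_year}: simple extrapolation ~{extrapolated:,} t "
          f"vs. NRC figure {nrc_figure:,} t")

# 2050: ~144,000 t vs. 150,000 t -- close to the current fleet alone.
# 2100: ~244,000 t vs. 270,000 t -- the gap hints at the "likely reactors"
# the NRC says it is counting but cannot enumerate.
```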

How did the panel arrive at these numbers? Were they assuming all 104 (soon to be 103–Wisconsin’s Kewaunee Power Station will shut down by mid-2013 for reasons its owner, Dominion Resources, says are based “purely on economics”) commercial reactors nominally in operation would continue to function for that entire time frame–even though many are nearing the end of their design life and none are licensed to continue operation beyond the 2030s? Were they counting reactors like those at San Onofre, which have been offline for almost a year, and are not expected to restart anytime soon? Or the troubled reactors at Ft. Calhoun in Nebraska and Florida’s Crystal River? Neither facility has been functional in recent years, and both have many hurdles to overcome if they are ever to produce power again. Were they factoring in the projected AP1000 reactors in the early stages of construction in Georgia, or the ones slated for South Carolina? Did the NRC expect more or fewer reactors generating waste over the course of the next 88 years?

The response: waste estimates include all existing facilities, plus “likely reactors”–but the NRC cannot say exactly how many reactors that is (PDF p41).

Jamming it through

Answers like those from the Waste Confidence Directorate do not inspire (pardon the expression) confidence for a country looking at a mountain of eternally toxic waste. Just what would the waste confidence decision (and the environmental impact survey that should result from it) actually cover? What would it mandate, and what would change as a result?

How long is it? Does this NRC chart provide a justification for the narrow scope of the waste confidence process? (US Nuclear Regulatory PDF, p12)

In past relicensing hearings–where the public could comment on proposed license extensions on plants already reaching the end of their 40-year design life–objections based on the mounting waste problem and already packed spent fuel pools were waved off by the NRC, which referenced the waste confidence decision as the basis of its rationale. Yet, when discussing the parameters of the process for the latest, court-ordered revision to the NWCD, Dr. Keith McConnell, Director of the Waste Confidence Directorate, asserted that waste confidence was not connected to the site-specific licensed life of operations (PDF p42), but only to a period defined as “Post-Licensed Life Storage” (which appears, if a chart in the directorate’s presentation (PDF p12) is to be taken literally, to extend from 60 years after the initial creation of waste to 120 years–at which point a phase labeled “Disposal” begins). Issues of spent fuel pool and dry cask safety are the concerns of a specific plant’s relicensing process, said regulators in the latest hearings.

“It’s like dealing with the Mad Hatter,” commented Kevin Kamps, a radioactive waste specialist for industry watchdog Beyond Nuclear. “Jam yesterday, jam tomorrow, but never jam today.”

The edict originated with the White Queen in Lewis Carroll’s Through the Looking Glass, but it is all too appropriate–and no less maddening–when trying to motivate meaningful change at the Nuclear Regulatory Commission. The NRC has used the nuclear waste confidence decision in licensing inquiries, but in these latest scoping hearings, we are told the NWCD does not apply to on-site waste storage. The Appeals Court criticized the lack of site-specificity in the waste storage rules, but the directorate says they are now only working on a generic guideline. The court disapproved of the NRC’s continued relicensing of nuclear facilities based on the assumption of a long-term geologic repository that in reality did not exist–and the NRC said it was suspending licensing pending a new rule–but now regulators say they don’t anticipate the denial or even the delay of any reactor license application while they await the new waste confidence decision (PDF pp49-50).

In fact, the NRC has continued the review process on pending applications, even though there is now no working NWCD–something deemed essential by the courts–against which to evaluate new licenses.

The period for public comment on the scope of the waste confidence decision ended January 2, and no more scoping hearings are planned. There will be other periods for civic involvement–during the environmental impact survey and rulemaking phases–but, with each step, the areas open to input diminish. And the current schedule has the entire process greatly accelerated over previous revisions.

On January 3, a coalition of 24 grassroots environmental groups filed documents with the Nuclear Regulatory Commission (PDF) protesting “the ‘hurry up’ two-year timeframe” for this assessment, noting that the time allotted for environmental review falls far short of the NRC technical staff’s own estimate, which projected the work would run to 2019. The coalition observed that two years was also not enough time to integrate post-Fukushima recommendations, and that the NRC was narrowing the scope of the decision–ignoring specific instructions from the Appeals Court–in order to accelerate the drafting of a new waste storage rule.

Speed might seem a valuable asset if the NRC were shepherding a Manhattan Project-style push for a solution to the ever-growing waste problem–the one that began with the original Manhattan Project–but that is not what is at work here. Instead, the NRC, under court order, is trying to set the rules for determining the risk of all that high-level radioactive waste if there is no new, feasible solution. The NRC is looking for a way to permit the continued operation of the US nuclear fleet–and so the continued manufacture of nuclear waste–without an answer to the bigger, pressing question.

A plan called HOSS

While there is much to debate about what a true permanent solution to the nuclear waste problem might look like, there is little question that the status quo is unacceptable. Spent fuel pools were never intended to be used as they are now used–re-racked and densely packed with over a generation of fuel assemblies. Both the short- and long-term safety and security of the pools have now been questioned by the courts and laid bare by reality. Pools at numerous US facilities have leaked radioactive waste (PDF) into rivers, groundwater and soil. Sudden “drain downs” have come perilously close to triggering major accidents in plants shockingly close to major population centers. Recent hurricanes have knocked out power to cooling systems and flooded backup generators, and last fall’s superstorm came within inches of overwhelming the coolant intake structure at Oyster Creek in New Jersey.

The crisis at Japan’s Fukushima Daiichi facility was so dangerous and remains dangerous to this day in part because of the large amounts of spent fuel stored in pools next to the reactors but outside of containment–a design identical to 35 US nuclear reactors. A number of these GE Mark 1 Boiling Water Reactors–such as Oyster Creek and Vermont Yankee–have more spent fuel packed into their individual pools than all the waste in Fukushima Daiichi Units 1, 2, 3, and 4 combined.

Dry casks, the obvious next “less-bad” option for high-level radioactive waste, were also not supposed to be a permanent panacea. The design requirements and manufacturing regulations of casks–especially the earliest generations–do not guarantee their reliability anywhere near the 100 to 300 years now being casually tossed around by NRC officials. Some of the nation’s older dry casks (which in this case means 15 to 25 years) have already shown seal failures and structural wear (PDF). Yet, the government does not require direct monitoring of casks for excessive heat or radioactive leaks–only periodic “walkthroughs.”

Add in the reluctance of plant operators to spend money on dry cask transfer and the lack of any workable plan to quickly remove radioactive fuel from failed casks, and dry cask storage also appears to fail to attain any court-ordered level of confidence.

Interim plans, such as regional consolidated above-ground storage, remain just that–plans. There are no sites selected and no designs for such a facility up for public scrutiny. What is readily apparent, though, is that the frequent transport of nuclear waste increases the risk of nuclear accidents. There does not, as of now, exist a transfer container that is wholly leak proof, accident proof, and impervious to terrorist attack. Moving high-level radioactive waste across the nation’s highways, rail lines and waterways has raised fears of “Mobile Chernobyls” and “Floating Fukushimas.”

More troubling still, if past (and present) is prologue, is the tendency of options designed as “interim” to morph into a default “permanent.” Can the nation afford to kick the can once more, spending tens (if not hundreds) of millions of dollars on a “solution” that will only add a collection of new challenges to the existing roster of problems? What will the interim facilities become beyond the next problem, the next site for costly mountains of poorly stored, dangerous waste?

Hardened: The more robust HOSS option as proposed in 2003. (From “Robust Storage of Spent Nuclear Fuel: A Neglected Issue of Homeland Security” courtesy of the Nuclear Information and Resource Service)

If there is an interim option favored by many nuclear experts, engineers and environmentalists (PDF), it is something called HOSS–Hardened On-Site Storage (PDF). HOSS is a version of dry cask storage that is designed and manufactured to last longer, is better protected against leaks and better shielded from potential attacks. Proposals (PDF) involve steel, concrete and earthen barriers incorporating proper ventilation and direct monitoring for heat and radiation.

But not all reactor sites are good candidates for HOSS. Some are too close to rivers that regularly flood, some are vulnerable to the rising seas and increasingly severe storms brought on by climate change, and others are close to active geologic fault zones. For facilities where hardened on-site storage would be an option, nuclear operators will no doubt fight the requirements because of the increased costs above and beyond the price of standard dry cask storage, which most plant owners already try to avoid or delay.

The first rule of holes

Mixed messages: A simple stone marker in Red Gate Woods, just outside Chicago, tries to both warn and reassure visitors to this public park. (Photo: Kevin Kamps, Beyond Nuclear. Used by permission.)

In a wooded park just outside Chicago sits a dirt mound, near a bike path, that contains parts of the still-highly radioactive remains of CP-1, the world’s first atomic pile. Seven decades after that nuclear fuel was first buried, many health experts would not recommend that spot (PDF) for a long, languorous picnic, nor would they recommend drinking from nearby water fountains. To look at it in terms Arthur Compton might favor, when it comes to the products of nuclear chain reactions, the natives are restless. . . and will remain so for millennia to come.

One can perhaps forgive those working in the pressure cooker of the Manhattan Project and in the middle of a world war for ignoring the forest for the trees–for not considering waste disposal while pursuing a self-sustaining nuclear chain reaction. Perhaps. But, as the burial mound in Red Gate Woods reminds us, ignoring a problem does not make it go away.

And if that small pile, or the mountains of spent fuel precariously stored around the nation are not enough of a prompt, the roughly $960 million that the federal government has had to pay private nuclear operators should be. For every year that the Department of Energy does not provide a permanent waste repository–or at least some option that takes the burden of storing spent nuclear fuel off the hands (and off the books) of power companies–the government is obligated to reimburse the industry for the costs of onsite waste storage. By 2020, it is estimated that $11 billion in public money will have been transferred into the pockets of private nuclear companies. By law, these payments cannot be drawn from the ratepayer-fed fund that is earmarked for a permanent geologic repository, and so, these liabilities must be paid out of the federal budget. Legal fees for defending the DoE against these claims will add another 20 to 30 percent to settlement costs.
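
To get a feel for what that legal overhead means in dollars, here is a quick back-of-the-envelope calculation using only the estimates cited above (the arithmetic is mine and is meant to show scale, not to forecast a budget line).

```python
# Rough projection of the waste-storage liability using the figures cited
# above. The $11 billion estimate and the 20-30 percent legal overhead are
# as reported; this just multiplies them out to show the scale involved.

PROJECTED_SETTLEMENTS_BY_2020 = 11e9   # dollars, per the estimate above
LEGAL_OVERHEAD = (0.20, 0.30)          # DoE defense costs as a share of settlements

low = PROJECTED_SETTLEMENTS_BY_2020 * (1 + LEGAL_OVERHEAD[0])
high = PROJECTED_SETTLEMENTS_BY_2020 * (1 + LEGAL_OVERHEAD[1])

print(f"Settlements plus legal costs by 2020: ${low / 1e9:.1f}B to ${high / 1e9:.1f}B")
```

Call it somewhere between $13 billion and a bit over $14 billion, all of it drawn from the federal budget rather than from the ratepayer-fed repository fund.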

The Federal Appeals Court, too, has sent a clear message that the buck needs to stop somewhere at some point–and that such a time and place should be both explicit and realistic. The nuclear waste confidence scoping process, however, is already giving the impression that the NRC’s next move will be generic and improbable.

The late, great Texas journalist Molly Ivins once remarked, “The first rule of holes” is “when you’re in one, stop digging.” For high-level radioactive waste, that hole is now a mountain, over 70 years in the making and over 70,000 tons high. If the history of the atomic age is not evidence enough, the implications of the waste confidence decision process put the current crisis in stark relief. There is, right now, no good option for dealing with the nuclear detritus currently on hand, and there is not even a plan to develop a good option in the near future. Without a way to safely store the mountain of waste already created, under what rationale can a responsible government permit the manufacture of so much more?

The federal government spends billions to perpetuate and protect the nuclear industry–and plans to spend billions more to expand the number of commercial reactors. Dozens of facilities already are past, or are fast approaching, the end of their design lives, but the Nuclear Regulatory Commission has yet to reject any request for an operating license extension–and it is poised to approve many more, nuclear waste confidence decision notwithstanding. Plant operators continue to balk at any additional regulations that would require better waste management.

The lesson of the first 70 years of fission is that we cannot endure more of the same. The government–from the DoE to the NRC–should reorient its priorities from creating more nuclear waste to safely and securely containing what is now here. Money slated for subsidizing current reactors and building new ones would be better spent on shuttering aging plants, designing better storage options for their waste, modernizing the electrical grid, and developing sustainable energy alternatives. (And reducing demand through conservation programs should always be part of the conversation.)

Enrico Fermi might not have foreseen (or cared about) the mountain of waste that began with his first atomic pile, but current scientists, regulators and elected officials have the benefit of hindsight. If the first rule of holes says stop digging, then the dictum here should be that when you’re trying to summit a mountain, you don’t keep shoveling more garbage on top.

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

New Fukushima Video Shows Disorganized Response, Organized Deception

A frame from early in the newly released Fukushima video.

Tokyo Electric Power Company (TEPCO), the operator of the Fukushima Daiichi nuclear power plant when the Tohoku earthquake and tsunami struck last year, bowed to public and government pressure this week, releasing 150 hours of video recorded during the first days of the Fukushima crisis. Even with some faces obscured and two-thirds of the audio missing, the tapes clearly show a nuclear infrastructure wholly unprepared for the disaster, and an industry and government wholly determined to downplay that disaster’s severity:

Though incomplete, the footage from a concrete bunker at the plant confirms what many had long suspected: that the Tokyo Electric Power Company, the plant’s operator, knew from the early hours of the crisis that multiple meltdowns were likely despite its repeated attempts in the weeks that followed to deny such a probability.

It also suggests that the government, during one of the bleakest moments, ordered the company not to share information with the public, or even local officials trying to decide if more people should evacuate.

Above all, the videos depict mayhem at the plant, a lack of preparedness so profound that too few buses were on hand to carry workers away in the event of an evacuation. They also paint a close-up portrait of the man at the center of the crisis, Mr. Yoshida, who galvanizes his team of engineers as they defy explosions and fires — and sometimes battle their own superiors.

That summary is from New York Times Tokyo-based reporter Hiroko Tabuchi. The story she tells is compelling and terrifying, and focuses on the apparent heroism of Masao Yoshida, Fukushima’s chief manager when the crisis began, along with the far less estimable behavior of TEPCO and Japanese government officials. It is worth spending a couple of clicks from your monthly quota to read it all the way through.

The story is but one take on the video, and I point this out not because I question Tabuchi’s reporting on its content, much of which is consistent with what is already known about the unholy alliance between the nuclear industry and the Japanese government, and about what those parties did to serve their own interests at the expense of the Japanese people (and many others across the northern hemisphere). Instead, I bring this up because I do not myself speak Japanese, and I am only allowed to view a 90-minute “highlight reel” and not the entire 150 hours of video, and so I am dependent on other reporters’ interpretations. And because neither TEPCO nor the Japanese government (which now essentially owns TEPCO) has yet proven to be completely open or honest on matters nuclear, the subtle differences in those interpretations matter.

Tabuchi took to Twitter to say how much she wanted to tell the story as “a tribute to Fukushima Daiichi chief Yoshida and the brave men on the ground who tried to save us.” But in a separate tweet, Tabuchi said she was “heartbroken” to discover her article was cut in half.

Editing is, of course, part of journalism. Trimming happens to many stories in many papers. But I had to raise an eyebrow when I saw a note at the bottom of Tabuchi’s piece that said Matthew Wald “contributed reporting from Washington.” I have previously been critical of Wald–a Times veteran, contributor to their Green blog, and often their go-to reporter on nuclear power–for stories that sometimes read like brochures from the Nuclear Energy Institute. Wald tends to perpetuate myths in line with the old “clean, safe, and too cheap to meter” saw, while reserving a much, uh, healthier (?) skepticism for nuclear power critics and renewable energy advocates.

There is, of course, no way to know what Wald’s contributions (or redactions) were in this case, and it is doubtful any of the parties involved would tell us, but what particularly stokes my curiosity is this paragraph:

Despite the close-up view of the disaster, the videos — which also capture teleconferences with executives in Tokyo — leave many questions unresolved, in good part because only 50 of 150 hours include audio. The company blamed technical problems for the lack of audio.

TEPCO might blame technical problems, but reports from other news services leave little doubt that the audio has been withheld–or, in some cases, quite obviously obscured–by TEPCO. The BBC’s Mariko Oi saw it this way:

Tepco has bowed to pressure to release 150 hours of teleconferencing footage but the tape was heavily edited and mostly muted to “protect employees’ privacy”.

. . . .

Tepco is again under criticism for not releasing the full recordings and has been asked if it was removing more than employees’ names and phone numbers.

And Mari Yamaguchi of the Associated Press reported even more directly about TEPCO’s intent:

Japan’s former prime minister criticized the tsunami-hit nuclear plant’s operator Wednesday for heavily editing the limited video coverage it released of the disaster, including a portion in which his emotional speech to utility executives and workers was silenced.

Naoto Kan called for Tokyo Electric Power Co. to release all of its video coverage, beyond the first five days. Two-thirds of the 150 hours of videos it released Monday are without sound, including one segment showing Kan’s visit to the utility’s headquarters on March 15 last year, four days after a tsunami critically damaged three reactors at the Fukushima Dai-ichi power plant.

Many people’s faces, except for the plant chief and top executives in Tokyo, are obscured in the videos and frequent beeps mask voices and other sound.

The AP story also points out that the released video arbitrarily ends at midnight on March 15–and though it is not known how much more tape exists, it appears clear that TEPCO has held some substantial portion back. After five days, the Fukushima crisis was far from over, after all (as it is still far from over), and the recordings end amidst some of the disaster’s most critical events.

But the New York Times omits all of this, leaving TEPCO’s Rose Mary Woods-like excuse to stand as the innocent truth.

That’s a shame, because the way you read this story changes when you look at some of the horrific revelations while keeping in mind that this is only the part TEPCO decided it could let you see. Here are just a few highlights. . . or lowlights:

  • Plant managers and TEPCO officials were aware from the earliest hours of the crisis that they were likely facing multiple meltdowns.
  • Japanese government officials withheld information–and ordered TEPCO to withhold information–on radiation levels that could have helped untold numbers of civilians reduce their exposure.
  • Despite warnings years prior that such natural disasters were possible in the region, Fukushima operators had no plan to deal with the damage and loss of power caused by the quake and tsunami.
  • TEPCO did not even have the infrastructure or procedures in place to evacuate its own employees from an imperiled facility.
  • Plant officials were–from the earliest days–as worried about the spent fuel pools as they were about the reactors. Those on the scene feared that most of the pools at Daiichi, not just the one at reactor four, were facing loss of coolant and the fires and massive radiation leaks that would follow, though publicly they said none of the pools were a danger at the time.

And there is more about the dire conditions for plant workers, the lack of food or water, the high levels of radiation exposure, and even a point where employees had to pool their cash to buy water and gasoline. And, as noted above, that’s just the part TEPCO has deemed acceptable for release.

Above all, though–beyond the discrepancies in reporting, beyond the moral failings of TEPCO and government officials, beyond the heroism of those at the crippled facility–what the new Fukushima tapes reveal is what those who watch the nuclear industry have mostly known all along. Nuclear power is dangerous–the radiation, the complexity of the system, the waste, the reliance on everything going right, and the corrupt conspiracy between industry and government saddle this form of energy production with unacceptable risks. The video now available might shed some light on how things at Fukushima went horribly wrong, but the entire world already knows plenty of who, what, where and when. We all know that things at Fukushima did go horribly wrong, and so many know that they must suffer because of it.

Made in Japan? Fukushima Crisis Is Nuclear, Not Cultural

(photo: Steve Snodgrass)

Since the release of the Fukushima Nuclear Accident Independent Committee’s official report last week, much has been made of how it implicates Japanese culture as one of the root causes of the crisis. The committee’s chairman, Dr. Kiyoshi Kurokawa, makes the accusation quite plainly in the opening paragraphs of the executive summary [PDF]:

What must be admitted – very painfully – is that this was a disaster “Made in Japan.” Its fundamental causes are to be found in the ingrained conventions of Japanese culture: our reflexive obedience; our reluctance to question authority; our devotion to ‘sticking with the program’; our groupism; and our insularity.

That this apparently critical self-examination was seized upon by much of the western media’s coverage of the report probably does not come as a surprise–especially when you consider that this revelation falls within the first 300 words of an 88-page document. Cultural stereotypes and incomplete reads are hardly new to establishment reportage. What might come as a shock, however, is that this painful admission is only made in the English-language version of the document, and only in the chairman’s introduction is the “made in Japan” conclusion drawn so specifically.

What replaces the cultural critique in the Japanese edition and in the body of the English summary is a ringing indictment of the cozy relationship between the Japanese nuclear industry and the government agencies that were supposed to regulate it. This “regulatory capture,” as the report details, is certainly central to the committee’s findings and crucial to understanding how the Fukushima disaster is a manmade catastrophe, but it is not unique to the culture of Japan.

Indeed, observers of the United States will recognize this lax regulatory construct as part-and-parcel of problems that threaten the safety and health of its citizenry, be it in the nuclear sector, the energy sector as a whole, or across a wide variety of officially regulated industries.

No protection

The Japanese Diet’s Fukushima report includes a healthy dose of displeasure with the close ties between government regulators and the nuclear industry they were supposed to monitor. The closed, insular nature of nuclear oversight that might be attributed to Japanese culture by a superficial read is, in fact, a product of the universally familiar “revolving door” that sees industry insiders taking turns as government bureaucrats, and regulatory staff “graduating” to well-compensated positions in the private sector.

Mariko Oi, a reporter at the BBC’s Tokyo bureau, described the situation this way when discussing the Fukushima report on the World Service:

When there was a whistleblower, the first call that the government or the ministry made was to TEPCO, saying, “Hey, you’ve got a whistleblower,” instead of “Hey, you’ve got a problem at the nuclear reactor.”

A disturbing betrayal of accountability in any context, it is especially troubling with the ominous repercussions of the Fukushima disaster still metastasizing. And it is also ominously familiar.

Look, for example, just across the Pacific:

[San Onofre Nuclear Generating Station] was chastised two years ago by the U.S. Nuclear Regulatory Commission for creating an atmosphere in which employees fear retaliation if they report safety concerns.

. . . .

Edward Bussey, a former health physics technician at the plant, sued Edison in state court after he was fired in 2006 under what he said were trumped-up charges that he had falsified initials on logs documenting that certain materials had been checked for radiation. Bussey contended that he was really fired in retaliation for complaining about safety concerns to his supervisors and the NRC.

San Onofre–SONGS, if you will–has been offline since January, when a radioactive steam leak led to the discovery of severely degraded alloy tubing in the steam generators of both of the plant’s operating reactors. But here’s the real kicker: whistleblower suits at SONGS, like the one from Mr. Bussey, have routinely been summarily dismissed thanks to a little-known legal loophole:

San Onofre is majority owned and operated by Southern California Edison, a private company, but it sits on land leased from the Camp Pendleton Marine Corps base.

That puts the plant in a so-called federal enclave, where courts have held that many California laws, including labor laws intended to protect whistle-blowers, do not apply.

Lawsuits filed in state court by San Onofre workers who claimed that they were fired or retaliated against for reporting safety concerns, sexual harassment and other issues have been tossed out because of the plant’s location.

The Los Angeles Times cites examples dating back to the construction of San Onofre where personnel who complained about safety or work conditions were terminated and left without many of the legal options normally afforded most California citizens. The history of SONGS is liberally peppered with accidents and safety breaches–and the lies and cover-ups from its owner-operators that go with them. Considering that San Onofre employees are regularly punished for exposing problems and have fewer whistleblower protections, is it at all surprising that SONGS is reported to have the worst safety record of all US nuclear plants?

If San Onofre’s track record isn’t evidence enough of the dangers of weak regulation, the findings and conclusions of the latest Fukushima report make it crystal clear: “safety culture” is not undermined by Japanese culture so much as it is by the more international culture of corruption born of the incestuous relationship between industry and regulators.

It’s a nuclear thing…

But the corrupt culture–be it national or universal–is itself a bit of a dodge. As noted by the Financial Times, the Japanese and their regulatory structure have managed to operate the technologically complex Shinkansen bullet trains since 1964 without a single fatal derailment or collision.

As the Diet’s report makes abundantly clear–far more clear than any talk about Japanese culture–the multiple failures at and around Fukushima Daiichi were directly related to the design of the reactors and to fatal flaws inherent in nuclear power generation.

Return for a moment to something discussed here last summer, The Light Water Paradox: “In order to safely generate a steady stream of electricity, a light water reactor needs a steady stream of electricity.” As previously noted, this is not some perpetual motion riddle–all but one of Japan’s commercial nuclear reactors and every operating reactor in the United States are of designs that require water to be actively pumped through the reactor containment in order to keep the radioactive fuel cool enough to prevent a string of catastrophes, from hydrogen explosions and cladding fires, to core meltdowns and melt-throughs.

Most of the multiple calamities to befall Fukushima Daiichi have their roots in the paradox. As many have observed and the latest Japanese report reiterates, the Tohoku earthquake caused breaches in reactor containment and cooling structures, and damaged all of Fukushima’s electrical systems, save the diesel backup generators, which were in turn taken out by the tsunami that followed the quake. Meeting the demands of the paradox–circulating coolant in a contained system–was severely compromised after the quake, and was rendered completely impossible after the tsunami. Given Japan’s seismic history, and the need of any light water reactor for massive amounts of water, Fukushima wouldn’t really have been a surprise even if scientists hadn’t been telling plant operators and Japanese regulators about these very problems for the last two decades.
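
To put a rough number on why the paradox matters, here is a minimal sketch–not anything from the Diet’s report–that uses the Way-Wigner approximation, a standard back-of-the-envelope formula for the heat a reactor core keeps producing after it shuts down. The thermal rating and operating time below are assumed round figures, not data for any particular unit.

```python
# Rough decay-heat estimate after reactor shutdown (illustrative sketch only).
# Uses the Way-Wigner approximation: P/P0 ~ 0.066 * (t**-0.2 - (t + T)**-0.2),
# where t = seconds since shutdown and T = seconds the reactor ran at power.
# The thermal rating and operating time are assumed round numbers, not figures
# for any specific Fukushima or US unit.

def decay_heat_fraction(t_seconds: float, operating_seconds: float) -> float:
    """Fraction of pre-shutdown thermal power still produced at time t."""
    return 0.066 * (t_seconds**-0.2 - (t_seconds + operating_seconds)**-0.2)

THERMAL_POWER_MW = 3000.0            # assumed thermal rating of a large reactor
OPERATING_SECONDS = 365 * 24 * 3600  # assume roughly one year at full power

for label, t in [("1 hour", 3_600), ("1 day", 86_400), ("1 week", 604_800)]:
    frac = decay_heat_fraction(t, OPERATING_SECONDS)
    print(f"{label} after shutdown: ~{frac * 100:.1f}% of full power "
          f"(~{frac * THERMAL_POWER_MW:.0f} MW of heat still to remove)")
```

Even a full day after shutdown, a large reactor is still throwing off on the order of ten megawatts of heat, which is why losing the pumps, and the power that drives them, turns a shut-down reactor into a crisis.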

Back at San Onofre, US regulators disclosed Thursday that the damage to the metal tubes that circulate radioactive water between the reactor and the steam turbines (in other words, part of the system that takes heat away from the core) was far more extensive than plant operators had previously acknowledged:

[Each of San Onofre’s steam generators has] 9,727 U-shaped tubes inside, each three-quarters of an inch in diameter.

The alloy tubes represent a critical safety barrier — if one breaks, there is the potential that radioactivity could escape into the atmosphere. Also, serious leaks can drain protective cooling water from a reactor.

Gradual wear is common in such tubing, but the rate of erosion at San Onofre startled officials since the equipment is relatively new. The generators were replaced in a $670 million overhaul and began operating in April 2010 in Unit 2 and February 2011 in Unit 3.

Tubes have to be taken out of service if 35 percent — roughly a third — of the wall wears away, and each of the four generators at the plant is designed to operate with a maximum of 778 retired tubes.

In one troubled generator in Unit 3, 420 tubes have been retired. The records show another 197 tubes in that generator have between 20 percent and 34 percent wear, meaning they are close to reaching the point when they would be at risk of breaking.

More than 500 others in that generator have between 10 percent and 19 percent wear in the tube wall.

“The new data reveal that there are thousands of damaged tubes in both Units 2 and 3, raising serious questions whether either unit should ever be restarted,” said Daniel Hirsch, a lecturer on nuclear policy at the University of California, Santa Cruz, who is a critic of the industry. “The problem is vastly larger than has been disclosed to date.”
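
For a rough sense of how much margin is left, here is a quick back-of-the-envelope look at the Unit 3 figures quoted above (the numbers are as reported; the arithmetic is mine).

```python
# Back-of-the-envelope look at the Unit 3 steam generator figures quoted above.
# All inputs come from the reporting cited in this article; the percentages
# below are simple arithmetic on those quoted numbers.

TUBES_PER_GENERATOR = 9727    # U-shaped tubes in each steam generator
MAX_RETIRED_BY_DESIGN = 778   # design limit on plugged ("retired") tubes
RETIRED_SO_FAR = 420          # tubes already retired in one Unit 3 generator
NEAR_THRESHOLD = 197          # tubes reported at 20-34 percent wall wear
MODERATE_WEAR = 500           # "more than 500" tubes at 10-19 percent wear

budget_used = RETIRED_SO_FAR / MAX_RETIRED_BY_DESIGN
if_worn_also_fail = (RETIRED_SO_FAR + NEAR_THRESHOLD) / MAX_RETIRED_BY_DESIGN
share_with_damage = (RETIRED_SO_FAR + NEAR_THRESHOLD + MODERATE_WEAR) / TUBES_PER_GENERATOR

print(f"Plugging budget already used: {budget_used:.0%}")
print(f"If the {NEAR_THRESHOLD} heavily worn tubes must also be retired: {if_worn_also_fail:.0%}")
print(f"Share of that generator's tubes already retired or showing reported wear: {share_with_damage:.0%}")
```

By those figures, the single most degraded generator had already used up more than half of its lifetime plugging allowance after roughly a year in service.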

And if anything, the Nuclear Regulatory Commission is underplaying the problem. A report from Fairewinds Associates, also released this week, unfavorably compared San Onofre’s situation with similar problems at other facilities:

[SONGS] has plugged 3.7 times as many steam generator tubes than the combined total of the entire number of plugged replacement steam generator tubes at all the other nuclear power plants in the US.

The report also explains that eight of the tubes failed a “pressure test” at San Onofre, while the same test at other facilities had never triggered any more than one tube breach. Fairewinds goes on to note that both units at San Onofre are equally precarious, and that neither can be restarted with any real promise of safe operation.

And while the rapid degeneration of the tubing might be peculiar to San Onofre, the dangers inherent in a system that requires constant power for constant cooling–lest a long list of possible problems triggers a toxic crisis–are evident across the entire US nuclear fleet. Cracked containment buildings, coolant leaks, transformer fires, power outages, and a vast catalogue of human errors fill the NRC’s event reports practically every month of every year for the past 40 years. To put it simply, with nuclear power, too much can go wrong when everything has to go right.

And this is to say nothing of the dangers that come with nuclear waste storage. As with the reactors, the spent fuel pools that dot the grounds of almost every nuclear plant in America and Japan require a consistent and constantly circulating water supply to keep them from overheating (which would result in many of the same disastrous outcomes seen with damaged reactors). At Fukushima, one of the spent fuel pools is, at any given point, as much of a concern as the severely damaged reactor cores.

Ions and tigers and bears, oh my!

Even with the latest findings, however, Japanese Prime Minister Yoshihiko Noda pushed ahead with the restart of the precariously situated and similarly flawed nuclear reactor complex at Oi. It is as if the PM and the nuclear industry feared that Japan’s surviving another summer without nuclear-generated electricity would demonstrate once and for all that the country had no reason to trade so much of its health and safety for an unnecessary return.

But the people of Japan seem to see it differently. Tens of thousands have turned out to demonstrate against their nation’s slide back into this dangerous culture of corruption. (Remember, the Oi restart comes without any safety upgrades made in response to the Fukushima disaster.)

And maybe there’s where cultural distinctions can be drawn. In Japan, the citizenry–especially women–are not demonstrating “reflexive obedience”; instead, they are demonstrating. In the United States, where 23 nuclear reactors are of the same design as Fukushima Daiichi, and 184 million people live within 50 miles of a nuclear power plant, when the chairman of the Nuclear Regulatory Commission suggested requiring some modest upgrades as a response to the Fukushima disaster, the nuclear industry got its henchmen on the NRC and in Congress to push him out. . . with little public outcry.

Still, the BBC’s Mariko Oi lamented on the day the Fukushima report was released that Japanese media was paying more attention to the birth of a giant panda at a Tokyo zoo. That sort of response would seem all too familiar to any consumer of American media.

That baby panda, it should be noted, has since died. The radioactive fallout from Fukushima, however, lingers, and the crisis at Daiichi is far from over. The threat to global health and safety that is unique to nuclear power lives on.

Fukushima Nuclear Disaster “Man-Made,” Reports Japanese Panel; Quake Damaged Plant Before Tsunami

Aerial view of the Oi Nuclear Power Plant, Fukui Prefecture, Japan. (photo: Japan Ministry of Land, Infrastructure and Transport via Wikipedia)

The massive disaster at the Fukushima Daiichi nuclear facility that began with the March 11, 2011 Tohoku earthquake and tsunami could have been prevented and was likely made worse by the response of government officials and plant owners, so says a lengthy report released today by the Japanese Diet (their parliament).

The official report of The Fukushima Nuclear Accident Independent Investigation Committee [PDF] harshly criticizes the Japanese nuclear industry for avoiding safety upgrades and disaster plans that could have mitigated much of what went wrong after a massive quake struck the northeast of Japan last year. The account also includes direct evidence that Japanese regulatory agencies conspired with TEPCO (Fukushima’s owner-operator) to help them forestall improvements and evade scrutiny:

The TEPCO Fukushima Nuclear Power Plant accident was the result of collusion between the government, the regulators and TEPCO, and the lack of governance by said parties. They effectively betrayed the nation’s right to be safe from nuclear accidents.

. . . .

We found evidence that the regulatory agencies would explicitly ask about the operators’ intentions whenever a new regulation was to be implemented. For example, NISA informed the operators that they did not need to consider a possible station blackout (SBO) because the probability was small and other measures were in place. It then asked the operators to write a report that would give the appropriate rationale for why this consideration was unnecessary.

The report also pointed to Japanese cultural conventions, namely the reluctance to question authority–a common refrain in many post-Fukushima analyses.

But perhaps most damning, and most important to the future of Japan and to the future of nuclear power worldwide, is the Investigation’s finding that parts of the containment and cooling systems at Fukushima Daiichi were almost certainly damaged by the earthquake before the mammoth tsunami caused additional destruction:

We conclude that TEPCO was too quick to cite the tsunami as the cause of the nuclear accident and deny that the earthquake caused any damage.

. . . .

[I]t is impossible to limit the direct cause of the accident to the tsunami without substantive evidence. The Commission believes that this is an attempt to avoid responsibility by putting all the blame on the unexpected (the tsunami), as they wrote in their midterm report, and not on the more foreseeable earthquake.

Through our investigation, we have verified that the people involved were aware of the risk from both earthquakes and tsunami. Further, the damage to Unit 1 was caused not only by the tsunami but also by the earthquake, a conclusion made after considering the facts that: 1) the largest tremor hit after the automatic shutdown (SCRAM); 2) JNES confirmed the possibility of a small-scale LOCA (loss of coolant accident); 3) the Unit 1 operators were concerned about leakage of coolant from the valve, and 4) the safety relief valve (SR) was not operating.

Additionally, there were two causes for the loss of external power, both earthquake-related: there was no diversity or independence in the earthquake-resistant external power systems, and the Shin-Fukushima transformer station was not earthquake resistant.

As has been discussed here many times, the nuclear industry and its boosters in government like to point to the “who could have possibly imagined,” “one-two punch” scenario of quake and tsunami to both vouch for the safety of other nuclear facilities and counter any call for reexamination and upgrades of existing safety systems. Fukushima, however, has always proved the catastrophic case study that actually countered this argument–and now there is an exhaustive study to buttress the point.

First, both the quake and the tsunami were far from unpredictable. The chances of each–as well as the magnitude–were very much part of predictions made by scientists and government bureaucrats. There is documentation that Japanese regulators knew and informed their nuclear industry of these potential disasters, but then looked the other way or actively aided the cause as plant operators consistently avoided improving structures, safety systems and accident protocols.

Second, even if there had not been a tsunami, Fukushima Daiichi would have still been a disaster. While the crisis was no doubt exacerbated by the loss of the diesel generators and the influx of seawater, the evidence continues to mount that reactor containment was breached and cooling systems were damaged by the earthquake first. Further, it was the earthquake that damaged all the electrical systems and backups aside from the diesel generators, and there is no guarantee that all generators would have worked flawlessly for their projected life-spans, that the other external and internal power systems could have been restored quickly, or that enough additional portable power could have been trucked in to the facility in time to prevent further damage. In fact, much points to less than optimal resolution of all of these problems.

To repeat, there was loss of external power, loss of coolant, containment breach, and release of radiation after the quake, but before the tsunami hit the Fukushima nuclear plant.

And now for the bad news. . . .

And yet, as harsh as this new report is (and it is even more critical than was expected, which is actually saying something), on first reading, it still appears to pull a punch.

Though the failure of the nuclear reactors and their safety systems is now even further documented in this report, its focus on industry obstruction and government collusion continues in some ways to perpetuate the “culture of safety” myth. By labeling the Fukushima disaster as “Made in Japan,” “manmade” and “preventable,” the panel–as we are fond of saying here–assumes a can opener. By talking up all that government and industry did wrong in advance of March 11, 2011, by critiquing all the lies and crossed signals after the earthquake and tsunami, and by recommending new protocols and upgrades, the Japanese report fiats a best-case scenario for a technology that has consistently proven that no such perfect plan exists.

The facts were all there before 3/11/11, and all the revelations since just add to the atomic pile. Nuclear fission is a process that has to go flawlessly to consistently provide safe and economical electrical power–but the process is too complex, and relies on too many parts, too many people and too volatile a fuel for that to ever really happen. Add in the costs and hazards of uranium mining, transport, fuel milling, and waste storage, and nuclear again proves itself to be dirty, dangerous, and disgustingly expensive.

* * *

And, as if to put an exclamation point at the end of the Diet’s report (and this column), the Japanese government moved this week to restart the nuclear plant at Oi, bringing the No. 3 reactor online just hours before the release of the new Fukushima findings. The Oi facility rests on a fault line, and seismologists, nuclear experts and activists have warned that this facility is at risk much in the way Fukushima Daiichi proved to be.

Most of Japan’s reactors were taken offline following the Tohoku quake, with the last of them–the Oi plant–shut down earlier this year. In the wake of the disaster, Japan’s then-Prime Minister, Naoto Kan, suggested that it might be time for his country to turn away from nuclear power. Demonstrators across Japan seemed to agree and urged Kansai Electric Power Company and current Prime Minister Yoshihiko Noda to delay the restart of Oi. But the government seemed to be hurrying to get Oi back up, despite many questions and several technical glitches.

Noda insists the rush is because of the need for electricity during the hot summer months, but Japan managed surprisingly well last summer (when more of the country’s infrastructure was still damaged from the quake and tsunami) with better conservation and efficiency measures. Perhaps release of this new report provides a more plausible explanation for the apparent urgency.

Something Fishy: CRS Report Downplays Fukushima’s Effect on US Marine Environment

(photo: JanneM)

Late Thursday, the United States Coast Guard reported that they had successfully scuttled the Ryou-Un Maru, the Japanese “Ghost Ship” that had drifted into US waters after being torn from its moorings by the tsunami that followed the Tohoku earthquake over a year ago. The 200-foot fishing trawler, which was reportedly headed for scrap before it was swept away, was seen as potentially dangerous as it drifted near busy shipping lanes.

Coincidentally, the “disappearing” of the Ghost Ship came during the same week the Congressional Research Service (CRS) released its report on the effects of the Fukushima Daiichi nuclear disaster on the US marine environment, and, frankly, the metaphor couldn’t be more perfect. The Ryou-Un Maru is now resting at the bottom of the ocean–literally nothing more to see there, thanks to a few rounds from a 25mm Coast Guard gun–and the CRS hopes to dispatch fears of the radioactive contamination of US waters and seafood with the same alacrity.

But while the Ghost Ship was not considered a major ecological threat (though it did go down with around 2,000 gallons of diesel fuel in its tanks), the US government acknowledges that this “good luck ship” (a rough translation of its name) is an early taste of the estimated 1.5 million tons of tsunami debris expected to hit North American shores over the next two or three years. Similarly, the CRS report (titled Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment [PDF]) adopts an overall tone of “no worries here–it’s all under control,” but a closer reading reveals hints of “more to come.”

Indeed, the report feels as if it were put through a political rinse cycle, limited both in the strength of its language and the scope of its investigation. This tension is evident right from the start–take, for example, these three paragraphs from the report’s executive summary:

Both ocean currents and atmospheric winds have the potential to transport radiation over and into marine waters under U.S. jurisdiction. It is unknown whether marine organisms that migrate through or near Japanese waters to locations where they might subsequently be harvested by U.S. fishermen (possibly some albacore tuna or salmon in the North Pacific) might have been exposed to radiation in or near Japanese waters, or might have consumed prey with accumulated radioactive contaminants.

High levels of radioactive iodine-131 (with a half-life of about 8 days), cesium-137 (with a half-life of about 30 years), and cesium-134 (with a half-life of about 2 years) were measured in seawater adjacent to the Fukushima Dai-ichi site after the March 2011 events. EPA rainfall monitors in California, Idaho, and Minnesota detected trace amounts of radioactive iodine, cesium, and tellurium consistent with the Japanese nuclear incident, at concentrations below any level of concern. It is uncertain how precipitation of radioactive elements from the atmosphere may have affected radiation levels in the marine environment.

Scientists have stated that radiation in the ocean very quickly becomes diluted and would not be a problem beyond the coast of Japan. The same is true of radiation carried by winds. Barring another unanticipated release, radioactive contaminants from Fukushima Dai-ichi should be sufficiently dispersed over time that they will not prove to be a serious health threat elsewhere, unless they bioaccumulate in migratory fish or find their way directly to another part of the world through food or other commercial products.

Winds and currents have “the potential” to transport radiation into US waters? Winds–quite measurably–already have, and computer models show that currents, over the next couple of years, most certainly will.

Are there concentrations of radioisotopes that are “below concern?” No reputable scientist would make such a statement. And if monitors in the continental United States detected radioactive iodine, cesium and tellurium in March 2011, then why did they stop the monitoring (or at least stop reporting it) by June?

The third paragraph, however, wins the double-take prize. Radiation would not be a problem beyond the coast? Fish caught hundreds of miles away would beg to differ. “Barring another unanticipated release. . . ?” Over the now almost 13 months since the Fukushima crisis began, there has been a series of releases into the air and into the ocean–some planned, some perhaps unanticipated at the time–but overall, the pattern is clear: radioactivity continues to enter the environment at unprecedented levels.
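
The half-lives the report itself cites also make the timescales easy to check. A minimal decay calculation–my sketch, using ordinary exponential decay and the CRS’s own figures–shows why the iodine detected in the spring of 2011 fades within months while the cesium isotopes, cesium-137 in particular, remain a concern for decades.

```python
# How much of each isotope remains after a given time, assuming simple
# exponential decay and the half-lives the CRS report itself cites.
# Illustrative only; starting amounts are normalized to 1.0.

HALF_LIVES_DAYS = {
    "iodine-131": 8.0,            # about 8 days, per the report
    "cesium-134": 2.0 * 365.25,   # about 2 years
    "cesium-137": 30.0 * 365.25,  # about 30 years
}

def remaining_fraction(half_life_days: float, elapsed_days: float) -> float:
    """Fraction of the original activity left after elapsed_days."""
    return 0.5 ** (elapsed_days / half_life_days)

ELAPSED_DAYS = 13 * 30  # roughly the 13 months between March 2011 and this report

for isotope, half_life in HALF_LIVES_DAYS.items():
    print(f"{isotope}: {remaining_fraction(half_life, ELAPSED_DAYS):.2e} of the original amount remains")
```

Essentially none of the iodine remains, roughly two-thirds of the cesium-134 does, and nearly all of the cesium-137 does, which is exactly why the long-lived isotopes, and the pathways by which they bioaccumulate, deserve far more than month-one data.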

And radioactive contaminants “should be sufficiently dispersed over time, unless they bioaccumulate?” Unless? Bioaccumulation is not some crazy, unobserved hypothesis; it is a documented biological process. Bioaccumulation will happen–it will happen in migratory fish and it will happen as under-policed food and commercial products (not to mention that pesky debris) make their way around the globe.

Maybe that is supposed to be read by inquiring minds as the report’s “please ignore the man behind the curtain” moment–an intellectual out clause disguised as an authoritative analgesic–but there is no escaping the intent. Though filled with caveats and counterfactuals, the report is clearly meant to serve as a sop to those alarmed by the spreading ecological catastrophe posed by the ongoing Fukushima disaster.

The devil is in the details–the dangers are in the data

Beyond the wiggle words, perhaps the most damning indictment of the CRS marine radiation report can be found in the footnotes–or, more pointedly, in the dates of the footnotes. Though this report was released over a year after the Tohoku earthquake and tsunami triggered the Fukushima nightmare, the CRS bases the preponderance of its findings on information generated during the disaster’s first month. In fact, of the document’s 29 footnotes, only a handful date from after May 2011–one of those points to a CNN report (authoritative!), one to a status update on the Fukushima reactor structures, one confirms the value of Japanese seafood imports, three are items tracking the tsunami debris, and one directs readers to a government page on FDA radiation screening, the pertinent part of which was last updated on March 28 of last year.

Most crucially, the parts of the CRS paper that downplay the amounts of radiation measured by domestic US sensors all cite data collected within the first few weeks of the crisis. The point about radioisotopes being “below any level of concern” comes from an EPA news release dated March 22, 2011–eleven days after the earthquake, only six days after the last reported reactor explosion, and well before so many radioactive releases into the air and ocean. It is like taking reports of only minor flooding from two hours after Hurricane Katrina passed over New Orleans, and using them as the standard for levee repair and gulf disaster planning (perhaps not the best example, as many have critiqued levee repairs for their failure to incorporate all the lessons learned from Katrina).

It now being April of 2012, much more information is available, and clearly any report that expects to be called serious should have included at least some of it.

By October of last year, scientists were already doubling their estimates of the radiation pushed into the atmosphere by the Daiichi reactors, and in early November, as reported here, France’s Institute for Radiological Protection and Nuclear Safety issued a report showing the amount of cesium 137 released into the ocean was 30 times greater than what was stated by TEPCO in May. Shockingly, the Congressional Research Service does not reference this report.

Or take the early March 2012 revelation that seaweed samples collected from off the coast of southern California show levels of radioactive iodine-131 fully 500 percent higher than those from anywhere else in the US or Canada. It should be noted that this is the result of airborne fallout–the samples were taken in mid-to-late-March 2011, much too soon for water-borne contamination to have reached that area–and so serves to confirm models that showed a plume of radioactive fallout with the greatest contact in central and southern California. (Again, this specific report was released a month before the CRS report, but the data it uses were collected over a year ago.)

Then there are the food samples taken around Japan over the course of the last year showing freshwater and sea fish–some caught over 200 kilometers from Fukushima–with radiation levels topping 100 becquerels per kilogram (one topping 600 Bq/kg).

And the beat goes on

This information, and much similar to it, was all available before the CRS released its document, but the report also operates in a risibly artificial universe that assumes the situation at Fukushima Daiichi has basically stabilized. As a sampling of pretty much any week’s news will tell you, it has not. Take, for example, this week:

About 12 tons of water contaminated with radioactive strontium are feared to have leaked from the Fukushima No. 1 plant into the Pacific Ocean, Tepco said Thursday.

The leak occurred when a pipe broke off from a joint while the water was being filtered for cesium, Tokyo Electric Power Co. said.

The system doesn’t remove strontium, and most of the water apparently entered the sea via a drainage route, Tepco added.

The water contained 16.7 becquerels of cesium per cu. centimeter and tests are under way to determine how much strontium was in it, Tepco said.

This is the second such leak in less than two weeks, and as Kazuhiko Kudo, a professor of nuclear engineering at Kyushu University who visited Fukushima Daiichi twice last year, noted:

There will be similar leaks until Tepco improves equipment. The site had plastic pipes to transfer radioactive water, which Tepco officials said are durable and for industrial use, but it’s not something normally used at nuclear plants. Tepco must replace it with metal equipment, such as steel.

(The plastic tubes–complete with the vinyl and duct tape patch–can be viewed here.)
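
For a sense of scale, TEPCO’s own figures translate readily into total activity. Here is a rough sketch using only the quoted numbers and the assumption that the contaminated water has roughly the density of fresh water.

```python
# Rough total cesium activity in the reported 12-ton leak, using only the
# figures quoted above. Assumes the contaminated water is about 1 g/cm^3
# (fresh-water density), so one metric ton is roughly 1,000,000 cm^3.
# Strontium is not included because, as noted, it had not yet been measured.

LEAKED_TONS = 12
CESIUM_BQ_PER_CM3 = 16.7

volume_cm3 = LEAKED_TONS * 1_000_000
total_bq = volume_cm3 * CESIUM_BQ_PER_CM3

print(f"Approximate leaked volume: {volume_cm3:,} cm^3")
print(f"Approximate cesium activity: {total_bq:,.0f} Bq (about {total_bq / 1e6:.0f} MBq)")
```

That works out to roughly 200 million becquerels of cesium from a single plumbing failure, before counting the strontium that the filtering system does not remove.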

And would that the good people at the Congressional Research Service had waited for a report that came out the same day as theirs:

Radioactive material from the Fukushima nuclear disaster has been found in tiny sea creatures and ocean water some 186 miles (300 kilometers) off the coast of Japan, revealing the extent of the release and the direction pollutants might take in a future environmental disaster.

In some places, the researchers from Woods Hole Oceanographic Institution (WHOI) discovered cesium radiation hundreds to thousands of times higher than would be expected naturally, with ocean eddies and larger currents both guiding the “radioactive debris” and concentrating it.

Or would that the folks at CRS had looked to their fellow government agencies before they went off half-cocked. (The study above was done by researchers at Woods Hole and written up in the journal of the National Academy of Sciences.) In fact, it appears the CRS could have done that. In its report, CRS mentions that “Experts cite [Fukushima] as the largest recorded release of radiation to the ocean,” and the source for that point is a paper by Ken Buesseler–the same Ken Buesseler who was the oceanographer in charge of the WHOI study. Imagine what could have been if the Congressional Research Service had actually contacted the original researcher.

Can openers all around

Or perhaps it wouldn’t have mattered. For if there is one obvious takeaway from the CRS paper–one that, beyond its limits of scope and authority, seems meant to absolve it of all other oversights–it is its unfailing confidence in government oversight.

Take a gander at the section under the bolded question “Are there implications for US seafood safety?”:

It does not appear that nuclear contamination of seafood will be a food safety problem for consumers in the United States. Among the main reasons are that:

  • damage from the disaster limited seafood production in the affected areas,
  • radioactive material would be diluted before reaching U.S. fishing grounds, and
  • seafood imports from Japan are being examined before entry into the United States.

According to the U.S. Food and Drug Administration (FDA), because of damage from the earthquake and tsunami to infrastructure, few if any food products are being exported from the affected region. For example, according to the National Federation of Fisheries Cooperative Associations, the region’s fishing industry has stopped landing and selling fish. Furthermore, a fishing ban has been enforced within a 2-kilometer radius around the damaged nuclear facility.

So, the Food and Drug Administration is relying on the word of an industry group and a Japanese government-enforced ban that encompasses a two-kilometer radius–what link of that chain is supposed to be reassuring?

Last things first: two kilometers? Well, perhaps the CRS should hire a few proofreaders. A search of the source materials finds that the ban is supposed to be 20 kilometers. Indeed, the Japanese government quarantined the land for a 20-kilometer radius. The US suggested evacuation from a 50-mile (80-kilometer) radius. The CRS’s own report notes contaminated fish were collected 30 kilometers from Fukushima. So why is even 20 kilometers suddenly a radius to brag about?

As for a damaged industry not exporting, numerous reports show the Japanese government stepping in to remedy that “problem.” From domestic PR campaigns encouraging the consumption of foodstuffs from Fukushima prefecture, to the Japanese companies selling food from the region to other countries at deep discounts, to the Japanese government setting up internet clearing houses to help move tainted products, all signs point to a power structure that sees exporting possibly radioactive goods as essential to its survival.

The point on dilution, of course, not only ignores the way many large-scale fishing operations work, it also ignores airborne contamination and runs counter to the report’s own acknowledgment of bioaccumulation.

But maybe the shakiest assertion of all is that the US Food and Drug Administration will stop all contaminated imports at the water’s edge. While imports hardly represent the total picture when evaluating US seafood safety, even the small slice of the problem this claim covers raises eyebrows.

First there is the oft-referenced point from nuclear engineer Arnie Gundersen, who said last summer that State Department officials told him of a secret agreement between Japan and Secretary Hillary Clinton guaranteeing the continued importation of Japanese food. While independent confirmation of this pact is hard to come by, there is the plain fact that, beyond bans on milk, dairy products, fruits and vegetables from the Fukushima region issued in late March 2011, the US has proffered no other restrictions on Japanese food imports (and those few restrictions for Japanese food were lifted for US military commissaries in September).

And perhaps most damning, there was the statement from an FDA representative last April declaring that North Pacific seafood was so unlikely to be contaminated that “no sampling or monitoring of our fish is necessary.” The FDA said at the time that it would rely on the National Oceanographic and Atmospheric Administration (NOAA) to tell it when they should consider testing seafood, but a NOAA spokesperson said it was the FDA’s call.

Good. Glad that’s been sorted out.

The Congressional Research Service report seems to fall victim to a problem noted often here–they assume a can opener. As per the joke, the writers stipulate a functioning mechanism before explaining their solution. Just as many nuclear industry watchers assume a functioning regulatory process (as opposed to a captured Nuclear Regulatory Commission, an industry-friendly Department of Energy, and industry-purchased members of Congress) when speaking of the hypothetical safety of nuclear power, the CRS here assumes an FDA interested first and foremost in protecting the general public, instead of an agency trying to strike some awkward “balance” between health, profit and politics. The can opener story is a joke; the effects of this real-life example are not.

Garbage in, garbage out

The Congressional Research Service, a part of the Library of Congress, is intended to function as the research and analysis wing of the US Congress. It is supposed to be objective, it is supposed to be accurate, and it is supposed to be authoritative. America needs the CRS to be all of those things because the agency’s words are expected to inform federal legislation. When the CRS shirks its responsibility, shapes its words to fit comfortably into the conventional wisdom, or shaves off the sharp corners to curry political favor, the impact is more than academic.

When the CRS limits its scope to avoid inconvenient truths, it bears false witness to the most important events of our time. When the CRS pretends other government agencies are doing their jobs–despite documentable evidence to the contrary–then they are not performing theirs. And when the CRS issues a report that ignores the data and the science so that a few industries might profit, it is America that loses.

The authors of this particular report might not be around when the bulk of the cancers and defects tied to the radiation from Fukushima Daiichi present in the general population, but this paper’s integrity today could influence those numbers tomorrow. Bad, biased, or bowdlerized advice could scuttle meaningful efforts to make consequential policy.

If the policy analysts who sign their names to reports like this don’t want their work used for scrap paper, then maybe they should take a lesson from the Ryou-Un Maru. Going where the winds and currents take you makes you at best a curiosity, and more likely a nuisance–just so much flotsam and jetsam getting in the way of actual business. Works of note come with moral rudders, anchored to the best data available; without that, the report might as well just say “good luck.”

As World Honors Fukushima Victims, NRC Gives Them a One-Fingered Salute

Sign from Fukushima commemoration and anti-nuclear power rally, Union Square Park, NYC, 3/11/12. (photo: G. Levine)

Nearly a week after the first anniversary of the Japanese earthquake and tsunami that started the crisis at the Fukushima Daiichi nuclear power facility, I am still sorting through the dozens of reports, retrospectives and essays commemorating the event. The sheer volume of material has been a little exhausting, but that is, of course, compounded by the weight of the subject. From reviewing the horrors of a year ago–now even more horrific, thanks to many new revelations about the disaster–to contemplating what lies ahead for residents of Japan and, indeed, the world, it is hard just to read about it; living it–then, now, and in the future–is almost impossible for me to fathom.

But while living with the aftermath might be hard to imagine, that such a catastrophe could and likely would happen was not. In fact, if there is a theme (beyond the suffering of the Japanese people) that runs through all the Fukushima look-backs, it is the predictability–the mountains of evidence that said Japan’s nuclear plants were vulnerable, and if nothing were done, a disaster (like the one we have today) should be expected.

I touched on this last week in my own anniversary examination, and now I see that Dawn Stover, contributing editor at The Bulletin of the Atomic Scientists, draws a similar comparison:

Although many politicians have characterized 3/11 and 9/11 as bizarre, near-impossible events that could not have been foreseen, in both cases there were clear but unheeded warnings. . . . In the case of 3/11, the nuclear plant’s operators ignored scientific studies showing that the risks of a tsunami had been dramatically underestimated. Japan’s “safety culture,” which asserted that accidents were impossible, prevented regulators from taking a hard look at whether emergency safety systems would function properly in a tsunami-caused station blackout.

Stover goes on to explain many points where the two nightmare narratives run parallel. She notes how, while governments often restrict information on the grounds that they must guard against mass panic, it is actually the officials who are revealed to be in disarray. By contrast, in both cases, first responders behaved rationally and professionally, putting themselves at great risk in attempts to save others.

In both cases, communication–or, rather, the terrible lack of it–between sectors of government and between officials and responders exacerbated the crisis and put more lives at risk.

With both 9/11 and 3/11, the public’s trust in government was shaken, and that crisis of trust was made worse by officials obscuring the facts and covering their tracks to save their own reputations.

But on that last point, perhaps I am reading my own observations into hers rather than giving a straight retelling of Stover. Indeed, it is sad to note that Stover concludes her Fukushima think piece with a similar brand of CYA hogwash:

By focusing needed attention on threats to our existence, 3/11 and 9/11 have brought about some positive changes. The nuclear disaster in Japan has alerted nuclear regulators and operators around the world to the vulnerabilities of nuclear power plant cooling systems and will inevitably lead to better standards for safety and siting — and perhaps even lend a new urgency to the problem of spent fuel. Likewise, 9/11 resulted in new security measures and intelligence reforms that have thus far prevented another major terrorist attack in the United States and have created additional safeguards for nuclear materials.

When it comes to post-9/11 “security” and “intelligence reforms,” Stover is clearly out of her depth, and using the Bush-Cheney “no new attacks” fallacy frankly undermines the credibility of the entire essay. But I reference it here because it sets up a more important point.

If only Stover had taken a lesson from her own story. The Fukushima disaster has not alerted nuclear regulators and operators to vulnerabilities–as has been made clear here and in several of the post-Fukushima reports, those vulnerabilities were all well known, and known well in advance of 3/11/11.

But even if this were some great and grand revelation, some signal moment, some clarion call, what in the annals of nuclear power makes Stover or any other commentator think that call will be heard? “Inevitably lead to better standards”–inevitably? We’d all exit laughing if we weren’t running for our lives.

Look no further than the “coincidental” late-Friday, pre-anniversary news dump from the US Nuclear Regulatory Commission.

Late on March 9, 2012, two days before the earthquake and tsunami would be a year in the rear-view mirror, the NRC put on a big splashy show. . . uh, strike that. . . released a weirdly underplayed written announcement that the commission had approved a set of new rules drawing on lessons learned from the Fukushima crisis:

The Nuclear Regulatory Commission ordered major safety changes for U.S. nuclear power plants Friday. . . .

The orders require U.S. nuclear plants to install or improve venting systems to limit core damage in a serious accident and to install sophisticated equipment to monitor water levels in pools of spent nuclear fuel.

The plants also must improve protection of safety equipment installed after the 2001 terrorist attacks and make sure it can handle damage to multiple reactors at the same time.

Awwwrighty then, that sounds good, right? New rules, more safety, responsive to the Japanese disaster at last–but the timing instantly raised questions.

It didn’t take long to discover these were not the rules you were looking for.

First off, these are only some of the recommendations put before the commission by its Near-Term Task Force some ten months ago, and while better monitoring of water levels in spent fuel pools and plans to handle multiple disasters are good ideas, it has been noted that the focus on hardening the vents in Mark I and Mark II boiling water reactors actually misdiagnoses what really went wrong in two of the Fukushima Daiichi reactors.

Also, it should be noted that this represents less than half the recommendations in last summer’s report. It also does not mandate a migration of spent fuel from pools to dry casks, an additional precaution not explicitly in the report but stressed by NRC chief Gregory Jaczko, as well as by many industry watchdogs.

But most important–and glaring–of all, the language under which these rules passed could mean that almost none of them are ever enforced.

This is a little technical, so let me turn to one of the few members of Congress who actually spends time worrying about this, Rep. Ed Markey (D MA-7):

While I am encouraged that the Commission supports moving forward with three of the most straightforward and quickly-issued nuclear safety Orders recommended by their own expert staff, I am disappointed that several Commissioners once again have rejected the regulatory justification that they are necessary for the adequate protection of nuclear reactors in this country. . . .

After the terrorist attacks of September 11, 2001, the NRC determined that some nuclear security upgrades were required to be implemented for the “adequate protection” of all U.S. nuclear reactors. This meant that nuclear reactors would not be considered to be sufficiently secure without these new measures, and that an additional cost-benefit “backfit” analysis would not be required to justify their implementation. The “adequate protection” concept is derived from the Atomic Energy Act of 1954, and is reflected in NRC’s “Backfit Rule” which specifies that new regulations for existing nuclear reactors are not required to include this extra cost-benefit “backfit” analysis when the new regulations are “necessary to ensure that the facility provides adequate protection to the health and safety of the public.”

Both the NRC Fukushima Task Force and the NRC staff who reviewed the Task Force report concluded that the new post-Fukushima safety recommendations, including the Orders issued today, were also necessary for the “adequate protection” of existing U.S. nuclear power plants, and that additional cost-benefit analysis should not be required to justify their implementation.

While Chairman Jaczko’s vote re-affirmed his support of all the Near-Term Task Force’s recommendations, including the need to mandate them all on the basis that they are necessary for the adequate protection of all U.S. nuclear power plants, Commissioner Svinicki did not do so for any of the Orders, Commissioner Magwood did not do so for two of the three Orders, and Commissioners Apostolakis and Ostendorff rejected that basis for one of the three. As a result, the Order requiring technologies to monitor conditions in spent nuclear fuel pools during emergencies will proceed using a different regulatory basis. More importantly, the inability of the Commission to unanimously accept its own staff’s recommendations on these most straightforward safety measures presents an ominous signal of the manner in which the more complicated next sets of safety measures will be considered.

In other words, last Friday’s move was regulatory kabuki. Because the commission declined to use the strictest regulatory language for the spent fuel pool order, plant operators will be allowed to delay compliance for years, if not excuse themselves from it entirely, by arguing that the safety upgrade is too costly.

The other two rules are also on shaky ground, as it were. And even if, by some miracle, the industry chose not to fight them and the four uber-pro-nuclear commissioners didn’t throw up additional roadblocks, nothing is required of the nuclear facilities until December 31, 2016.

So, rather than it being a salutary moment, a tribute of sorts to the victims in Japan on the anniversary of their disaster, the announcement by the NRC stands more as an insult. It’s as if the US government is saying, “Sure, there are lessons to be learned here, but the profits of private energy conglomerates are more important than any citizen’s quaint notions of health and safety.”

As if any more examples were needed, these RINOs (rules in name only) demonstrate again that in America, as in Japan, the government is too close to the nuclear industry it is supposed to police.

And, for the bigger picture, as if any more examples were needed, be it before or after March 11, it really hasn’t been that hard to imagine the unimaginable. When an industry argues it has to forgo a margin of safety because of cost, there’s a good chance it was too dangerous and too expensive to begin with.

* * *

By way of contrast, take a look at some of the heartfelt expressions of commemoration and protest from New York’s Fukushima memorial and anti-nuclear rally, held last Sunday in Union Square Park.

Fukushima One Year On: Many Revelations, Few Surprises

Satellite image of Fukushima Daiichi showing damage on 3/14/11. (photo: digitalglobe)

One year on, perhaps the most surprising thing about the Fukushima crisis is that nothing is really that surprising. Almost every problem encountered was at some point foreseen, almost everything that went wrong was previously discussed, and almost every system that failed was predicted to fail, sometimes decades earlier. Not all by one person, obviously, not all at one time or in one place, but if there is anything to be gleaned from sorting through the multiple reports now being released to commemorate the first anniversary of the Tohoku earthquake and tsunami–and the start of the crisis at Fukushima Daiichi–it is that, while there is much still to be learned, we already know what is to be done. . . because we knew it all before the disaster began.

This is not to say that any one person–any plant manager, nuclear worker, TEPCO executive, or government official–had all that knowledge on hand or had all the guaranteed right answers when each moment of decision arose. We know that because the various timelines and reconstructions now make it clear that several individual mistakes were made in the minutes, hours and days following the dual natural disasters. Instead, the analysis a year out teaches us that any honest examination of the history of nuclear power, and any responsible engagement of the numerous red flags and warnings would have taken the Fukushima disasters (yes, plural) out of the realm of “if,” and placed it squarely into the category of “when.”

Following closely the release of findings by the Rebuild Japan Foundation and a report from the Union of Concerned Scientists (both discussed here in recent weeks), a new paper, “Fukushima in review: A complex disaster, a disastrous response,” written by two members of the Rebuild Japan Foundation for the Bulletin of the Atomic Scientists, provides a detailed and disturbing window on a long list of failures that exacerbated the problems at Japan’s crippled Fukushima Daiichi facility. Among them are the misinterpretation of on-site observations, the lack of applicable protocols, inadequate industry guidelines, and the absence of both a definitive chain of command and the physical presence of the supposed commanders. But first and foremost, existing at the core of the crisis that has seen three reactor meltdowns, numerous explosions, radioactive contamination of land, air and sea, and the mass and perhaps permanent evacuation of tens of thousands of residents from a 20 kilometer exclusion zone, is what the Bulletin paper calls “The trap of the absolute safety myth”:

Why were preparations for a nuclear accident so inadequate? One factor was a twisted myth–a belief in the “absolute safety” of nuclear power. This myth has been propagated by interest groups seeking to gain broad acceptance for nuclear power: A public relations effort on behalf of the absolute safety of nuclear power was deemed necessary to overcome the strong anti-nuclear sentiments connected to the atomic bombings of Hiroshima and Nagasaki.

Since the 1970s, disaster risk has been deliberately downplayed by what has been called Japan’s nuclear mura (“village” or “community”)–that is, nuclear advocates in industry, government, and academia, along with local leaders hoping to have nuclear power plants built in their municipalities. The mura has feared that if the risks related to nuclear energy were publicly acknowledged, citizens would demand that plants be shut down until the risks were removed. Japan’s nuclear community has also feared that preparation for a nuclear accident would in itself become a source of anxiety for people living near the plants.

The power of this myth, according to the authors, is strong. It led the government to actively cancel safety drills in the wake of previous, smaller nuclear incidents–claiming that they would cause “unnecessary anxiety”–and it led to a convenient classification for the events of last March 11:

The word used then to describe risks that would cause unnecessary public anxiety and misunderstanding was “unanticipated.” Significantly, TEPCO has been using this very word to describe the height of the March 11 tsunami that cut off primary and backup power to Fukushima Daiichi.

Setting aside for the moment the debate about what cut off primary power, the idea that the massive size of the tsunami–not to mention what it would do to the nuclear plant–was unanticipated is, as this paper observes, absurd. Studies of a 9th-century tsunami, as well as an internal report by TEPCO’s own nuclear energy division, showed there was a definite risk of large tsunamis at Fukushima. TEPCO dismissed these warnings as “academic.” The Japanese government, too, while recommending nuclear facilities consider these findings, did not mandate any changes.

Instead, both the industry and the government chose to perpetuate the “safety myth,” fearing that any admission of a need to improve or retrofit safety systems would result in “undue anxiety”–and, more importantly, public pressure to make costly changes.

Any of that sound familiar?

“No one could have possibly anticipated. . .” is not just the infamous Bush administration take on the attacks of 9/11/2001, it has become the format for many of the current excuses on why a disaster like Fukushima could happen once, and why little need now be done to make sure it doesn’t happen again.

In fact, reading the BAS Fukushima review, it is dishearteningly easy to imagine you are reading about the state of the American nuclear reactor fleet. Swapping in places like Three Mile Island, Palisades, Browns Ferry, Davis-Besse, San Onofre, Diablo Canyon, Vermont Yankee, and Indian Point for the assorted Japanese nuclear power plants is far too easy, and replacing the names of the much-maligned Japanese regulatory agencies with “Nuclear Regulatory Commission” and “Department of Energy” is easier still.

As observed a number of times over the last year, because of unusual events and full-on disasters at many of the aging nuclear plants in the US, American regulators have a pretty good idea of what can go wrong–and they have even made some attempts to suggest measures should be taken to prevent similar events in the future. But industry pressure has kept those suggestions to a minimum, and the cozy relationship between regulators and the regulated has diluted and dragged out many mandates to the point where they serve more as propaganda than prophylaxis.

Even with the Fukushima disaster still visible and metastasizing, requiring constant attention from every level of Japanese society and billions of Yen in emergency spending, even with isotopes from the Daiichi reactors still showing up in American food, air and water, and even with dozens of US reactors operating under circumstances eerily similar to pre-quake Fukushima, the US Nuclear Regulatory Commission has treated its own post-Fukushima taskforce recommendations with a pointed lack of urgency. And the pushback from the nuclear industry and their bought-and-paid-for benefactors in the government at the mere hint of new regulations or better enforcement indicates that America might have its own safety myth trap–though, in the US, it is propagated by the generations-old marketing mantra, “Clean, safe and too cheap to meter.”

Mythical, too, is the notion that the federal government has the regulatory infrastructure or political functionality to make any segment of that tripartite lie ring closer to true. From NRC chairman Gregory Jaczko’s bizarre faith in a body that has failed to act on his pre-Fukushima initiatives while actively conspiring to oust him, to the Union of Concerned Scientists’ assuming a regulatory “can opener,” the US may have a bigger problem than the absolute safety myth, and that would be the myth of a government with the will or ability to assure that safety.

Which, of course, is more than a shame–it’s a crime. With so many obvious flaws in the technology–from the costs of mining, importing and refining fuel to the costs of building and maintaining reactors, from the crisis in spent fuel storage to the “near misses” and looming disasters at aging facilities–with so many other industrialized nations now choosing to phase out nuclear and ramp up renewables, and with the lessons of Fukushima now so loud and clear, the path forward for the US should not be difficult to delineate.

Nuclear power is too dirty, too dangerous and too expensive to justify any longer. No one in America should assume that the willpower or wherewithal to manage these problems would magically appear when nothing sufficient has materialized in the last fifty years. Leaders should not mistake luck for efficacy, nor should they pretend birds of a feather are unrelated black swans. They know better, and they knew all they needed to know long before last year’s triple meltdown.

Nuclear is not in a “renaissance,” it is in its death throes. Now is the time to cut financial losses and guard against more precious ones. The federal government should take the $54.5 billion it pledged to the nuclear industry and use it instead to increase efficiency, conservation, and non-fissile/non-fossil energy innovation.

But you already knew that.

* * *

Extra Credit:

Compare and contrast this 25-minute video from Al Jazeera and the Center for Investigative Reporting with what you read in the Bulletin of the Atomic Scientists report mentioned above. For that matter, contrast it with the two longer but somehow less rigorous videos from Frontline, which were discussed here and here.

Also, there are events all over the globe this weekend to commemorate the first anniversary of the Tohoku earthquake and the nuclear crisis it triggered. To find an event in your area, see this list from Beyond Nuclear and the Freeze our Fukushimas Campaign.

Frontline’s Fukushima “Meltdown” Perpetuates Industry Lie That Tsunami, Not Quake, Started Nuclear Crisis

Fukushima Daiichi as seen on March 16, 2011. (photo: Digital Globe via Wikipedia)

In all fairness, “Inside Japan’s Nuclear Meltdown,” the Frontline documentary that debuted on US public television stations last night (February 28), sets out to accomplish an almost impossible task: explain what has happened inside and around Japan’s Fukushima Daiichi nuclear facility since a massive earthquake and tsunami crippled reactors and safety systems on March 11, 2011–and do so in 53 minutes. The filmmakers had several challenges, not the least of which is that the Fukushima meltdowns are not a closed case, but an ever-evolving crisis. Add to that the technical nature of the information, the global impact of the disaster, the still-extant dangers in and around the crippled plant, the contentious politics around nuclear issues, and the refusal of the Tokyo Electric Power Company (TEPCO) to let its employees talk either to reporters or independent investigative bodies, and it quickly becomes apparent that Frontline had a lot to tackle in order to practice good journalism.

But if the first rule of reporting is anything like medicine’s–“do no harm”–then Frontline’s Fukushima coverage is again guilty of malpractice. While “Inside Japan’s Nuclear Meltdown” is not the naked apologia for the nuclear industry that Frontline’s January offering, “Nuclear Aftershocks,” was, some of the errors and oversights of this week’s episode are just as injurious to the truth.

And none more so than the inherent contradiction that aired in the first minutes of Tuesday’s show.

“Inside” opens on “March 11, 2011 – Day 1.” Over shaking weather camera shots of Fukushima’s four exhaust towers, the narrator explains:

The earthquake that shook the Fukushima Dai-ichi nuclear power plant was the most powerful to strike Japan since records began. The company that operates the plant, TEPCO, has forbidden its workers from speaking publicly about what followed.

But one year on, they are starting to tell their stories. Some have asked for their identities to be hidden for fear of being fired.

One such employee (called “Ono” in the transcript) speaks through an interpreter: “I saw all the pipes fixed to the wall shifting and ripping off.”

Then the power went out, but as Frontline’s narrator explains:

The workers stayed calm because they knew Japanese power plants are designed to withstand earthquakes. The reactors automatically shut down within seconds. But the high radioactivity of nuclear fuel rods means they generate intense heat even after a shutdown. So backup generators kicked in to power the cooling systems and stop the fuel rods from melting.

Frontline then tells of the massive tsunami that hit Fukushima about 49 minutes after the earthquake:

The biggest of the waves was more than 40 feet high and traveling at over 100 miles an hour.

. . . .

At 3:35 PM, the biggest of the waves struck. It was more than twice the height of the plant’s seawall.

. . . .

Most of the backup diesel generators needed to power the cooling systems were located in basements. They were destroyed by the tsunami waters, meaning the workers had no way of keeping the nuclear fuel from melting.

The impression left for viewers is that while the quake knocked out Fukushima’s primary power, the diesel backup generators were effectively cooling the reactors until the tsunami flooded the generators.

It’s a good story, as stories go, and one that TEPCO and their nuclear industry brethren are fond of telling to anyone and everyone within the sound of their profit-enhanced, lobbyist-aided voices. They have told it so often that it seems to be part of the whole Fukushima narrative that less-interested parties can recount without so much as glancing at their talking points. Indeed, even Frontline’s writers thought they could toss it out there without any debate and then move on. One problem with that story, though–it’s not true.

I personally saw pipes that had come apart and I assume that there were many more that had been broken throughout the plant. There’s no doubt that the earthquake did a lot of damage inside the plant… I also saw that part of the wall of the turbine building for reactor one had come away. That crack might have affected the reactor.

Those are the words of a Fukushima maintenance worker who requested anonymity when he told his story to reporters for Great Britain’s Independent last August. That worker recalled hissing, leaking pipes in the immediate aftermath of the quake.

Another TEPCO employee, a Fukushima technician, also spoke to the Independent:

It felt like the earthquake hit in two waves, the first impact was so intense you could see the building shaking, the pipes buckling, and within minutes I saw pipes bursting. Some fell off the wall…

Someone yelled that we all needed to evacuate. But I was severely alarmed because as I was leaving I was told and I could see that several pipes had cracked open, including what I believe were cold water supply pipes. That would mean that coolant couldn’t get to the reactor core. If you can’t sufficiently get the coolant to the core, it melts down. You don’t have to have to be a nuclear scientist to figure that out.

Workers also describe seeing cracks and holes in reactor one’s containment building soon after the earthquake, and it has been reported that a radiation alarm went off a mile away from Fukushima Daiichi at 3:29 PM JST–43 minutes after the quake, but 6 minutes before the tsunami hit the plant’s seawall.

Indeed, much of the data available, as well as the behavior of Fukushima personnel, makes the case that something was going horribly wrong before the tsunami flooded the backup generators:

Mitsuhiko Tanaka, a former nuclear plant designer, describes what occurred on 11 March as a loss-of-coolant accident. “The data that Tepco has made public shows a huge loss of coolant within the first few hours of the earthquake. It can’t be accounted for by the loss of electrical power. There was already so much damage to the cooling system that a meltdown was inevitable long before the tsunami came.”

He says the released data shows that at 2.52pm, just after the quake, the emergency circulation equipment of both the A and B systems automatically started up. “This only happens when there is a loss of coolant.” Between 3.04 and 3.11pm, the water sprayer inside the containment vessel was turned on. Mr Tanaka says that it is an emergency measure only done when other cooling systems have failed. By the time the tsunami arrived and knocked out all the electrical systems, at about 3.37pm, the plant was already on its way to melting down.

In fact, these conclusions were corroborated by data buried in a TEPCO briefing last May–and they were, of course, echoed by “Ono” in the opening minutes of Frontline’s report–but rather than use their documentary and their tremendous access to eyewitnesses as a way of starting a discussion about what really went wrong at Fukushima Daiichi, Frontline instead moved to end the debate by repeating the industry line as a kind of shorthand gospel.

This is not nitpicking. The implications of this point–the debate about whether the nuclear reactor, its cooling systems and containment (to say nothing yet of its spent fuel pools and their safety systems) were seriously damaged by the earthquake–are broad and have far-reaching consequences for nuclear facilities all over the globe.

To put it mildly, the pipes at Fukushima were a mess. Over the decade prior to the Tohoku quake, TEPCO was told repeatedly about the poor state of the plant’s pipes, ducts, and couplings. Fukushima was cited numerous times for deteriorating joints, faked inspections and shoddy repairs. Technicians tell of systems that didn’t match the blueprints, and of pipes that had to be bent into place and then welded together.

Fukushima was remarkably old, but it is not remarkable. Plants across Japan are of the same generations-old design. So are many nuclear reactors here in the United States. If the safety systems of a nuclear reactor can be dangerously compromised by seismic activity alone, then all of Japan’s reactors–and a dozen or more across the US–are one good shake away from a Fukushima-like catastrophe. And that means that those plants need to be shut down for extensive repairs and retrofits–if not decommissioned permanently.

The stakes for the nuclear industry are obviously very high. You can see how they would still be working overtime to drown out the evidence and push the “freak one-two punch” narrative. But it’s not the true story–indeed, it is a dangerous lie–so it is hard to reconcile why the esteemed and resourceful journalists at Frontline would want to tell it.

* * *

That was not the only problem with Tuesday’s episode, but it is one of the most pernicious–and it presents itself so obviously right at the start of “Inside Japan’s Nuclear Meltdown.” Also problematic was the general impression left at the end of the program. While mention is made of the 100,000 displaced by the 12-mile Fukushima exclusion zone, nothing is said about the broader health implications for the entire country–and indeed for the rest of the world, as radioactive isotopes from Fukushima spread well beyond Japan’s borders.

Alas, though Frontline tells of the massive amounts of seawater pumped into the damaged facility, nothing much is said about the contaminated water that is leaving the area, spreading into groundwater, rivers and the Pacific Ocean. The show talks of the efforts to open a valve to relieve pressure inside one reactor, but does not address growing evidence that the lid of the containment vessel likely lifted off at some point between the tsunami and the explosion in building one. And there is a short discussion of bringing the now-melted-down reactors to “cold shutdown,” but there is no mention of the recent “re-criticality”–the rising temperatures inside one of the damaged cores.

And to that point–and to a point often made in these columns–this disaster is not over. “Japan’s Meltdown” is not in the past–it is still a dangerous and evolving crisis. The “devil’s chain reaction” that could have required the evacuation of Tokyo is still very much a possibility should another earthquake jolt the region. . . which itself is considered likely.

Sadly–disturbingly–Frontline’s Fukushima tick-tock ends by leaving the opposite impression. They acknowledge the years of work that lie ahead to clean up the mess, but the implication is that the path is clear. They acknowledge the tragedy, but they treat it the way one of the film’s subjects does, a man shown at the film’s end at a memorial for his lost family: as something to be mourned, commemorated and honored.

But Fukushima’s crisis is not buried and gone, and though radioactive water has been swept out to sea and radioactive fallout has been blown around the world, the real danger of Fukushima Daiichi and nuclear plants worldwide is not gone with the wind.

As noted above, it is a difficult task to accurately and effectively tell this sweeping story in less than an hour–but the filmmakers should have acknowledged that and either refocused their one show, or committed to telling the story over a longer period of time. Choosing instead to use the frame of the nuclear industry and the governments that seek its largess is not good journalism because it has the potential to do much harm.