Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know the downsides of nuclear power are real, and, what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the odds that were given for the 2011 tsunami–the flood at Oconee would trigger failures at all three of its reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.
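In plain terms, the linear, no-threshold (LNT) model says that predicted excess risk scales directly with dose all the way down to zero. A minimal formal sketch, with the risk coefficient written here as a generic placeholder rather than a figure presented at the conference:

$$
R_{\text{excess}}(D) = \alpha D, \qquad \alpha > 0, \qquad R_{\text{excess}}(0) = 0
$$

Halving the dose halves the predicted excess risk, but only a dose of zero carries zero predicted excess risk; hence, no threshold.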

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously raised what it called acceptable exposure soon after the start of the Fukushima crisis, and, because the US tends to set limits in terms of exposure beyond the annual average background dose, it is feared that background levels elevated by the disaster will ratchet up what is considered “safe” in the United States as well. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant radiation biologist Ian Fairlie observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is a lot of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what might have been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add an additional 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137, or, to put it another way, Fukushima has already released more than 6,400 times as much radioactive cesium as the Hiroshima bomb.
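To keep the units straight, here is the arithmetic implied by the figures above (a restatement of the numbers already cited, not a new estimate):

$$
1~\text{PBq} = 10^{15}~\text{Bq} = 1{,}000~\text{TBq}, \qquad 89~\text{TBq} = 0.089~\text{PBq}
$$

On those terms, the first-months estimate of 64 to 114 petabecquerels is itself roughly 700 to 1,300 times the Hiroshima figure, while the 6,400-fold comparison corresponds to a cumulative cesium-137 release on the order of $6{,}400 \times 0.089 \approx 570$ petabecquerels.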

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” that questions stem from the necessity of atomic energy instead of from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it isn’t a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without a real temporary storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear power does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Book Salon – Joseph Mangano, Author of Mad Science: The Nuclear Power Experiment

[Note: On Saturday afternoon, I hosted FDL Book Salon, featuring a live Q&A with Joseph Mangano, author of Mad Science: The Nuclear Power Experiment. This is a repost of that discussion.]

In December of 1962, Consolidated Edison, New York City’s main purveyor of electricity, announced that it had submitted an official proposal to the US Atomic Energy Commission (the AEC, the precursor to today’s Nuclear Regulatory Commission) for the construction of a nuclear power plant on a site called Ravenswood. . . in Queens. . . on the East River. . . directly across from the United Nations. . . within five miles of roughly five million people.

Ravenswood became the site of America’s first demonstrations against nuclear power. It inspired petitions to President John F. Kennedy and NYC Mayor Robert Wagner, and the possibility of a nuclear reactor in such a densely populated area even invited public skepticism from the pro-nuclear former head of the AEC, David Lilienthal. Finally, after a year of pressure, led by the borough’s community leaders, Con Edison withdrew its application.

But within three years, reports suggested Con Ed had plans to build a nuclear plant under Central Park. After that idea was roundly criticized, the utility publicly proposed a reactor complex under Welfare Island (now known as Roosevelt Island), instead.

Despite the strong support of Laurance Rockefeller, the brother of New York State’s governor, the Welfare Island project disappeared from Con Ed’s plans by 1970. . . soon to be replaced by the idea of a nuclear “jetport”–artificial islands to be built in the ocean just south of New York City that would host a pair of commercial reactors.

Does that sound like madness? Well, from today’s perspective–with Three Mile Island, Chernobyl, and now Fukushima universally understood as synonyms for disaster–it probably does. But there was a time before those meltdowns when nuclear power still had a bit of a glow, when, despite (or because of) the devastation from the atomic bombs dropped on Japan, many believed that the atom’s awesome power could be harnessed for good; a time when dangerous and deadly mishaps at a number of the nation’s earlier reactors were easily excused or kept completely secret.

In Mad Science: The Nuclear Power Experiment, Joseph Mangano returns to that time, and then methodically pulls back the curtain on the real history of nuclear folly and failure, and the energy source that continues to masquerade as clean, safe, and “too cheap to meter.”

From Chalk River, in Canada, the world’s first reactor meltdown, through Idaho’s EBR-1, Waltz Mill, PA, Santa Susana’s failed Sodium Reactor Experiment, the Idaho National Lab explosion that killed three, Fermi-1, which almost irradiated Detroit, and, of course, Three Mile Island, Mad Science provides a chilling catalog of nuclear accidents, all of which were disasters in their own right, and all of which illustrate a troubling pattern of safety breaches followed by secrecy and lies.

Nuclear power’s precarious existence is not, of course, just a story for the history books, and Mangano also details the state of America’s 104 remaining reactors. So many of today’s plants have problems, too, but perhaps the maddest thing about the mad science of civilian atomic power is that science often plays a minor role in decisions about the technology’s future.

From its earliest days, this supposedly super-cheap energy was financially unsustainable. By the mid-1950s, private insurers had turned their back on nuclear facilities, fearing the massive payouts that would follow any accident. The nuclear industry turned to the US government, and in 1957, the Price-Anderson Act limited a plant’s liability to an artificially low but apparently insurable figure–any damage beyond that would be covered by US taxpayers. Shippingport, America’s first large-scale commercial nuclear reactor, was built entirely with government money, and that is hardly an isolated story. Even before the Three Mile Island meltdown, Wall Street had walked away from nuclear energy, meaning that no new reactors could be built without massive federal loan guarantees.

Indeed, the cost of construction, when piled on top of the cost of fueling, skilled labor, operation and upkeep, made the prospect of opening a new nuclear plant financially unpalatable. So, as Mangano explains, nuclear utilities turned to another strategy for making their vertical profitable, one familiar to any student of late Western capitalism. Rather than build, energy companies would instead buy. Since the 1990s, the nuclear sector has seen massive consolidation. Mergers and acquisitions have created nuclear mega-corporations, like Exelon, Duke, and Entergy, which run multiple reactors across many facilities in many states. And the industry’s supposed regulator, the NRC, has encouraged this behavior by rubberstamping dozens upon dozens of 20-year license extensions, turning reactors that were supposed to be nearing the end of their functional lives into valuable assets.

But the pain of nuclear power isn’t only measured in meltdowns and money. Whether firing on all cylinders (as it were) or falling apart, nuclear plants have proven to be dangerous to the populations they are supposed to serve. Joseph Mangano, an epidemiologist by trade, and director of the Radiation and Public Health Project (RPHP), has made a career out of trying to understand the immediate and long-term effects of nuclear madness, be it from fallout, leaks, or the “permissible levels” of radioactive isotopes that are regularly released from reactors as part of normal operation.

As I mentioned earlier this week, Mangano and the RPHP are the inheritors of the Baby Tooth Survey, the groundbreaking examination of strontium levels in children born before, during and after the age of atmospheric nuclear bomb tests. The discovery of high levels of Sr-90, a radioactive byproduct of uranium fission, in the baby teeth of children born in the 1950s and ’60s led directly to the Partial Test Ban Treaty in 1963.

Mangano’s work has built on the original survey, linking elevated Sr-90 levels to cancer, and examining the increases in strontium in the bodies of children that lived close to nuclear power plants. And all of this is explained in great detail in Mad Science.

The author has also applied his expertise to the fallout from the ongoing Fukushima disaster. Last December, Mangano and Janette Sherman published a peer-reviewed article in the International Journal of Health Services (PDF) stating that in the 14 weeks following the start of the Japanese nuclear crisis, an estimated 14,000 excess deaths in the United States could be linked to radioactive fallout from Fukushima Daiichi. (RPHP has since revised that estimate–upward–to almost 22,000 deaths (PDF).)

That last study is not specifically detailed in Mad Science, but I hope we can touch on it today–along with some of the many equally maddening “experiments” in nuclear energy production that Mangano carefully unwraps in his book.

[Click here to read my two-hour chat with Joe Mangano.]

Looking Back at Our Nuclear Future

The Los Angeles Times heralds the nuclear age in January 1957. (photo via wikipedia)

On March 11, communities around the world commemorated the first anniversary of the still-evolving Fukushima Daiichi nuclear disaster with rallies, marches, moments of silence, and numerous retrospective reports and essays (including one here). But 17 days later, another anniversary passed with much less fanfare.

It was in the early morning hours of March 28, 1979, that a chain of events at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania caused what is known as a “loss of coolant accident,” resulting in a partial core meltdown, a likely hydrogen explosion, the venting of some amount of radioisotopes into the air and the dumping of 40,000 gallons of radioactive waste water into the Susquehanna River. TMI (as it is sometimes abbreviated) is often called America’s worst commercial nuclear accident, and though the nuclear industry and its acolytes have worked long and hard to downplay any adverse health effects stemming from the mishap, the fact is that what happened in Pennsylvania 33 years ago changed the face and future of nuclear power.

The construction of new nuclear power facilities in the US was already in decline by the mid 1970s, but the Three Mile Island disaster essentially brought all new projects to a halt. There were no construction licenses granted to new nuclear plants from the time of TMI until February of this year, when the NRC gave a hasty go-ahead to two reactors slated for the Vogtle facility in Georgia. And though health and safety concerns certainly played a part in this informal moratorium, cost had at least an equal role. The construction of new plants proved more and more expensive, never coming in on time or on budget, and the cleanup of the damaged unit at Three Mile Island took 14 years and cost over $1 billion. Even with the Price-Anderson Act limiting the industry’s liability, nuclear power plants are considered such bad risks that no financing can be secured without federal loan guarantees.

In spite of that–or because of that–the nuclear industry has pushed steadily over the last three decades to wring every penny out of America’s aging reactors, pumping goodly amounts of their hefty profits into lobbying efforts and campaign contributions designed to capture regulators and elected officials and propagate the age-old myth of an energy source that is clean, safe, and, if not exactly “too cheap to meter,” at least impressively competitive with other options. The result is a fleet of over 100 reactors nearing the end of their design lives–many with documented dangers and potential pitfalls that could rival TMI–now seeking and regularly getting license extensions from the Nuclear Regulatory Commission while that same agency softens and delays requirements for safety upgrades.

And all of that cozy cooperation between government and big business goes on with the nuclear industry pushing the idea of a “nuclear renaissance.” In the wake of Fukushima, the industry has in fact increased its efforts, lobbying the US and British governments to downplay the disaster, and working with its mouthpieces in Congress and on the NRC to try to kill recommended new regulations and force out the slightly more safety-conscious NRC chair. And, just this month, the Nuclear Energy Institute, the chief nuclear trade group, moved to take their message to what might be considered a less friendly cohort, launching a splashy PR campaign by underwriting public radio broadcasts and buying time for a fun and funky 60-second animated ad on The Daily Show.

All of this is done with the kind of confidence that only comes from knowing you have the money to move political practice and, perhaps, public opinion. Three Mile Island is, to the industry, the exception that proves the rule–if not an out-and-out success. “No one died,” you will hear–environmental contamination, and the latest surveys showing increased rates of leukemia some 30 years later, be damned–and you will hear that TMI is the only major accident in over half a century of domestic nuclear power generation.

Of course, this is not even remotely true–names like Browns Ferry, Cooper, Millstone, Indian Point and Vermont Yankee come to mind–but even if you discount plant fires and tritium leaks, Three Mile Island is not even America’s only meltdown.

There is, of course, the 1966 accident at Michigan’s Enrico Fermi Nuclear Generating Station, chronicled in the John Grant Fuller book We Almost Lost Detroit, but atom-lovers will dismiss this because Fermi 1 was an experimental breeder reactor, so it is not technically a “commercial” nuclear accident.

But go back in time another seven years–a full 20 before TMI–and the annals of nuclear power contain the troubling tale of another partial meltdown, one that coincidentally is again in the news this week, almost 53 years later.

The Sodium Reactor Experiment

On July 12, 1957, the Sodium Reactor Experiment (SRE) at the Santa Susana Field Laboratory near Simi Valley, California, became the first US nuclear reactor to produce electricity for a commercial power grid. SRE was a sodium-cooled reactor designed by Atomics International, a division of North American Aviation, a company more often known by the name of its other subsidiary, Rocketdyne. Southern California Edison used the electricity generated by SRE to light the nearby town of Moorpark.

Sometime during July 1959–the exact date is still not entirely clear–a lubricant used to cool the seals on the pump system seeped into the primary coolant, broke down in the heat and formed a compound that clogged cooling channels. Whether out of curiosity or ignorance, operators continued to run the SRE despite wide fluctuations in core temperature and generating capacity.

Following a pattern that is now all too familiar, increased temperatures caused increased pressure, necessitating what was even then called a “controlled venting” of radioactive vapor. How much radioactivity was released into the environment is cause for some debate, for, in 1959, there was less monitoring and even less transparency. Current reconstructions, however, suggest the release may have been as much as 450 times greater than what was vented at Three Mile Island.

When the reactor was finally shut down and the fuel rods were removed (a trick in itself, as some were stuck and others broke), over a quarter of them showed signs of melting.

The SRE was eventually repaired and restarted in 1960, running on and off for another four years. Decommissioning began in 1976, and was finished in 1981, but the story doesn’t end there. Not even close.

Fifty-three years after a partial nuclear meltdown at the Santa Susana Field Laboratory site in the Chatsworth Hills, the U.S. Environmental Protection Agency has just released data finding extensive radioactive contamination still remains at the accident site.

“This confirms what we were worried about,” said Assemblywoman Julia Brownley, D-Oak Park, a long-time leader in the fight for a complete and thorough cleanup of this former Rocketdyne rocket engine testing laboratory. “This begins to answer critical questions about what’s still up there, where, how much, and how bad?”

Well, it sort of begins to answer it.

New soil samples weigh in at up to 1,000 times the radiation trigger levels (RTLs) agreed to when the Department of Energy struck a cleanup deal with the California Department of Toxic Substances Control in 2010. What’s more, these measurements follow two previous cleanup efforts by the DOE and Boeing, the company that now owns Santa Susana.

In light of the new findings, Assemblywoman Brownley has called on the DOE to comply with the agreement and do a real and thorough cleanup of the site. That means taking radiation levels down to what are the established natural background readings for the area. But that, as is noted by local reporter Michael Collins, “may be easier said than done”:

This latest U.S. EPA information appears to redefine what cleaning up to background actually is. Publicly available documents show that the levels of radiation in this part of Area IV where the SRE once stood are actually many thousands of times more contaminated than previously thought.

Just as troubling, the EPA’s RTLs, which are supposed to mirror the extensively tested and reported-on backgrounds of the numerous radionuclides at the site, were many times over the background threshold values (BTVs). So instead of cleaning up to background, much more radiation would be left in the ground, saving the government and lab owner Boeing millions in cleanup.

It is a disturbing tale of what Collins calls a kind of environmental “bait and switch” (of which he provides even more detail in an earlier report), but after a year of documenting the mis- and malfeasance of the nuclear industry and its supposed regulators, it is, to us here, anyway, not a surprising one.

To the atom-enamored, it is as if facts have a half-life all their own. The pattern is to swear that an event is no big deal, only to come back with revision after revision, each admitting a little bit more, in a seemingly never-ending regression toward what might approximately describe a terrible reality. It would be reminiscent of the “mom’s on the roof” joke if anyone actually believed that nuclear operators and their chummy government minders ever intended to eventually relay the truth.

Fukushima’s latest surprise

Indeed, that unsettling pattern is again visible in the latest news from Japan. This week saw revelations that radiation inside Fukushima Daiichi’s reactor 2 containment vessel clocked in at levels seriously higher than previously thought, while water levels are seriously lower.

An endoscopic camera, thermometer, water gauge and dosimeter were inserted into the number 2 reactor containment, and the instruments documented radiation levels of up to 70 sieverts per hour–not only seven times the previous highest measurement, but roughly 10 times what is generally considered a fatal dose (at 70 Sv/hr, a person would absorb a lethal dose in a matter of minutes).
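For a rough sense of scale, and assuming the commonly cited figure of about 7 sieverts as an acute, whole-body lethal dose (an illustrative assumption, not a number reported by TEPCO), the time needed to accumulate a fatal exposure at the measured rate works out to

$$
t \approx \frac{7~\text{Sv}}{70~\text{Sv/hr}} = 0.1~\text{hr} \approx 6~\text{minutes}.
$$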

The water level inside the containment vessel, estimated to be at 10 meters when the Japanese government declared a “cold shutdown” in December, turns out to be no more than 60 centimeters (about two feet).

This is disquieting news for many reasons. First, the high radiation not only makes it impossible for humans to get near the reactor, it makes current robotic technology impractical, as well. The camera, for instance, would only last 14 hours in those conditions. If the molten core is to be removed, a new class of radiation-resistant robots will have to be developed.

The extremely low water level signals more troubling scenarios. Though some experts believe the fuel rods have melted down or melted through to such an extent that two feet of water can keep them covered, the low level likely indicates a breach or breaches of the containment vessel. Plant workers, after all, have been pumping water into the reactor constantly for months now (why no one noticed that they kept having to add water to the system, or why no one cared, is plenty disturbing, as is the question of where all that extra water has gone).

Arnie Gundersen of nuclear engineering consultancy Fairewinds Associates believes that the level of water roughly corresponds with the lower lip of the vessel’s suppression pool–further evidence that reactor 2 suffered a hydrogen explosion, as did two other units at Fukushima. Gundersen also believes that the combination of heat, radioactivity and seawater likely degraded the seals on points where tubes and wires penetrated the structure–so even if there were no additional cracks from an explosion or the earthquake, the system is now almost certainly riddled with holes.

The holes pose a couple of problems: not only do they mean more contaminated water leaking into the environment, they also preclude filling the building with water to shield people and equipment from radiation. Combined with the elevated radiation readings, this will most certainly mean a considerably longer and more expensive cleanup.

And reactor 2 was considered the Fukushima unit in the best shape.

(Reactor 2 is also the unit that experienced a rapid rise in temperature and possible re-criticality in early February. TEPCO officials later attributed this finding to a faulty thermometer, but if one were skeptical of that explanation before, the new information about high radiation and low water levels should warrant a re-examination of February’s events.)

What does this all mean? Well, for Japan, it means injecting another $22 billion into Fukushima’s nominal owner, TEPCO–$12 billion just to stay solvent, and $10.2 billion to cover compensation for those injured or displaced by the nuclear crisis. That cash dump comes on top of the $18 billion already coughed up by the Japanese government, and is just a small down payment on what is estimated to be a $137 billion bailout of the power company.

It also means a further erosion of trust in an industry and a government already short on respect.

The same holds true in the US, where poor communication and misinformation left the residents of central Pennsylvania panicked and perturbed some 33 years ago, and the story is duplicated on varying scales almost weekly somewhere near one of America’s 104 aging and increasingly accident-prone nuclear reactors.

And, increasingly, residents and the state and local governments that represent them are saying “enough.” Whether it is the citizens and state officials from California’s Simi Valley demanding the real cleanup of a 53-year-old meltdown, or the people and legislature of Vermont facing off with the federal government on who has ultimate authority to assure that the next nuclear accident doesn’t happen in their backyard, Americans are looking at their future in the context of nuclear’s troubled past.

One year after Fukushima, 33 years after Three Mile Island, and 53 years after the Sodium Reactor Experiment, isn’t it time the US federal government did so, too?

Union of Concerned Scientists Report: Nuclear “Near Misses” Symptom of Failing Regulatory Regime

(image: UCS report on The NRC and Nuclear Plant Safety in 2011, detail)

In its second annual report on the safety of nuclear power facilities (PDF) in the United States, the Union of Concerned Scientists has documented 15 troubling lapses–what it calls “near misses”–at 13 of the nation’s atomic plants. The study details specific problems that still want for repairs, but, much more disturbing, it also outlines systemic flaws in America’s nuclear regulation and oversight regime.

The problems range from aging and improperly maintained safety systems to unforgivably long delays in the implementation of Nuclear Regulatory Commission rules on fire suppression and seismic security:

We found that the NRC is allowing 47 reactors to operate despite known violations of fire-protection regulations dating back to 1980. The NRC is also allowing 27 reactors to operate even though their safety systems are not designed to protect them from earthquake-related hazards identified in 1996. Eight reactors suffer from both afflictions. The NRC established safety regulations to protect Americans from the inherent hazards of nuclear power plants. However, it is simply not fulfilling its mandate when it allows numerous plant owners to violate safety regulations for long periods of time.

The report also notes instances where nuclear workers were needlessly exposed to unsafe levels of radiation, and plants where failure to follow basic protocols had rendered backup systems functionally useless.

But perhaps most alarming (if not actually surprising) were the UCS findings on how the NRC handled Component Design Bases Inspections, or CDBIs:

Inspectors are supposed to use CDBIs to determine whether owners are operating and maintaining their reactors within specifications approved during design and licensing. Some of the problems concerned containment vent valves, battery power sources, and emergency diesel generators—components that affected the severity of the disaster at the Fukushima Dai-Ichi nuclear plant in Japan.

While it was good that the NRC identified these problems, each CDBI audits only a very small sample of possible trouble spots. For example, the CDBI at the Harris nuclear plant in North Carolina examined just 31 safety-related items among literally thousands of candidates. That audit found 10 problems. Beyond ensuring that the plant’s owner corrected those 10 problems, the NRC should have insisted that it identify and correct inadequacies in the plant’s testing and inspection regimes that allowed these problems to exist undetected in the first place. The true value of the CDBIs stems from the weaknesses they reveal in the owners’ testing and inspection regimes. But that value is realized only when the NRC forces owners to remedy those weaknesses.

In other words, it’s nice that you made the good folks at Harris fix those problems, but when a preliminary audit reveals a one-third failure rate, perhaps that plant has earned itself a full top-to-bottom inspection. (The UCS goes even further, recommending that when a nuclear facility operator–like an Exelon or Entergy–has more than one plant that fails an inspection, that company’s entire fleet of reactors should be subject to NRC review.)

As a matter of fact, the Union goes so far as to criticize the NRC’s entire approach to inspections, explaining that the job of regulators is not just to catch deficiencies and fix them. The entire process, UCS stresses, should compel plant managers to operate in such a way that ensures there will be no problems to catch–and so ensures that nuclear plants operate with the safety of their employees and the community at large as a top priority.

* * *

The Union of Concerned Scientists is a great resource. They keep a close watch on the nuclear industry, and do so with an unassailable level of scientific and technical expertise. They are critical of nuclear power as it exists today, but it would be a mistake to call them anti-nuclear. They advocate for safe energy and a clean environment, but if you read their work regularly, it is hard to say they are calling for an end to a certain technology. It makes the nuclear safety paper all the more damning, but it also poses a bit of a paradox.

In fact, reading this report brings to mind the joke about the economist on the desert island. Don’t know it? It goes something like this:

A physicist, a chemist and an economist are stranded and starving on a desert island when they discover a can of soup that has washed ashore. But there’s a problem: how will they open the can?

The physicist says that with just the right length of fallen tree as a lever, and just the right sized rock as a fulcrum, they could knock the top off the can.

“Ridiculous,” says the economist, “you will either smash the can or send it flying. Either way, the soup will splatter across the beach.”

The chemist says that he can analyze the list of ingredients and calculate just how hot they need to get the can in order to expand the soup enough to blow the can open.

“Insane,” says the economist, “if the can explodes, the soup will explode with it. We’ll be lucky to salvage a spoonful.”

“OK, then,” say the physicist and chemist in unison, “what do you propose?”

The economist strikes a thoughtful pose and says, “Assume we have a can opener. . . .”

Perhaps it is not fair to compare an association of scientists to the economist in this story, but UCS goes to admirable lengths describing the repeated failures of the Nuclear Regulatory Commission–about how the NRC falls short, from rule-making, to inspections, to enforcement–and then essentially says that if America’s nuclear plants are to operate safely, the NRC needs to “aggressively enforce its safety regulations.” Assume we had a regulatory body capable of regulating.

The Union says that the nuclear regulators are not doing their job–and they go further, noting that Congress has also failed by tolerating a flaccid Nuclear Regulatory Commission–but, mirroring the report’s critique of the NRC, the UCS focuses on individual incidents without addressing the systemic problem.

The NRC has had 37 years to evolve from the advocacy-oriented Atomic Energy Commission, the regulatory body’s predecessor, and yet it is still behaving as the nuclear industry’s watchful parent, rather than its top cop. Don’t just take this report as an example (well, 15 examples), look to an in-depth investigation done last summer by the Associated Press that documented the cozy relationship between plant owners and their supposed watchdogs.

The congressional committees that are supposed to provide the NRC with oversight are dominated by politicians beholden to the nuclear lobby for campaign contributions. This winter’s attempted coup against NRC Chairman Gregory Jaczko is only the latest in a long list of Capitol Hill follies designed to distract from the problems at hand and delay any increased regulation. Indeed, the problems with lax regulation and laxer oversight have plagued the system so long, it could be argued this is not a bug (as they say), but a feature.

* * *

Calling the 15 gross failures by operators and regulators “near misses” might get headlines because it sounds so ominous, but it is possible that the rubric actually downplays the problem. “Near misses” implies a bullet dodged, a past event, but the incidents highlighted, as well as the overall critique of the process, illustrate an ongoing crisis. These are not so much “near misses” as they are disasters in waiting.

Indeed, even what the report calls “positives”–three (yes, only three) instances where NRC intervention corrected a safety problem in time to prevent an accident–seem more like lucky breaks. For example, the government forced the operators of Nebraska’s Fort Calhoun nuclear plant to improve their flood protection, and in fact, the new equipment was able to protect the facility from a massive flood last summer. But the inflatable levees that were used to keep the flood waters at bay were just barely high enough to avoid being crested, and one even sprang a leak. Had the flooding continued just a little longer, the catastrophe that the UCS report gives the NRC credit for preventing would have likely occurred.

But even if you extend credit for keeping back the flood, what if (and not to get too biblical here) it was not a flood, but a fire? Fort Calhoun is among the 47 plants listed in the report as still not meeting the decades-old fire safety standards. As someone once remarked about another nuclear plant accident, the NRC is getting “credit for the grace of God.”

Alas, God has proven to be an uneven regulator, too. Those who had the misfortune of living downwind of Three Mile Island, Chernobyl or Fukushima have learned the Lord regulates in mysterious ways. Does the Union of Concerned Scientists really believe that the Nuclear Regulatory Commission can change radically enough to force sufficient safety upgrades on US nuclear plants to assure that no Fukushima-like (or even Fukushima-light) accident will ever happen here?

It is hard to believe they do. The report’s full title, after all, is “The NRC and Nuclear Power Plant Safety: Living on Borrowed Time.”

While a stronger regulatory body is a good idea–and one strongly urged by the UCS–the report provides no way to achieve that goal. Given the problems and the history, it is hard to believe even the best scientists in the field have an answer to nuclear safety’s political impediments.

So, given that, what should be the real conclusion of the Union’s report? It would be the same as the conclusion reached by any honest observer of nuclear power: atomic power–too dirty, too dangerous, and too expensive.

In the short-term, sure, the Nuclear Regulatory Commission needs to do a better job of policing plant safety–but in the long-term, this part of the NRC’s mandate needs to disappear along with its unstable, untenable, and un-regulatable target.