Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–a scenario the NRC deems more likely than the tsunami that struck Japan in 2011–the flood at Oconee would trigger failures at all three of its reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously increased what it called acceptable exposure quite soon after the start of the Fukushima crisis. And as global background radiation levels increase as a result of the disaster, it is feared this will ratchet up what is considered “safe” in the United States, since the US tends to discuss limits in terms of exposure beyond annual average background radiation. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant, radiation biologist Ian Fairlie, observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is lots of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what has been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.
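
How much longer is a matter of simple arithmetic. The fraction of a radioisotope remaining after a time t follows directly from its half-life (a standard decay calculation; the half-lives used here, about 8 days for iodine-131 and about 30 years for cesium-137, are the commonly cited values):

$$
\frac{N(t)}{N_0} = \left(\frac{1}{2}\right)^{t/T_{1/2}},
\qquad
\underbrace{\left(\tfrac{1}{2}\right)^{80/8} \approx 0.1\%}_{\text{I-131 after 80 days}},
\qquad
\underbrace{\left(\tfrac{1}{2}\right)^{80/11{,}000} \approx 99.5\%}_{\text{Cs-137 after 80 days}}
$$

Eighty days on, in other words, the iodine signal has all but vanished, while for practical purposes all of the cesium is still there.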

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add an additional 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137–or, to put it another way, even by the low-end estimate, Fukushima has already released more than 700 times as much radioactive cesium as the Hiroshima bomb.
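
For readers keeping track of the units: a becquerel (Bq) is one radioactive decay per second, and each prefix step, from tera to peta, multiplies by 1,000. Checking the comparison against the figures cited above:

$$
64~\mathrm{PBq} = 64{,}000~\mathrm{TBq},
\qquad
\frac{64{,}000~\mathrm{TBq}}{89~\mathrm{TBq}} \approx 720
$$

The high-end estimate, 114 petabecquerels, works out to roughly 1,280 times the Hiroshima release.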

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at the Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” and that the questions should proceed from the assumed necessity of atomic energy rather than from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it doesn’t amount to even a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without even a real temporary storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear power does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Oyster Creek Nuclear Alert: As Floodwaters Fall, More Questions Arise

Oyster Creek Nuclear Generating Station in pre-flood mode. (photo: NRCgov)

New Jersey’s Oyster Creek Nuclear Generating Station remains under an official Alert, a day-and-a-half after the US Nuclear Regulatory Commission declared the emergency classification due to flooding triggered by Hurricane Sandy. An Alert is the second category on the NRC’s four-point emergency scale. Neil Sheehan, a spokesman for the federal regulator, said that floodwaters around the plant’s water intake structure had receded to 5.7 feet at 2:15 PM EDT Tuesday, down from a high of 7.4 feet reached just after midnight.

Water above 6.5 to 7 feet was expected to compromise Oyster Creek’s capacity to cool its reactor and spent fuel pool, according to the NRC. An “Unusual Event,” the first level of emergency classification, was declared Monday afternoon when floodwaters climbed to 4.7 feet.

Though an emergency pump was brought in when water rose above 6.5 feet late Monday, the NRC and plant owner Exelon have been vague about whether it was needed. As of this writing, it is still not clear if Oyster Creek’s heat transfer system is functioning as designed.

As flooding continued and water intake pumps were threatened, plant operators also floated the idea that water levels in the spent fuel pool could be maintained with fire hoses. Outside observers, such as nuclear consultant Arnie Gundersen, suspected Oyster Creek might have accomplished this by repurposing its fire suppression system (and Reuters later reported the same), though, again, neither Exelon nor regulators have given details.

Whether the original intake system or some sort of contingency is being used, it appears the pumps are being powered by backup diesel generators. Oyster Creek, like the vast majority of southern New Jersey, lost grid power as Sandy moved inland Monday night. In the event of a site blackout, backup generators are required to provide power to cooling systems for the reactor–there is no such mandate, however, for spent fuel pools. Power for pool cooling is expected to come either from the grid or the electricity generated by the plant’s own turbines.

As the NRC likes to remind anyone who will listen, Oyster Creek’s reactor was offline for fueling and maintenance. What regulators don’t add, however, is that the reactor still needs cooling for residual decay heat, and that the fuel pool likely contains more fuel and hotter fuel as a result of this procedure, which means it is even more at risk of overheating. And, perhaps most notably, with the reactor shut down, it is not producing the electricity that could be used to keep water circulating through the spent fuel pool.

If that sounds confusing, it is probably not by accident. Requests for more and more specific information (most notably by the nuclear watchdog site SimplyInfo) from Exelon and the NRC remain largely unanswered.

Oyster Creek was not the only nuclear power plant dealing with Sandy-related emergencies. As reported here yesterday, Nine Mile Point Unit 1 and Indian Point Unit 3–both in New York–each had to scram because of grid interruptions triggered by Monday’s superstorm. In addition, one of New Jersey’s Salem reactors shut down when four of six condenser circulators (water pumps that aid in heat transfer) failed “due to a combination of high river level and detritus from Hurricane Sandy’s transit.” Salem vented vapor from what are considered non-nuclear systems, though as noted often, that does not mean it is completely free of radioactive components. (Salem’s other reactor was offline for refueling.)

Limerick (PA) reactors 1 and 2, Millstone (CT) 3, and Vermont Yankee all reduced power output in response to Superstorm Sandy. The storm also caused large numbers of emergency warning sirens around both Oyster Creek and the Peach Bottom (PA) nuclear plant to fail.

If you thought all of these problems would cause nuclear industry representatives to lay low for a while, well, you’d be wrong:

“Our facilities’ ability to weather the strongest Atlantic tropical storm on record is due to rigorous precautions taken in advance of the storm,” Marvin Fertel, chief executive officer of the Nuclear Energy Institute, a Washington-based industry group, said yesterday in a statement.

Fertel went on to brag that of the 34 reactors it said were in Sandy’s path, 24 survived the storm without incident.

Or, to look at it another way, during a single day, the heavily populated eastern coast of the United States saw multiple nuclear reactors experience problems. And that’s in the estimation of the nuclear industry’s top lobbyist.

Or, should we say, the underestimation? Of the ten reactors not in Fertel’s group of 24, seven were already offline, and the industry is not counting them. So, by Fertel’s math, Oyster Creek does not figure against what he considers success. Power reductions and failed emergency warning systems are also not factored in, it appears.

This storm–and the trouble it caused for America’s nuclear fleet–comes in the context of an 18-month battle to improve nuclear plant safety in the wake of the multiple meltdowns and continuing crisis at Japan’s Fukushima Daiichi nuclear facility. Many of the rules and safety upgrades proposed by a US post-Fukushima taskforce are directly applicable to problems resulting from Superstorm Sandy. Improvements to flood preparation, backup power regimes, spent fuel storage and emergency notification were all part of the taskforce report–all of which were theoretically accepted by the Nuclear Regulatory Commission. But nuclear industry pushback–along with stonewalling, politicking and outright defiance by pro-industry commissioners–has severely slowed the execution of post-Fukushima lessons learned.

The acolytes of atom-splitting will no doubt point to the unprecedented nature of this massive hybrid storm, echoing the “who could have predicted” language heard from so many after the earthquake and tsunami that started the Fukushima disaster. Indeed, such language has already been used–though, granted, in a non-nuclear context–by Con Edison officials discussing massive power outages still afflicting New York City:

At a Consolidated Edison substation in Manhattan’s East Village, a gigantic wall of water defied elaborate planning and expectations, swamped underground electrical equipment, and left about 250,000 lower Manhattan customers without power.

Last year, the surge from Hurricane Irene reached 9.5 feet at the substation. ConEd figured it had that covered.

The utility also figured the infrastructure could handle a repeat of the highest surge on record for the area — 11 feet during a hurricane in 1821, according to the National Weather Service. After all, the substation was designed to withstand a surge of 12.5 feet.

With all the planning, and all the predictions, planning big was not big enough. Sandy went bigger — a surge of 14 feet.

“Nobody predicted it would be that high,” said ConEd spokesman Allan Drury.

In a decade that has seen most of the warmest years on record and some of the era’s worst storms, there needs to be some limit on such excuses. Nearly a million New York City residents (including this reporter) are expected to be without electricity through the end of the week. Residents in the outer boroughs and millions in New Jersey could be in the dark for far longer. Having a grid that simply survives a category 1 hurricane without a Fukushima-sized nuclear disaster is nothing to crow about.

The astronomical cost of restoring power to millions of consumers is real, as is the potential danger still posed by a number of crippled nuclear power plants. The price of preventing the current storm-related emergencies from getting worse is also not a trivial matter, nor are the radioactive isotopes vented with every emergency reactor scram. All of that should be part of the nuclear industry’s report card; all of that should raise eyebrows and questions the next time nuclear is touted as a clean, safe, affordable energy source for a climate change-challenged world.

UPDATE: The AP is reporting that the NRC has now lifted the emergency alert at Oyster Creek.

Edison Con? San Onofre Nuclear Plant Owner Proposes Reactor Restart

Containment domes or shell game? (Aerial view of San Onofre Nuclear Generating Station by Jelson25 via wikipedia)

Southern California Edison (SCE), the operator of the troubled San Onofre Nuclear Generating Station (SONGS), has proposed to restart one of the facility’s two damaged reactors without repairing or replacing the parts at the root of January’s shutdown. The Thursday announcement came over eight months after a ruptured heat transfer tube leaked radioactive steam, scramming Unit 3 and taking the entire plant offline. (Unit 2, offline for maintenance, revealed similar tube wear in a subsequent inspection; Unit 1 was taken out of service in 1992.)

But perhaps more tellingly, Edison’s plan–which must be reviewed by the Nuclear Regulatory Commission–was issued just weeks before the mandated start of hearings on rate cuts. California law requires an investigation into ratepayer relief when a facility fails to deliver electricity for nine months. Support of the zombie San Onofre plant has cost California consumers $54 million a month since the shutdown. It has been widely believed since spring that Unit 3 would likely never be able to safely generate power, and that the almost identical Unit 2 was similarly handicapped and would require a complete overhaul for its restart to even be considered.

Yet, calls for more immediate rate rollbacks were rebuffed by Edison and ignored by members of the California Public Utilities Commission (CPUC). Despite studies showing that SONGS tube wear and failure were due to bad modeling and flawed design, and a company pledge to lay off one-third of plant employees, San Onofre’s operators claimed they were still pursuing a restart.

Thursday’s proposal for that restart does not directly engage any of the concerns voiced by nuclear engineers and watchdog groups.

When SONGS installed new steam generators in 2010 and 2011, it did not replace “like with like”–that would have required a costly custom machining of parts no longer routinely manufactured. Instead, San Onofre’s owners moved to “uprate” their generators–cramming in more heat transfer tubes to increase output–with the nuclear industry equivalent of “off the shelf” parts. It was a transparently profit-driven decision, but more crucially, it was a major design change that should have required a lengthy license-amendment process at the NRC.

Federal regulators, however, took on faith industry assurances that changes were not that big a deal, and approved San Onofre’s massive retrofit without an extensive investigation into the plan.

What is now understood to have happened is that the design of new parts for San Onofre was based on flawed computer models that failed to anticipate new fluid dynamics, increased vibration, and more rapid wear in the numerous thin, metal, heat transfer tubes. It’s a flaw that presumably would have turned up in a more rigorous regulatory review, and, again, a problem not directly addressed by Edison’s restart plan.

Rather than repair or replace the damaged tubes and steam generators, San Onofre’s owners propose to simply plug the most severely degraded tubes in Unit 2 and then run that reactor at 70 percent power. After five months, Unit 2 would be shut down and inspected. (There was no plan offered for the future of Unit 3.)

Why 70 percent? Edison said it believes that would lessen vibration and decrease the rate of wear on the heat transfer tubes. Does that make any scientific sense? Not in the eyes of nuclear engineer Arnie Gundersen, who has produced three studies on San Onofre’s problems:

Restarting San Onofre without repairing the underlying problems first turns Southern California into a massive science experiment. Running the reactor at a 30 percent reduction in power may not fix the problems but rather make them worse or shift the damage to another part of the generators. It’s a real gamble to restart either unit without undertaking repairs or replacing the damaged equipment.

S. David Freeman, former head of the Los Angeles Department of Water and Power, as well as the Tennessee Valley Authority, and now a senior advisor to Friends of the Earth, is even more pointed:

Neither of the reactors at San Onofre are safe to operate. While Edison may be under financial pressure to get one up and running, operating this badly damaged reactor at reduced power without fixing or replacing these leaky generators is like driving a car with worn-out brakes but promising to keep it under 50 miles an hour.

That is the scenario now before the NRC. An experimental roll of the dice within 50 miles of 8.4 million California residents, offered up with a “trust us” by the same folks who got the modeling dangerously wrong last time, versus multiple studies calling into question the viability of a plant that already has a long history of safety and engineering problems. Regulators are at least talking as if they understand:

“The agency will not permit a restart unless and until we can conclude the reactor can be operated safely,” NRC Chairman Allison Macfarlane said. “Our inspections and review will be painstaking, thorough and will not be rushed.”

The right words, but hardly reassuring ones given the commission’s past actions (or inactions) on San Onofre and numerous other dangerous events across the nation’s aging nuclear fleet.

The sting that keeps on stinging

But does NRC approval really matter to Southern California Edison, at least in the short run?

Operating only one of San Onofre’s reactors at two-thirds of its proposed output for five months sometime next year–which is the best-case scenario–does not provide a meaningful addition to California’s near- or long-term energy outlook. (California officials are already making plans for another year without San Onofre.) In addition, San Onofre has other problems to address, such as the aforementioned staffing issues, new seismic evaluations required in the wake of the Fukushima disaster, newfound safety lapses, and ongoing concerns about the quality of the concrete used to plug 28-foot holes in both reactors’ containment domes (the holes were cut for installation of the new steam generators; inquiries about the strength and durability of the concrete were made a year ago, but, to date, the NRC has not released a report).

But Thursday’s proposal does provide Edison with a modicum of cover going into an October 9 public information session and the upcoming debate over whether California consumers should still have to pay for a power plant that provides no power.

Indeed, billing for services not rendered could be considered a profit center for the US nuclear industry. San Onofre is but one case; ratepayers in Florida are also familiar with the scam.

The same day SCE submitted its SONGS plan, attorneys for the Florida Public Service Commission (PSC), Progress Energy and Florida Power & Light (FPL), appeared before the Florida Supreme Court to defend an “advance fee” that has allowed the utilities to soak Sunshine State ratepayers for upwards of $1 billion. The money collected, and additional fees approved last year by the PSC, are slated for the construction of new nuclear reactors in Levy County and at Turkey Point.

The court challenge was brought by the Southern Alliance for Clean Energy, which contends there is little evidence Progress or FPL can or ever really intend to build the new facilities. Indeed, FPL has spent some of its takings on existing operations, while Progress has blown hundreds of millions of dollars trying to repair its Crystal River nuclear plant, which has been offline since 2009, and likely will never return to service.

What do attorneys for the utilities say when challenged on these points? That their intent is borne out by the fact that both are still seeking construction and operating licenses from the Nuclear Regulatory Commission.

There is no indication NRC approval on those projects is imminent (in fact, no NRC approvals of any projects are imminent), nor are there any guarantees that the projects could be fully financed even with licenses and all that ratepayer cash.

But, be it for future fantasies or current failures, from Florida to California, electricity consumers are paying higher prices to perpetuate the myth of a nuclear renaissance and balance the books of the nuclear industry. . . while industry officials, lobbyists and favored politicians pocket a healthy share.

And not satisfied with that cushy arrangement, San Onofre’s operators are also pushing for permission to move its ratepayer-financed decommissioning fund into riskier investment properties. The industry promises this will bring higher yields, but, of course, it also chances bigger losses–and it guarantees larger fees, which would be passed on to Southern California consumers upon CPUC approval.

None of these actions–not the investment games, the rate hikes or the experiment with San Onofre’s damaged reactor–are actually about providing a steady supply of safe, affordable energy. These are all pecuniary plays. Across the country and across the board, nuclear operators seem more interested in cashing in than putting out.

More prudent for governments and utility commissions, and more beneficial for ratepayers, of course, would be to stop paying the vig to nuclear’s loan sharks, stop throwing good money after bad in a sector that is dying and dangerous, and start making investments in truly clean, truly renewable, and increasingly far more economical 21st Century energy technologies.

Until that happens, the most profitable thing about nuclear power will continue to be the capacity to charge for a service that might never be provided. Private utilities have understood this for a long time; ratepayers are becoming painfully aware of it, too. The question is, when will government regulators and utility commissions understand it–or at least fess up to being in on the con?

* * * *

Stop the Madness! Or at least learn more about it. Join me on Saturday, October 13, at 5 PM Eastern time (2 PM Pacific) when I host an FDL Book Salon chat with Joseph Mangano, author of Mad Science: The Nuclear Power Experiment.

Something Fishy: CRS Report Downplays Fukushima’s Effect on US Marine Environment

(photo: JanneM)

Late Thursday, the United States Coast Guard reported that they had successfully scuttled the Ryou-Un Maru, the Japanese “Ghost Ship” that had drifted into US waters after being torn from its moorings by the tsunami that followed the Tohoku earthquake over a year ago. The 200-foot fishing trawler, which was reportedly headed for scrap before it was swept away, was seen as potentially dangerous as it drifted near busy shipping lanes.

Coincidentally, the “disappearing” of the Ghost Ship came during the same week the Congressional Research Service (CRS) released its report on the effects of the Fukushima Daiichi nuclear disaster on the US marine environment, and, frankly, the metaphor couldn’t be more perfect. The Ryou-Un Maru is now resting at the bottom of the ocean–literally nothing more to see there, thanks to a few rounds from a 25mm Coast Guard gun–and the CRS hopes to dispatch fears of the radioactive contamination of US waters and seafood with the same alacrity.

But while the Ghost Ship was not considered a major ecological threat (though it did go down with around 2,000 gallons of diesel fuel in its tanks), the US government acknowledges that this “good luck ship” (a rough translation of its name) is an early taste of the estimated 1.5 million tons of tsunami debris expected to hit North American shores over the next two or three years. Similarly, the CRS report (titled Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment [PDF]) adopts an overall tone of “no worries here–it’s all under control,” but a closer reading reveals hints of “more to come.”

Indeed, the report feels as if it were put through a political rinse cycle, limited both in the strength of its language and the scope of its investigation. This tension is evident right from the start–take, for example, these three paragraphs from the report’s executive summary:

Both ocean currents and atmospheric winds have the potential to transport radiation over and into marine waters under U.S. jurisdiction. It is unknown whether marine organisms that migrate through or near Japanese waters to locations where they might subsequently be harvested by U.S. fishermen (possibly some albacore tuna or salmon in the North Pacific) might have been exposed to radiation in or near Japanese waters, or might have consumed prey with accumulated radioactive contaminants.

High levels of radioactive iodine-131 (with a half-life of about 8 days), cesium-137 (with a half-life of about 30 years), and cesium-134 (with a half-life of about 2 years) were measured in seawater adjacent to the Fukushima Dai-ichi site after the March 2011 events. EPA rainfall monitors in California, Idaho, and Minnesota detected trace amounts of radioactive iodine, cesium, and tellurium consistent with the Japanese nuclear incident, at concentrations below any level of concern. It is uncertain how precipitation of radioactive elements from the atmosphere may have affected radiation levels in the marine environment.

Scientists have stated that radiation in the ocean very quickly becomes diluted and would not be a problem beyond the coast of Japan. The same is true of radiation carried by winds. Barring another unanticipated release, radioactive contaminants from Fukushima Dai-ichi should be sufficiently dispersed over time that they will not prove to be a serious health threat elsewhere, unless they bioaccumulate in migratory fish or find their way directly to another part of the world through food or other commercial products.

Winds and currents have “the potential” to transport radiation into US waters? Winds–quite measurably–already have, and computer models show that currents, over the next couple of years, most certainly will.

Are there concentrations of radioisotopes that are “below concern?” No reputable scientist would make such a statement. And if monitors in the continental United States detected radioactive iodine, cesium and tellurium in March 2011, then why did they stop the monitoring (or at least stop reporting it) by June?

The third paragraph, however, wins the double-take prize. Radiation would not be a problem beyond the coast? Fish caught hundreds of miles away would beg to differ. “Barring another unanticipated release. . . ?” Over the now almost 13 months since the Fukushima crisis began, there have been a series of releases into the air and into the ocean–some planned, some perhaps unanticipated at the time–but overall, the pattern is clear: radioactivity continues to enter the environment at unprecedented levels.

And radioactive contaminants “should be sufficiently dispersed over time, unless they bioaccumulate?” Unless? Bioaccumulation is not some crazy, unobserved hypothesis; it is a documented biological process. Bioaccumulation will happen–it will happen in migratory fish and it will happen as under-policed food and commercial products (not to mention that pesky debris) make their way around the globe.

Maybe that is supposed to be read by inquiring minds as the report’s “please ignore the man behind the curtain” moment–an intellectual out clause disguised as an authoritative analgesic–but there is no escaping the intent. Though filled with caveats and counterfactuals, the report is clearly meant to serve as a sop to those alarmed by the spreading ecological catastrophe posed by the ongoing Fukushima disaster.

The devil is in the details–the dangers are in the data

Beyond the wiggle words, perhaps the most damning indictment of the CRS marine radiation report can be found in the footnotes–or, more pointedly, in the dates of the footnotes. Though this report was released over a year after the Tohoku earthquake and tsunami triggered the Fukushima nightmare, the CRS bases the preponderance of its findings on information generated during the disaster’s first month. In fact, of the document’s 29 footnotes, only a handful date from after May 2011–one of those points to a CNN report (authoritative!), one to a status update on the Fukushima reactor structures, one confirms the value of Japanese seafood imports, three are items tracking the tsunami debris, and one directs readers to a government page on FDA radiation screening, the pertinent part of which was last updated on March 28 of last year.

Most crucially, the parts of the CRS paper that downplay the amounts of radiation measured by domestic US sensors all cite data collected within the first few weeks of the crisis. The point about radioisotopes being “below any level of concern” comes from an EPA news release dated March 22, 2011–eleven days after the earthquake, only six days after the last reported reactor explosion, and well before so many radioactive releases into the air and ocean. It is like taking reports of only minor flooding from two hours after Hurricane Katrina passed over New Orleans, and using them as the standard for levee repair and gulf disaster planning (perhaps not the best example, as many have critiqued levee repairs for their failure to incorporate all the lessons learned from Katrina).

It now being April of 2012, much more information is available, and clearly any report that expects to be called serious should have included at least some of it.

By October of last year, scientists were already doubling their estimates of the radiation pushed into the atmosphere by the Daiichi reactors, and in early November, as reported here, France’s Institute for Radiological Protection and Nuclear Safety issued a report showing the amount of cesium-137 released into the ocean was 30 times greater than what was stated by TEPCO in May. Shockingly, the Congressional Research Service does not reference this report.

Or take the early March 2012 revelation that seaweed samples collected off the coast of southern California show levels of radioactive iodine-131 that are 500 percent higher than those from anywhere else in the US or Canada. It should be noted that this is the result of airborne fallout–the samples were taken in mid-to-late-March 2011, much too soon for water-borne contamination to have reached that area–and so serves to confirm models that showed a plume of radioactive fallout with the greatest contact in central and southern California. (Again, this specific report was released a month before the CRS report, but the data it uses were collected over a year ago.)

Then there are the food samples taken around Japan over the course of the last year showing freshwater and sea fish–some caught over 200 kilometers from Fukushima–with radiation levels topping 100 becquerels per kilogram (one topping 600 Bq/kg).

And the beat goes on

This information, and much similar to it, was all available before the CRS released its document, but the report also operates in a risibly artificial universe that assumes the situation at Fukushima Daiichi has basically stabilized. As a sampling of pretty much any week’s news will tell you, it has not. Take, for example, this week:

About 12 tons of water contaminated with radioactive strontium are feared to have leaked from the Fukushima No. 1 plant into the Pacific Ocean, Tepco said Thursday.

The leak occurred when a pipe broke off from a joint while the water was being filtered for cesium, Tokyo Electric Power Co. said.

The system doesn’t remove strontium, and most of the water apparently entered the sea via a drainage route, Tepco added.

The water contained 16.7 becquerels of cesium per cu. centimeter and tests are under way to determine how much strontium was in it, Tepco said.
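
For scale, a rough back-of-the-envelope figure, taking Tepco’s numbers at face value and assuming the contaminated water has approximately the density of fresh water (1 gram per cubic centimeter): 12 metric tons is about 1.2×10⁷ cubic centimeters, so

$$
1.2\times10^{7}~\mathrm{cm}^3 \times 16.7~\mathrm{Bq/cm}^3 \approx 2\times10^{8}~\mathrm{Bq}
$$

That is roughly 200 million becquerels of radioactive cesium in this single leak, before the strontium, the contaminant of chief concern here, is even counted.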

This is the second such leak in less than two weeks, and as Kazuhiko Kudo, a professor of nuclear engineering at Kyushu University who visited Fukushima Daiichi twice last year, noted:

There will be similar leaks until Tepco improves equipment. The site had plastic pipes to transfer radioactive water, which Tepco officials said are durable and for industrial use, but it’s not something normally used at nuclear plants. Tepco must replace it with metal equipment, such as steel.

(The plastic tubes–complete with the vinyl and duct tape patch–can be viewed here.)

And would that the good people at the Congressional Research Service could have waited to read a report that came out the same day as theirs:

Radioactive material from the Fukushima nuclear disaster has been found in tiny sea creatures and ocean water some 186 miles (300 kilometers) off the coast of Japan, revealing the extent of the release and the direction pollutants might take in a future environmental disaster.

In some places, the researchers from Woods Hole Oceanographic Institution (WHOI) discovered cesium radiation hundreds to thousands of times higher than would be expected naturally, with ocean eddies and larger currents both guiding the “radioactive debris” and concentrating it.

Or would that the folks at CRS had looked to their fellow government agencies before they went off half-cocked. (The study above was done by researchers at Woods Hole and written up in the Proceedings of the National Academy of Sciences.) In fact, it appears the CRS could have done that. In its report, CRS mentions that “Experts cite [Fukushima] as the largest recorded release of radiation to the ocean,” and the source for that point is a paper by Ken Buesseler–the same Ken Buesseler who led the WHOI study. Imagine what could have been if the Congressional Research Service had actually contacted the original researcher.

Can openers all around

Or perhaps it wouldn’t have mattered. For if there is one obvious takeaway from the CRS paper–one that seems meant to absolve it of its limited scope and authority and all its other oversights–it is its unfailing confidence in government oversight.

Take a gander at the section under the bolded question “Are there implications for US seafood safety?”:

It does not appear that nuclear contamination of seafood will be a food safety problem for consumers in the United States. Among the main reasons are that:

  • damage from the disaster limited seafood production in the affected areas,
  • radioactive material would be diluted before reaching U.S. fishing grounds, and
  • seafood imports from Japan are being examined before entry into the United States.

According to the U.S. Food and Drug Administration (FDA), because of damage from the earthquake and tsunami to infrastructure, few if any food products are being exported from the affected region. For example, according to the National Federation of Fisheries Cooperative Associations, the region’s fishing industry has stopped landing and selling fish. Furthermore, a fishing ban has been enforced within a 2-kilometer radius around the damaged nuclear facility.

So, the Food and Drug Administration is relying on the word of an industry group and a Japanese government-enforced ban that encompasses a two-kilometer radius–what link of that chain is supposed to be reassuring?

Last things first: two kilometers? Well, perhaps the CRS should hire a few proofreaders. A search of the source materials finds that the ban is supposed to be 20 kilometers. Indeed, the Japanese government quarantined the land within a 20-kilometer radius. The US suggested evacuation from a 50-mile (80-kilometer) radius. The CRS’s own report notes contaminated fish were collected 30 kilometers from Fukushima. So why is even 20 kilometers suddenly a radius to brag about?

As for a damaged industry not exporting, numerous reports show the Japanese government stepping in to remedy that “problem.” From domestic PR campaigns encouraging the consumption of foodstuffs from Fukushima prefecture, to the Japanese companies selling food from the region to other countries at deep discounts, to the Japanese government setting up internet clearing houses to help move tainted products, all signs point to a power structure that sees exporting possibly radioactive goods as essential to its survival.

The point on dilution, of course, not only ignores the way many large scale fishing operations work, it ignores airborne contamination and runs counter to the report’s own acknowledgment of bioaccumulation.

But maybe the shakiest assertion of all is that the US Food and Drug Administration will stop all contaminated imports at the water’s edge. While imports hardly represent the total picture when evaluating US seafood safety, even this small slice of the problem raises eyebrows.

First there is the oft-referenced point from nuclear engineer Arnie Gundersen, who said last summer that State Department officials told him of a secret agreement between Japan and Secretary Hillary Clinton guaranteeing the continued importation of Japanese food. While independent confirmation of this pact is hard to come by, there is the plain fact that, beyond bans on milk, dairy products, fruits and vegetables from the Fukushima region issued in late March 2011, the US has proffered no other restrictions on Japanese food imports (and those few restrictions for Japanese food were lifted for US military commissaries in September).

And perhaps most damning, there was the statement from an FDA representative last April declaring that North Pacific seafood was so unlikely to be contaminated that “no sampling or monitoring of our fish is necessary.” The FDA said at the time that it would rely on the National Oceanic and Atmospheric Administration (NOAA) to tell it when it should consider testing seafood, but a NOAA spokesperson said it was the FDA’s call.

Good. Glad that’s been sorted out.

The Congressional Research Service report seems to fall victim to a problem noted often here–they assume a can opener. As per the joke, the writers stipulate a functioning mechanism before explaining their solution. Just as many who argue the hypothetical safety of nuclear power assume a functioning regulatory process (as opposed to a captured Nuclear Regulatory Commission, an industry-friendly Department of Energy, and industry-purchased members of Congress), the CRS here assumes an FDA interested first and foremost in protecting the general public, instead of an agency trying to strike some awkward “balance” between health, profit and politics. The can opener story is a joke; the effects of this real-life example are not.

Garbage in, garbage out

The Congressional Research Service, a part of the Library of Congress, is intended to function as the research and analysis wing of the US Congress. It is supposed to be objective, it is supposed to be accurate, and it is supposed to be authoritative. America needs the CRS to be all of those things because the agency’s words are expected to inform federal legislation. When the CRS shirks its responsibility, shapes its words to fit comfortably into the conventional wisdom, or shaves off the sharp corners to curry political favor, the impact is more than academic.

When the CRS limits its scope to avoid inconvenient truths, it bears false witness to the most important events of our time. When the CRS pretends other government agencies are doing their jobs–despite documentable evidence to the contrary–it is not performing its own. And when the CRS issues a report that ignores the data and the science so that a few industries might profit, it is America that loses.

The authors of this particular report might not be around when the bulk of the cancers and defects tied to the radiation from Fukushima Daiichi present in the general population, but this paper’s integrity today could influence those numbers tomorrow. Bad, biased, or bowdlerized advice could scuttle meaningful efforts to make consequential policy.

If the policy analysts who sign their names to reports like this don’t want their work used for scrap paper, then maybe they should take a lesson from the Ryou-Un Maru. Going where the winds and currents take you makes you at best a curiosity, and more likely a nuisance–just so much flotsam and jetsam getting in the way of actual business. Works of note come with moral rudders, anchored to the best data available; without that, the report might as well just say “good luck.”

Looking Back at Our Nuclear Future

The Los Angeles Times heralds the nuclear age in January 1957. (photo via wikipedia)

On March 11, communities around the world commemorated the first anniversary of the still-evolving Fukushima Daiichi nuclear disaster with rallies, marches, moments of silence, and numerous retrospective reports and essays (including one here). But 17 days later, another anniversary passed with much less fanfare.

It was in the early morning hours of March 28, 1979, that a chain of events at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania, caused what is known as a “loss of coolant accident,” resulting in a partial core meltdown, a likely hydrogen explosion, the venting of some amount of radioisotopes into the air, and the dumping of 40,000 gallons of radioactive wastewater into the Susquehanna River. TMI (as it is sometimes abbreviated) is often called America’s worst commercial nuclear accident, and though the nuclear industry and its acolytes have worked long and hard to downplay any adverse health effects stemming from the mishap, the fact is that what happened in Pennsylvania 33 years ago changed the face and future of nuclear power.

The construction of new nuclear power facilities in the US was already in decline by the mid-1970s, but the Three Mile Island disaster essentially brought all new projects to a halt. No construction licenses were granted to new nuclear plants from the time of TMI until February of this year, when the NRC gave a hasty go-ahead to two reactors slated for the Vogtle facility in Georgia. And though health and safety concerns certainly played a part in this informal moratorium, cost had at least an equal role. The construction of new plants proved more and more expensive, never coming in on time or on budget, and the cleanup of the damaged unit at Three Mile Island took 14 years and cost over $1 billion. Even with the Price-Anderson Act limiting the industry’s liability, nuclear power plants are considered such bad risks that no financing can be secured without federal loan guarantees.

In spite of that–or because of that–the nuclear industry has pushed steadily over the last three decades to wring every penny out of America’s aging reactors, pumping goodly amounts of its hefty profits into lobbying efforts and campaign contributions designed to capture regulators and elected officials and propagate the age-old myth of an energy source that is clean, safe, and, if not exactly “too cheap to meter,” at least impressively competitive with other options. The result is a fleet of over 100 reactors nearing the end of their design lives–many with documented dangers and potential pitfalls that could rival TMI–now seeking and regularly getting license extensions from the Nuclear Regulatory Commission while that same agency softens and delays requirements for safety upgrades.

And all of that cozy cooperation between government and big business goes on with the nuclear industry pushing the idea of a “nuclear renaissance.” In the wake of Fukushima, the industry has in fact increased its efforts, lobbying the US and British governments to downplay the disaster, and working with its mouthpieces in Congress and on the NRC to try to kill recommended new regulations and force out the slightly more safety-conscious NRC chair. And, just this month, the Nuclear Energy Institute, the chief nuclear trade group, moved to take its message to what might be considered a less friendly cohort, launching a splashy PR campaign by underwriting public radio broadcasts and buying time for a fun and funky 60-second animated ad on The Daily Show.

All of this is done with the kind of confidence that only comes from knowing you have the money to move political practice and, perhaps, public opinion. Three Mile Island is, to the industry, the exception that proves the rule–if not an out-and-out success. “No one died,” you will hear–environmental contamination and the latest surveys showing increased rates of leukemia some 30 years later be damned–and you will hear that TMI is the only major accident in over half a century of domestic nuclear power generation.

Of course, this is not even remotely true–names like Browns Ferry, Cooper, Millstone, Indian Point and Vermont Yankee come to mind–but even if you discount plant fires and tritium leaks, Three Mile Island is not even America’s only meltdown.

There is, of course, the 1966 accident at Michigan’s Enrico Fermi Nuclear Generating Station, chronicled in the John Grant Fuller book We Almost Lost Detroit, but atom-lovers will dismiss this because Fermi 1 was an experimental breeder reactor, so it is not technically a “commercial” nuclear accident.

But go back in time another seven years–a full 20 before TMI–and the annals of nuclear power contain the troubling tale of another partial meltdown, one that, coincidentally, is back in the news this week, almost 53 years later.

The Sodium Reactor Experiment

On July 12, 1957, the Sodium Reactor Experiment (SRE) at the Santa Susana Field Laboratory near Simi Valley, California, became the first US nuclear reactor to produce electricity for a commercial power grid. SRE was a sodium-cooled reactor designed by Atomics International, a division of North American Aviation, a company more often known by the name of its other division, Rocketdyne. Southern California Edison used the electricity generated by SRE to light the nearby town of Moorpark.

Sometime during July 1959–the exact date is still not entirely clear–a lubricant used to cool the seals on the pump system seeped into the primary coolant, broke down in the heat, and formed a compound that clogged the cooling channels. Whether out of curiosity or ignorance, operators continued to run the SRE despite wide fluctuations in core temperature and generating capacity.

Following a pattern that is now all too familiar, increased temperatures caused increased pressure, necessitating what was even then called a “controlled venting” of radioactive vapor. How much radioactivity was released into the environment is a matter of some debate, for, in 1959, there was less monitoring and even less transparency. Current reconstructions, however, suggest the release may have been as much as 450 times greater than what was vented at Three Mile Island.

When the reactor was finally shut down and the fuel rods were removed (a trick in itself, as some were stuck and others broke), over a quarter of the rods showed signs of melting.

The SRE was eventually repaired and restarted in 1960, running on and off for another four years. Decommissioning began in 1976, and was finished in 1981, but the story doesn’t end there. Not even close.

Fifty-three years after a partial nuclear meltdown at the Santa Susana Field Laboratory site in the Chatsworth Hills, the U.S. Environmental Protection Agency has just released data finding extensive radioactive contamination still remains at the accident site.

“This confirms what we were worried about,” said Assemblywoman Julia Brownley, D-Oak Park, a long-time leader in the fight for a complete and thorough cleanup of this former Rocketdyne rocket engine testing laboratory. “This begins to answer critical questions about what’s still up there, where, how much, and how bad?”

Well, it sort of begins to answer them.

New soil samples weigh in at up to 1,000 times the radiation trigger levels (RTLs) agreed to when the Department of Energy struck a cleanup deal with the California Department of Toxic Substances Control in 2010. What’s more, these measurements follow two previous cleanup efforts by the DOE and Boeing, the company that now owns Santa Susana.

In light of the new findings, Assemblywoman Brownley has called on the DOE to comply with the agreement and do a real and thorough cleanup of the site. That means taking radiation levels down to the established natural background readings for the area. But that, as local reporter Michael Collins notes, “may be easier said than done”:

This latest U.S. EPA information appears to redefine what cleaning up to background actually is. Publicly available documents show that the levels of radiation in this part of Area IV where the SRE once stood are actually many thousands of times more contaminated than previously thought.

Just as troubling, the EPA’s RTLs, which are supposed to mirror the extensively tested and reported-on backgrounds of the numerous radionuclides at the site, were many times over the background threshold values (BTVs). So instead of cleaning up to background, much more radiation would be left in the ground, saving the government and lab owner Boeing millions in cleanup.

It is a disturbing tale of what Collins calls a kind of environmental “bait and switch” (of which he provides even more detail in an earlier report), but after a year of documenting the mis- and malfeasance of the nuclear industry and its supposed regulators, it is, to us here, anyway, not a surprising one.
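
For readers who want the arithmetic of that “bait and switch” spelled out, here is a minimal sketch in Python. Only the 1,000-times-RTL soil reading comes from the reporting above; the background value and the RTL-to-background multiplier are hypothetical placeholders, since the site-specific numbers vary by radionuclide.

```python
# Illustrative only: how a trigger level set above true background
# lets a "cleanup to background" leave contamination behind.
# The 1,000x-RTL sample figure is from the reporting above; the
# background value and the 10x multiplier are invented for clarity.

background_btv = 1.0            # background threshold value (arbitrary units)
rtl = 10.0 * background_btv     # trigger level set "many times over" background
worst_sample = 1_000 * rtl      # soil samples ran up to 1,000 times the RTL

print(f"Worst sample: {worst_sample / background_btv:,.0f}x background")
print(f"Cleanup to the RTL still leaves {rtl / background_btv:.0f}x background in the soil")
```

Whatever the true multipliers turn out to be, the structure of the problem is the same: any “cleanup” benchmarked to a trigger level above background leaves the difference in the ground.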

To the atom-enamored, it is as if facts have a half-life all their own. The pattern is always the same: swear that an event is no big deal, then come back with revision after revision, each admitting a little bit more, in a seemingly never-ending regression toward what might approximately describe a terrible reality. It would be reminiscent of the “mom’s on the roof” joke if anyone actually believed that nuclear operators and their chummy government minders ever intended to eventually relay the truth.

Fukushima’s latest surprise

Indeed, that unsettling pattern is again visible in the latest news from Japan. This week saw revelations that radiation inside Fukushima Daiichi’s reactor 2 containment vessel clocked in at levels seriously higher than previously thought, while water levels are seriously lower.

An endoscopic camera, thermometer, water gauge and dosimeter were inserted into the number 2 reactor containment, and they documented radiation levels of up to 70 sieverts per hour–not only seven times the previous highest measurement, but 10 times what is considered a fatal dose (at 70 Sv/hr, a lethal 7-sievert exposure accumulates in about six minutes).

The water level inside the containment vessel, estimated to be at 10 meters when the Japanese government declared a “cold shutdown” in December, turns out to be no more than 60 centimeters (about two feet).

This is disquieting news for many reasons. First, the high radiation not only makes it impossible for humans to get near the reactor, it renders current robotic technology impractical as well. The camera, for instance, would only last 14 hours in those conditions. If the molten core is to be removed, a new class of radiation-resistant robots will have to be developed.
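
To make those numbers concrete, here is a back-of-the-envelope sketch in Python. The 70 Sv/hr reading, the roughly 7-sievert fatal dose implied above, and the camera’s 14-hour lifespan come from the reporting; everything else is simple division, not a health-physics model.

```python
# Back-of-the-envelope arithmetic for the reactor 2 readings above.
# Figures (70 Sv/hr, ~7 Sv fatal dose, 14-hour camera life) come from
# the reporting; this is illustrative only.

DOSE_RATE_SV_PER_HR = 70.0  # peak reading inside the containment vessel
FATAL_DOSE_SV = 7.0         # approximate fatal whole-body dose
CAMERA_LIFE_HR = 14.0       # how long the endoscopic camera can operate

def minutes_to_dose(dose_sv: float, rate_sv_per_hr: float) -> float:
    """Minutes needed to accumulate a given dose at a constant dose rate."""
    return dose_sv / rate_sv_per_hr * 60.0

print(f"Fatal dose accumulates in ~{minutes_to_dose(FATAL_DOSE_SV, DOSE_RATE_SV_PER_HR):.0f} minutes")
print(f"Camera absorbs ~{DOSE_RATE_SV_PER_HR * CAMERA_LIFE_HR:.0f} Sv before failing")
```

In other words, hardened electronics absorb in half a day nearly a thousand sieverts–on the order of a hundred times what would kill a person in minutes–and still fail; hence the need for an entirely new class of machines.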

The extremely low water levels signal more troubling scenarios. Though some experts believe that the fuel rods have melted down or melted through to such an extent that two feet of water can keep them covered, the reading likely indicates a breach or breaches of the containment vessel. Plant workers, after all, have been pumping water into the reactor constantly for months (why no one noticed that they kept having to add water to the system, or why no one cared, is plenty disturbing, as is the question of where all that extra water has gone).

Arnie Gundersen of nuclear engineering consultancy Fairewinds Associates believes that the level of water roughly corresponds with the lower lip of the vessel’s suppression pool–further evidence that reactor 2 suffered a hydrogen explosion, as did two other units at Fukushima. Gundersen also believes that the combination of heat, radioactivity and seawater likely degraded the seals on points where tubes and wires penetrated the structure–so even if there were no additional cracks from an explosion or the earthquake, the system is now almost certainly riddled with holes.

The holes pose a couple of problems: not only do they mean more contaminated water leaking into the environment, they also preclude filling the building with water to shield people and equipment from radiation. Combined with the elevated radiation readings, this will almost certainly mean a considerably longer and more expensive cleanup.

And reactor 2 was considered the Fukushima unit in the best shape.

(Reactor 2 is also the unit that experienced a rapid rise in temperature and possible re-criticality in early February. TEPCO officials later attributed this finding to a faulty thermometer, but if one were skeptical of that explanation before, the new information about high radiation and low water levels should warrant a re-examination of February’s events.)

What does this all mean? Well, for Japan, it means injecting another $22 billion into Fukushima’s nominal owner, TEPCO–$12 billion just to keep the company solvent, and $10.2 billion to cover compensation for those injured or displaced by the nuclear crisis. That cash infusion comes on top of the $18 billion already coughed up by the Japanese government, and is just a small down payment on what is estimated to be a $137 billion bailout of the power company.

It also means a further erosion of trust in an industry and a government already short on respect.

The same holds true in the US, where poor communication and misinformation left the residents of central Pennsylvania panicked and perturbed some 33 years ago, and where the story is duplicated on varying scales almost weekly somewhere near one of America’s 104 aging and increasingly accident-prone nuclear reactors.

And, increasingly, residents and the state and local governments that represent them are saying “enough.” Whether it is citizens and state officials from California’s Simi Valley demanding a real cleanup of a 53-year-old meltdown, or the people and legislature of Vermont facing off with the federal government over who has ultimate authority to ensure that the next nuclear accident doesn’t happen in their backyard, Americans are looking at their future in the context of nuclear’s troubled past.

One year after Fukushima, 33 years after Three Mile Island, and 53 years after the Sodium Reactor Experiment, isn’t it time the US federal government did so, too?