The Brief Wondrous Life (and Long Dangerous Half-Life) of Strontium-90

At roughly 5:30 in the morning on July 16, 1945, an implosion-design plutonium device, codenamed “the gadget,” exploded over the Jornada del Muerto desert in south-central New Mexico with a force equivalent to about 20,000 tons of TNT. It was the world’s first test of an atomic bomb, and as witnesses at base camp some ten miles away would soon relay to US President Harry Truman, the results were “satisfactory” and exceeded expectations. Within weeks, the United States would use a uranium bomb of a different design on the Japanese city of Hiroshima, and three days after that, a plutonium device similar to the gadget was dropped on Nagasaki, about 200 miles to the southwest.

Though Hiroshima and Nagasaki are the only instances where atomic weapons were used against a wartime enemy, between 1945 and 1963, the world experienced hundreds upon hundreds of nuclear weapons tests, the great majority of which were above ground or in the sea–in other words, in the atmosphere. The US tested atom and hydrogen bombs in Nevada, at the Nevada Test Site, and in the Pacific Ocean, on and around the Marshall Islands, in an area known as the Pacific Proving Grounds. After the Soviet Union developed its own atomic weapon in 1949, it carried out hundreds of similar explosions, primarily in Kazakhstan, and the UK performed more than 20 of its own atmospheric nuclear tests, mostly in Australia and the South Pacific, between 1952 and 1958.

Though military authorities and officials with the US Atomic Energy Commission initially downplayed the dispersal and dangers of fallout from these atmospheric tests, by the early 1950s, scientists in nuclear and non-nuclear countries alike began to raise concerns. Fallout from atmospheric tests was not confined to the blast radius or a region near the explosion; instead, the products of fission and un-fissioned nuclear residue were essentially vaporized by the heat, carried up into the stratosphere, swept across the globe, and eventually returned to earth in precipitation. A host of radioactive isotopes contaminated land and surface water, entering the food chain through farms and dairies.

The tale of the teeth

In order to demonstrate that fallout was widespread and had worked its way into the population, a group of researchers, headed by Dr. Barry Commoner and Drs. Louise and Eric Reiss, founded the Baby Tooth Survey under the auspices of Washington University (where Commoner then taught) and the St. Louis Citizens’ Committee for Nuclear Information. The tooth survey sought to track strontium-90 (Sr-90), a radioactive isotope of the alkaline earth metal strontium, which occurs as a result–and only as a result–of nuclear fission. Sr-90 is structurally similar to calcium, and so, once in the body, works its way into bones and teeth.

While harvesting human bones was impractical, researchers realized that baby teeth should be readily available. Most strontium in baby teeth would transfer from mother to fetus during pregnancy, and so birth records would provide accurate data about where and when those teeth were formed. The tooth survey collected baby teeth, initially from the St. Louis area, eventually from around the globe, and analyzed them for strontium.

By the early ’60s, the program had collected well over a quarter-million teeth, and ultimately found that children born in St. Louis in 1963 had 50 times more Sr-90 in their teeth than children born in 1950. Armed with preliminary results from this survey and a petition signed by thousands of scientists worldwide, Dr. Commoner successfully lobbied President John F. Kennedy to negotiate and sign the Partial Test Ban Treaty, halting atmospheric nuclear tests by the US, UK and USSR. By the end of the decade, strontium-90 levels in newly collected baby teeth were substantially lower than the ’63 samples.

The initial survey, which ended in 1970, continues to have relevance today. Some 85,000 teeth not used in the original project were turned over to researchers at the Radiation and Public Health Project (RPHP) in 2001. The RPHP study, released in 2010, found that donors from the Baby Tooth Survey who had died of cancer before age 50 averaged over twice the Sr-90 in their samples compared with those who had lived past their 50th birthday.

But the perils of strontium-90–or, indeed, a host of radioactive isotopes that are strontium’s travel companions–did not cease with the ban on atmospheric nuclear tests. Many of the hazards of fallout could also be associated with the radiological pollution that is part-and-parcel of nuclear power generation. The controlled fission in a nuclear reactor produces all of the elements created in the uncontrolled fission of a nuclear explosion. This point was brought home by the RPHP work, when it found strontium-90 was 30- to 50-percent higher in baby teeth collected from children born in “nuclear counties,” the roughly 40 percent of US counties situated within 100 miles of a nuclear power plant or weapons lab.

Similar baby teeth research has been conducted over the last 30 years in Denmark, Japan and Germany, with broadly similar results. Sr-90 levels continued to decrease in babies born through the mid-1970s, but as the use of nuclear power spread worldwide, that trend flattened. Of particular note, a study conducted by the German section of the International Physicians for the Prevention of Nuclear War (winner of the 1985 Nobel Peace Prize) found ten times more strontium-90 in the teeth of children born after the 1986 Chernobyl nuclear disaster than in samples from 1983.

While radioactive strontium itself can be linked to several diseases, including leukemia and bone cancers, Sr-90, as mentioned above, is only one–though one of the most measurable–of many dangerous isotopes released into the environment by the normal, everyday operation of nuclear reactors, even without the catastrophic discharges that come with accidents and meltdowns. Tritium, along with radioactive variants of iodine, cesium and xenon (to name just a few), can often be detected at elevated levels in areas around nuclear facilities.

Epidemiological studies have shown higher risks of breast and prostate cancers for those living in US nuclear counties. But while the Environmental Protection Agency collects sporadic data on the presence of radioactive isotopes such as Sr-90, the exact locations of the sampling sites are not part of the data made available to the general public. Further, while “unusual” venting of radioactive vapor or the dumping of contaminated water from a nuclear plant has to be reported to the Nuclear Regulatory Commission (and even then, it is the event that is reported, not the exact composition of the discharge), the radio-isotopes that are introduced into the environment by the typical operation of a reactor meet with far less scrutiny. In the absence of better EPA data and more stringent NRC oversight, studies like the Baby Tooth Survey and its contemporary brethren are central to the public understanding of the dangers posed by the nuclear power industry.

June and Sr-90: busting out all over

As if to underscore the point, strontium-90 served as the marker for troubling developments on both sides of the Pacific just this June.

In Japan, TEPCO–still the official operator of Fukushima Daiichi–revealed it had found Sr-90 in groundwater surrounding the crippled nuclear plant at “very high” levels. Between December 2012 and May 2013, levels of strontium-90 increased over 100-fold, to 1,000 becquerels per liter–33 times the Japanese limit for the radioactive isotope.

The samples were taken less than 100 feet from the coast. From that point, reports say, the water usually flows out to the Pacific Ocean.

Beyond the concerns raised by the effects of the strontium-90 (and the dangerously high amounts of tritium detected along with it) when the radioactive contamination enters the food chain, the rising levels of Sr-90 likely indicate other serious problems at Fukushima. Most obviously, there is now little doubt that TEPCO has failed to contain contaminated water leaking from the damaged reactor buildings–contrary to the narrative preferred by company officials.

But skyrocketing levels of strontium-90 could also suggest that the isotope is still being produced–that nuclear fission is still occurring in one or more of the damaged reactor cores. Or even, perhaps, outside the reactors, as the corium (the term for the molten, lava-like nuclear fuel after a meltdown) in as many as three units is believed to have melted through the steel reactor containment and possibly eroded the concrete floor, as well.

An ocean away, in Washington state, radiological waste, some of which dates back to the manufacture of those first atom bombs, sits in aging storage tanks at the Hanford Nuclear Reservation–and some of those tanks are leaking.

In truth, tanks at Hanford, considered by many the United States’ most contaminated nuclear site, have been leaking for some time. But the high-level radioactive waste in some of the old, single-wall tanks had been transferred to newer, double-walled storage, which was supposed to provide better containment. On June 20, however, the US Department of Energy reported that workers at Hanford detected radioactive contamination–specifically Sr-90–outside one of the double-walled tanks, possibly suggesting a breach. The predominant radionuclides in the 850,000-gallon tank are reported to be strontium-90 and cesium-137.

The tank, along with hundreds of others, sits about five miles from the Columbia River, water source for much of the region. Once contamination leaks from the tanks, it mixes with ground water, and, in time, should make its way to the river. “I view this as a crisis,” said Tom Carpenter, executive director of the watchdog group Hanford Challenge. “These tanks are not supposed to fail for 50 years.”

Destroyer of worlds

In a 1965 interview, J. Robert Oppenheimer, the Manhattan Project’s science director who was in charge of the Los Alamos facility that developed the first atomic bombs, looked back twenty years to that July New Mexico morning:

We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, “Now I am become Death, the destroyer of worlds.” I suppose we all thought that, one way or another.

“We knew the world would not be the same.” Oppenheimer was most likely speaking figuratively, but, as it turns out, he also reported a literal truth. Before July 16, 1945, there was no strontium-90 or cesium-137 in the atmosphere–it simply did not exist in nature. But ever since that first atomic explosion, these anthropogenic radioactive isotopes have been part of earth’s every turn.

Strontium-90–like cesium-137 and a catalog of other hazardous byproducts of nuclear fission–takes a long time to decay. The detritus of past detonations and other nuclear disasters will be quite literally with us–in our water and soil, in our tissue and bone–for generations. These radioactive isotopes have already been linked to significant suffering, disease and death. Their danger was acknowledged by the United States when JFK signed the 1963 Test Ban Treaty. Now would be a good time to acknowledge the perspicacity of that president, phase out today’s largest contributors of atmospheric Sr-90–nuclear reactors–and let the sun set on this toxic metal’s life.
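Just how long “a long time” is can be made concrete with the standard exponential-decay formula. A minimal sketch (the half-lives used–roughly 28.8 years for Sr-90 and 30.2 years for Cs-137–are well-established physical constants; the time spans chosen are purely illustrative):

```python
# Fraction of a radioactive isotope remaining after a given time:
#   N(t) / N0 = 0.5 ** (t / half_life)

def fraction_remaining(years: float, half_life: float) -> float:
    """Exponential decay: fraction of the original isotope left after `years`."""
    return 0.5 ** (years / half_life)

SR90_HALF_LIFE = 28.8   # years
CS137_HALF_LIFE = 30.2  # years

for years in (30, 60, 90):
    sr = fraction_remaining(years, SR90_HALF_LIFE)
    cs = fraction_remaining(years, CS137_HALF_LIFE)
    print(f"After {years} years: Sr-90 {sr:.1%} left, Cs-137 {cs:.1%} left")
```

After 90 years, roughly a tenth of the original Sr-90 is still present–which is why “generations” is no exaggeration.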


A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer


Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the odds that were given for the 2011 tsunami–the flood at Oconee would trigger failures at all four reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.
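The linear, no-threshold model Dr. Sakiyama describes can be sketched in a few lines. This is a deliberate simplification for illustration only; the risk coefficient below is the ICRP’s commonly cited nominal figure of roughly 5.5 percent excess lifetime cancer risk per sievert, applied linearly with no safe floor:

```python
# Linear no-threshold (LNT) sketch: excess risk scales linearly with dose,
# and there is no dose below which the added risk is zero.

RISK_PER_SIEVERT = 0.055  # ICRP nominal cancer-risk coefficient (~5.5% per Sv)

def excess_cancer_risk(dose_sv: float) -> float:
    """Excess lifetime cancer risk attributed to a dose, under the LNT model."""
    return dose_sv * RISK_PER_SIEVERT

for dose_msv in (1, 20, 100):
    risk = excess_cancer_risk(dose_msv / 1000)
    print(f"{dose_msv:>4} mSv -> excess lifetime risk ~{risk:.4%}")
```

The point of the model is visible in the code: halving the dose halves the risk, but never reaches zero–there is no threshold term to exploit when setting “acceptable” limits.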

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously increased what it called acceptable exposure quite soon after the start of the Fukushima crisis, and, as global background radiation levels increase as a result of the disaster, it is feared this will ratchet up what is considered “safe” in the United States, as the US tends to discuss limits in terms of exposure beyond annual average background radiation. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant, radiation biologist Ian Fairlie, observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is lots of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what might have been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gasses, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add an additional 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137, or, to put it another way, Fukushima has already released more than 6,400 times as much radioactive cesium as the Hiroshima bomb.
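Keeping the becquerel prefixes straight is half the battle in comparing these figures. A small sketch normalizing the estimates quoted above to a common unit (the numbers are taken directly from the text; none are new measurements):

```python
TERA = 1e12  # 1 terabecquerel (TBq) in becquerels
PETA = 1e15  # 1 petabecquerel (PBq) in becquerels

# Figures quoted above, converted to becquerels
fukushima_low, fukushima_high = 64 * PETA, 114 * PETA  # Cs-137, first months
chernobyl = 84 * PETA                                  # Cs-137
hiroshima = 89 * TERA                                  # Cs-137
daily_ongoing = 240e6                                  # Bq of cesium per day, still leaking

print(f"Fukushima (first months): {fukushima_low/PETA:.0f}-{fukushima_high/PETA:.0f} PBq")
print(f"Chernobyl: {chernobyl/PETA:.0f} PBq = {chernobyl/TERA:,.0f} TBq")
print(f"Hiroshima: {hiroshima/PETA:.3f} PBq")
print(f"Ongoing leak, accumulated over a year: {daily_ongoing * 365 / TERA:.3f} TBq")
```

Normalizing this way shows why prefix slips are so easy to make: the quantities at issue span five orders of magnitude, from a bomb measured in terabecquerels to reactor accidents measured in petabecquerels.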

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” and that questions should start from the assumed necessity of atomic energy rather than from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it isn’t a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without a real temporary storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear power does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Oyster Creek Nuclear Alert: As Floodwaters Fall, More Questions Arise

Oyster Creek Nuclear Generating Station in pre-flood mode. (photo: NRCgov)

New Jersey’s Oyster Creek Nuclear Generating Station remains under an official Alert, a day-and-a-half after the US Nuclear Regulatory Commission declared the emergency classification due to flooding triggered by Hurricane Sandy. An Alert is the second category on the NRC’s four-point emergency scale. Neil Sheehan, a spokesman for the federal regulator, said that floodwaters around the plant’s water intake structure had receded to 5.7 feet at 2:15 PM EDT Tuesday, down from a high of 7.4 feet reached just after midnight.

Water above 6.5 to 7 feet was expected to compromise Oyster Creek’s capacity to cool its reactor and spent fuel pool, according to the NRC. An “Unusual Event,” the first level of emergency classification, was declared Monday afternoon when floodwaters climbed to 4.7 feet.

Though an emergency pump was brought in when water rose above 6.5 feet late Monday, the NRC and plant owner Exelon have been vague about whether it was needed. As of this writing, it is still not clear if Oyster Creek’s heat transfer system is functioning as designed.

As flooding continued and water intake pumps were threatened, plant operators also floated the idea that water levels in the spent fuel pool could be maintained with fire hoses. Outside observers, such as nuclear consultant Arnie Gundersen, suspected Oyster Creek might have accomplished this by repurposing its fire suppression system (and Reuters later reported the same), though, again, neither Exelon nor regulators have given details.

Whether the original intake system or some sort of contingency is being used, it appears the pumps are being powered by backup diesel generators. Oyster Creek, like the vast majority of southern New Jersey, lost grid power as Sandy moved inland Monday night. In the event of a site blackout, backup generators are required to provide power to cooling systems for the reactor–there is no such mandate, however, for spent fuel pools. Power for pool cooling is expected to come either from the grid or from the electricity generated by the plant’s own turbines.

As the NRC likes to remind anyone who will listen, Oyster Creek’s reactor was offline for fueling and maintenance. What regulators don’t add, however, is that the reactor still needs cooling for residual decay heat, and that the fuel pool likely contains more fuel and hotter fuel as a result of this procedure, which means it is even more at risk for overheating. And, perhaps most notably, with the reactor shut down, it is not producing the electricity that could be used to keep water circulating through the spent fuel pool.

If that sounds confusing, it is probably not by accident. Requests for more and more specific information (most notably by the nuclear watchdog site SimplyInfo) from Exelon and the NRC remain largely unanswered.

Oyster Creek was not the only nuclear power plant dealing with Sandy-related emergencies. As reported here yesterday, Nine Mile Point Unit 1 and Indian Point Unit 3–both in New York–each had to scram because of grid interruptions triggered by Monday’s superstorm. In addition, one of New Jersey’s Salem reactors shut down when four of six condenser circulators (water pumps that aid in heat transfer) failed “due to a combination of high river level and detritus from Hurricane Sandy’s transit.” Salem vented vapor from what are considered non-nuclear systems, though as noted often, that does not mean it is completely free of radioactive components. (Salem’s other reactor was offline for refueling.)

Limerick (PA) reactors 1 and 2, Millstone (CT) 3, and Vermont Yankee all reduced power output in response to Superstorm Sandy. The storm also caused large numbers of emergency warning sirens around both Oyster Creek and the Peach Bottom (PA) nuclear plant to fail.

If you thought all of these problems would cause nuclear industry representatives to lie low for a while, well, you’d be wrong:

“Our facilities’ ability to weather the strongest Atlantic tropical storm on record is due to rigorous precautions taken in advance of the storm,” Marvin Fertel, chief executive officer of the Nuclear Energy Institute, a Washington-based industry group, said yesterday in a statement.

Fertel went on to brag that, of the 34 reactors the institute said were in Sandy’s path, 24 survived the storm without incident.

Or, to look at it another way, during a single day, the heavily populated eastern coast of the United States saw multiple nuclear reactors experience problems. And that’s in the estimation of the nuclear industry’s top lobbyist.

Or, should we say, the underestimation? Of the ten reactors not in Fertel’s group of 24, seven were already offline, and the industry is not counting them. So, by Fertel’s math, Oyster Creek does not figure against what he considers success. Power reductions and failed emergency warning systems are also not factored in, it appears.

This storm–and the trouble it caused for America’s nuclear fleet–comes in the context of an 18-month battle to improve nuclear plant safety in the wake of the multiple meltdowns and continuing crisis at Japan’s Fukushima Daiichi nuclear facility. Many of the rules and safety upgrades proposed by a US post-Fukushima taskforce are directly applicable to problems resulting from Superstorm Sandy. Improvements to flood preparation, backup power regimes, spent fuel storage and emergency notification were all part of the taskforce report–all of which were theoretically accepted by the Nuclear Regulatory Commission. But nuclear industry pushback, and stonewalling, politicking and outright defiance by pro-industry commissioners, have severely slowed the execution of post-Fukushima lessons learned.

The acolytes of atom-splitting will no doubt point to the unprecedented nature of this massive hybrid storm, echoing the “who could have predicted” language heard from so many after the earthquake and tsunami that started the Fukushima disaster. Indeed, such language has already been used–though, granted, in a non-nuclear context–by Con Edison officials discussing massive power outages still afflicting New York City:

At a Consolidated Edison substation in Manhattan’s East Village, a gigantic wall of water defied elaborate planning and expectations, swamped underground electrical equipment, and left about 250,000 lower Manhattan customers without power.

Last year, the surge from Hurricane Irene reached 9.5 feet at the substation. ConEd figured it had that covered.

The utility also figured the infrastructure could handle a repeat of the highest surge on record for the area — 11 feet during a hurricane in 1821, according to the National Weather Service. After all, the substation was designed to withstand a surge of 12.5 feet.

With all the planning, and all the predictions, planning big was not big enough. Sandy went bigger — a surge of 14 feet.

“Nobody predicted it would be that high,” said ConEd spokesman Allan Drury.

In a decade that has seen most of the warmest years on record and some of the era’s worst storms, there needs to be some limit on such excuses. Nearly a million New York City residents (including this reporter) are expected to be without electricity through the end of the week. Residents in the outer boroughs and millions in New Jersey could be in the dark for far longer. Having a grid that simply survives a category 1 hurricane without a Fukushima-sized nuclear disaster is nothing to crow about.

The astronomical cost of restoring power to millions of consumers is real, as is the potential danger still posed by a number of crippled nuclear power plants. The price of preventing the current storm-related emergencies from getting worse is also not a trivial matter, nor are the radioactive isotopes vented with every emergency reactor scram. All of that should be part of the nuclear industry’s report card; all of that should raise eyebrows and questions the next time nuclear is touted as a clean, safe, affordable energy source for a climate change-challenged world.

UPDATE: The AP is reporting that the NRC has now lifted the emergency alert at Oyster Creek.

New Fukushima Video Shows Disorganized Response, Organized Deception

A frame from early in the newly released Fukushima video.

Tokyo Electric Power Company (TEPCO), the operator of the Fukushima Daiichi nuclear power plant when the Tohoku earthquake and tsunami struck last year, bowed to public and government pressure this week, releasing 150 hours of video recorded during the first days of the Fukushima crisis. Even with some faces obscured and two-thirds of the audio missing, the tapes clearly show a nuclear infrastructure wholly unprepared for the disaster, and an industry and government wholly determined to downplay that disaster’s severity:

Though incomplete, the footage from a concrete bunker at the plant confirms what many had long suspected: that the Tokyo Electric Power Company, the plant’s operator, knew from the early hours of the crisis that multiple meltdowns were likely despite its repeated attempts in the weeks that followed to deny such a probability.

It also suggests that the government, during one of the bleakest moments, ordered the company not to share information with the public, or even local officials trying to decide if more people should evacuate.

Above all, the videos depict mayhem at the plant, a lack of preparedness so profound that too few buses were on hand to carry workers away in the event of an evacuation. They also paint a close-up portrait of the man at the center of the crisis, Mr. Yoshida, who galvanizes his team of engineers as they defy explosions and fires — and sometimes battle their own superiors.

That summary is from New York Times Tokyo-based reporter Hiroko Tabuchi. The story she tells is compelling and terrifying, and focuses on the apparent heroism of Masao Yoshida, Fukushima’s chief manager when the crisis began, along with the far less estimable behavior of TEPCO and Japanese government officials. It is worth a couple of your monthly quota of clicks to read all the way through.

The story is but one take on the video, and I point this out not because I question Tabuchi’s reporting on its content, much of which is consistent with what is already known about the unholy alliance between the nuclear industry and the Japanese government, and about what those parties did to serve their own interests at the expense of the Japanese people (and many others across the northern hemisphere). Instead, I bring this up because I do not myself speak Japanese, and I am only allowed to view a 90-minute “highlight reel” and not the entire 150 hours of video, and so I am dependent on other reporters’ interpretations. And because neither TEPCO nor the Japanese government (which now essentially owns TEPCO) has yet proven to be completely open or honest on matters nuclear, the subtle differences in those interpretations matter.

Tabuchi took to Twitter to say how much she wanted to tell the story as “a tribute to Fukushima Daiichi chief Yoshida and the brave men on the ground who tried to save us.” But in a separate tweet, Tabuchi said she was “heartbroken” to discover her article was cut in half.

Editing is, of course, part of journalism. Trimming happens to many stories in many papers. But I had to raise an eyebrow when I saw a note at the bottom of Tabuchi’s piece that said Matthew Wald “contributed reporting from Washington.” I have previously been critical of Wald–a Times veteran, contributor to their Green blog, and often their go-to reporter on nuclear power–for stories that sometimes read like brochures from the Nuclear Energy Institute. Wald tends to perpetuate myths in line with the old “clean, safe, and too cheap to meter” saw, while reserving a much, uh, healthier (?) skepticism for nuclear power critics and renewable energy advocates.

There is, of course, no way to know what Wald’s contributions (or redactions) were in this case, and it is doubtful any of the parties involved would tell us, but what particularly stokes my curiosity is this paragraph:

Despite the close-up view of the disaster, the videos — which also capture teleconferences with executives in Tokyo — leave many questions unresolved, in good part because only 50 of 150 hours include audio. The company blamed technical problems for the lack of audio.

TEPCO might blame technical problems, but reports from other news services seem to leave little doubt that the general belief is that the audio has been withheld–or in some cases most obviously obscured–by TEPCO. The BBC’s Mariko Oi saw it this way:

Tepco has bowed to pressure to release 150 hours of teleconferencing footage but the tape was heavily edited and mostly muted to “protect employees’ privacy”.

. . . .

Tepco is again under criticism for not releasing the full recordings and has been asked if it was removing more than employees’ names and phone numbers.

And Mari Yamaguchi of the Associated Press reported even more directly about TEPCO’s intent:

Japan’s former prime minister criticized the tsunami-hit nuclear plant’s operator Wednesday for heavily editing the limited video coverage it released of the disaster, including a portion in which his emotional speech to utility executives and workers was silenced.

Naoto Kan called for Tokyo Electric Power Co. to release all of its video coverage, beyond the first five days. Two-thirds of the 150 hours of videos it released Monday are without sound, including one segment showing Kan’s visit to the utility’s headquarters on March 15 last year, four days after a tsunami critically damaged three reactors at the Fukushima Dai-ichi power plant.

Many people’s faces, except for the plant chief and top executives in Tokyo, are obscured in the videos and frequent beeps mask voices and other sound.

The AP story also points out that the released video arbitrarily ends at midnight on March 15–and though it is not known how much more tape exists, it appears clear that TEPCO has held some substantial portion back. After five days, the Fukushima crisis was far from over, after all (as it is still far from over), and the recordings end amidst some of the disaster’s most critical events.

But the New York Times omits all of this, leaving TEPCO’s Rose Mary Woods-like excuse to stand as the innocent truth.

That’s a shame, because the way you read this story changes when you look at some of the horrific revelations keeping in mind that this is only the part TEPCO decided it could let you see. Here are just a few highlights. . . or lowlights:

  • Plant managers and TEPCO officials were aware from the earliest hours of the crisis that they were likely facing multiple meltdowns.
  • Japanese government officials withheld information–and ordered TEPCO to withhold information–on radiation levels that could have helped untold numbers of civilians reduce their exposure.
  • Despite warnings years prior that such natural disasters were possible in the region, Fukushima operators had no plan to deal with the damage and loss of power caused by the quake and tsunami.
  • TEPCO did not even have the infrastructure or procedures in place to evacuate its own employees from an imperiled facility.
  • Plant officials were–from the earliest days–as worried about the spent fuel pools as they were about the reactors. Those on the scene feared that most of the pools at Daiichi, not just the one at reactor four, were facing loss of coolant and the fires and massive radiation leaks that would follow, though publicly they said none of the pools were a danger at the time.

And there is more about the dire conditions for plant workers, the lack of food or water, the high levels of radiation exposure, and even a point where employees had to pool their cash to buy water and gasoline. And, as noted above, that’s just the part TEPCO has deemed acceptable for release.

Above all, though–beyond the discrepancies in reporting, beyond the moral failings of TEPCO and government officials, beyond the heroism of those at the crippled facility–what the new Fukushima tapes reveal is what those who watch the nuclear industry have mostly known all along. Nuclear power is dangerous–the radiation, the complexity of the system, the waste, the reliance on everything going right, and the corrupt conspiracy between industry and government saddle this form of energy production with unacceptable risks. The video now available might shed some light on how things at Fukushima went horribly wrong, but the entire world already knows plenty of who, what, where and when. We all know that things at Fukushima did go horribly wrong, and so many know that they must suffer because of it.

Made in Japan? Fukushima Crisis Is Nuclear, Not Cultural

(photo: Steve Snodgrass)

Since the release of the Fukushima Nuclear Accident Independent Committee’s official report last week, much has been made of how it implicates Japanese culture as one of the root causes of the crisis. The committee’s chairman, Dr. Kiyoshi Kurokawa, makes the accusation quite plainly in the opening paragraphs of the executive summary [PDF]:

What must be admitted – very painfully – is that this was a disaster “Made in Japan.” Its fundamental causes are to be found in the ingrained conventions of Japanese culture: our reflexive obedience; our reluctance to question authority; our devotion to ‘sticking with the program’; our groupism; and our insularity.

That this apparently critical self-examination was seized upon by much of the western media’s coverage of the report probably does not come as a surprise–especially when you consider that this revelation falls within the first 300 words of an 88-page document. Cultural stereotypes and incomplete reads are hardly new to establishment reportage. What might come as a shock, however, is that this painful admission is only made in the English-language version of the document, and only in the chairman’s introduction is the “made in Japan” conclusion drawn so specifically.

What replaces the cultural critique in the Japanese edition and in the body of the English summary is a ringing indictment of the cozy relationship between the Japanese nuclear industry and the government agencies that were supposed to regulate it. This “regulatory capture,” as the report details, is certainly central to the committee’s findings and crucial to understanding how the Fukushima disaster is a manmade catastrophe, but it is not unique to the culture of Japan.

Indeed, observers of the United States will recognize this lax regulatory construct as part-and-parcel of problems that threaten the safety and health of its citizenry, be it in the nuclear sector, the energy sector as a whole, or across a wide variety of officially regulated industries.

No protection

The Japanese Diet’s Fukushima report includes a healthy dose of displeasure with the close ties between government regulators and the nuclear industry they were supposed to monitor. The closed, insular nature of nuclear oversight that might be attributed to Japanese culture by a superficial read is, in fact, a product of the universally familiar “revolving door” that sees industry insiders taking turns as government bureaucrats, and regulatory staff “graduating” to well-compensated positions in the private sector.

Mariko Oi, a reporter at the BBC’s Tokyo bureau, described the situation this way when discussing the Fukushima report on the World Service:

When there was a whistleblower, the first call that the government or the ministry made was to TEPCO, saying, “Hey, you’ve got a whistleblower,” instead of “Hey, you’ve got a problem at the nuclear reactor.”

A disturbing betrayal of accountability in any context, it is especially troubling with the repercussions of the Fukushima disaster still metastasizing. And it is also ominously familiar.

Look, for example, just across the Pacific:

[San Onofre Nuclear Generating Station] was chastised two years ago by the U.S. Nuclear Regulatory Commission for creating an atmosphere in which employees fear retaliation if they report safety concerns.

. . . .

Edward Bussey, a former health physics technician at the plant, sued Edison in state court after he was fired in 2006 under what he said were trumped-up charges that he had falsified initials on logs documenting that certain materials had been checked for radiation. Bussey contended that he was really fired in retaliation for complaining about safety concerns to his supervisors and the NRC.

San Onofre–SONGS, if you will–has been offline since January when a radioactive steam leak led to the discovery of severely degraded copper tubing in both of the plant’s existing reactors. But here’s the real kicker: whistleblower suits at SONGS, like the one from Mr. Bussey, have routinely been summarily dismissed thanks to a little known legal loophole:

San Onofre is majority owned and operated by Southern California Edison, a private company, but it sits on land leased from the Camp Pendleton Marine Corps base.

That puts the plant in a so-called federal enclave, where courts have held that many California laws, including labor laws intended to protect whistle-blowers, do not apply.

Lawsuits filed in state court by San Onofre workers who claimed that they were fired or retaliated against for reporting safety concerns, sexual harassment and other issues have been tossed out because of the plant’s location.

The Los Angeles Times cites examples dating back to the construction of San Onofre where personnel who complained about safety or work conditions were terminated and left without many of the legal options normally afforded most California citizens. The history of SONGS is liberally peppered with accidents and safety breaches–and the lies and cover-ups from its owner-operators that go with them. Considering that San Onofre employees are regularly punished for exposing problems and have fewer whistleblower protections, is it at all surprising that SONGS is reported to have the worst safety record of all US nuclear plants?

If San Onofre’s track record isn’t evidence enough of the dangers of weak regulation, the findings and conclusions of the latest Fukushima report make it crystal clear: “safety culture” is not undermined by Japanese culture so much as it is by the more international culture of corruption born of the incestuous relationship between industry and regulators.

It’s a nuclear thing…

But the corrupt culture–be it national or universal–is itself a bit of a dodge. As noted by the Financial Times, the Japanese and their regulatory structure have managed to operate the technologically complex Shinkansen bullet trains since 1964 without a single derailment or fatal collision.

As the Diet’s report makes abundantly clear–far more clear than any talk about Japanese culture–the multiple failures at and around Fukushima Daiichi were directly related to the design of the reactors and to fatal flaws inherent in nuclear power generation.

Return for a moment to something discussed here last summer, The Light Water Paradox: “In order to safely generate a steady stream of electricity, a light water reactor needs a steady stream of electricity.” As previously noted, this is not some perpetual motion riddle–all but one of Japan’s commercial nuclear reactors and every operating reactor in the United States is of a design that requires water to be actively pumped through the reactor containment in order to keep the radioactive fuel cool enough to prevent a string of catastrophes, from hydrogen explosions and cladding fires, to core meltdowns and melt-throughs.

Most of the multiple calamities to befall Fukushima Daiichi have their roots in the paradox. As many have observed and the latest Japanese report reiterates, the Tohoku earthquake caused breaches in reactor containment and cooling structures, and damaged all of Fukushima’s electrical systems, save the diesel backup generators, which were in turn taken out by the tsunami that followed the quake. Meeting the demands of the paradox–circulating coolant in a contained system–was severely compromised after the quake, and was rendered completely impossible after the tsunami. Given Japan’s seismic history, and the need of any light water reactor for massive amounts of water, Fukushima wouldn’t really have been a surprise even if scientists hadn’t been telling plant operators and Japanese regulators about these very problems for the last two decades.

Back at San Onofre, US regulators disclosed Thursday that the damage to the metal tubes that circulate radioactive water between the reactor and the steam turbines (in other words, part of the system that takes heat away from the core) was far more extensive than had previously been disclosed by plant operators:

[Each of San Onofre’s steam generators has] 9,727 U-shaped tubes inside, each three-quarters of an inch in diameter.

The alloy tubes represent a critical safety barrier — if one breaks, there is the potential that radioactivity could escape into the atmosphere. Also, serious leaks can drain protective cooling water from a reactor.

Gradual wear is common in such tubing, but the rate of erosion at San Onofre startled officials since the equipment is relatively new. The generators were replaced in a $670 million overhaul and began operating in April 2010 in Unit 2 and February 2011 in Unit 3.

Tubes have to be taken out of service if 35 percent — roughly a third — of the wall wears away, and each of the four generators at the plant is designed to operate with a maximum of 778 retired tubes.

In one troubled generator in Unit 3, 420 tubes have been retired. The records show another 197 tubes in that generator have between 20 percent and 34 percent wear, meaning they are close to reaching the point when they would be at risk of breaking.

More than 500 others in that generator have between 10 percent and 19 percent wear in the tube wall.

“The new data reveal that there are thousands of damaged tubes in both Units 2 and 3, raising serious questions whether either unit should ever be restarted,” said Daniel Hirsch, a lecturer on nuclear policy at the University of California, Santa Cruz, who is a critic of the industry. “The problem is vastly larger than has been disclosed to date.”

And if anything, the Nuclear Regulatory Commission is underplaying the problem. A report from Fairewinds Associates, also released this week, unfavorably compared San Onofre’s situation with similar problems at other facilities:

[SONGS] has plugged 3.7 times as many steam generator tubes than the combined total of the entire number of plugged replacement steam generator tubes at all the other nuclear power plants in the US.

The report also explains that eight of the tubes failed a “pressure test” at San Onofre, while the same test at other facilities had never triggered any more than one tube breach. Fairewinds goes on to note that both units at San Onofre are equally precarious, and that neither can be restarted with any real promise of safe operation.

And while the rapid degeneration of the tubing might be peculiar to San Onofre, the dangers inherent in a system that requires constant power for constant cooling–lest a long list of possible problems triggers a toxic crisis–are evident across the entire US nuclear fleet. Cracked containment buildings, coolant leaks, transformer fires, power outages, and a vast catalogue of human errors fill the NRC’s event reports practically every month of every year for the past 40 years. To put it simply, with nuclear power, too much can go wrong when everything has to go right.

And this is to say nothing of the dangers that come with nuclear waste storage. As with the reactors, the spent fuel pools that dot the grounds of almost every nuclear plant in America and Japan require a consistent and constantly circulating water supply to keep them from overheating (which would result in many of the same disastrous outcomes seen with damaged reactors). At Fukushima, one of the spent fuel pools is, at any given point, as much of a concern as the severely damaged reactor cores.

Ions and tigers and bears, oh my!

Even with the latest findings, however, Japanese Prime Minister Yoshihiko Noda pushed ahead with the restart of the precariously situated and similarly flawed nuclear reactor complex at Oi. It is as if the PM and the nuclear industry feared that Japan’s surviving another summer without nuclear-generated electricity would demonstrate once and for all that the country had no reason to trade so much of its health and safety for an unnecessary return.

But the people of Japan seem to see it differently. Tens of thousands have turned out to demonstrate against their nation’s slide back into this dangerous culture of corruption. (Remember, the Oi restart comes without any safety upgrades made in response to the Fukushima disaster.)

And maybe there’s where cultural distinctions can be drawn. In Japan, the citizenry–especially women–are not demonstrating “reflexive obedience,” instead, they are demonstrating. In the United States, where 23 nuclear reactors are of the same design as Fukushima Daiichi, and 184 million people live within 50 miles of a nuclear power plant, when the chairman of the Nuclear Regulatory Commission suggested requiring some modest upgrades as a response to the Fukushima disaster, the nuclear industry got its henchmen on the NRC and in Congress to push him out. . . with little public outcry.

Still, the BBC’s Mariko Oi lamented on the day the Fukushima report was released that Japanese media was paying more attention to the birth of a giant panda at a Tokyo zoo. That sort of response would seem all too familiar to any consumer of American media.

That baby panda, it should be noted, has since died. The radioactive fallout from Fukushima, however, lingers, and the crisis at Daiichi is far from over. The threat to global health and safety that is unique to nuclear power lives on.

Fukushima Nuclear Disaster “Man-Made” Reports Japanese Panel; Quake Damaged Plant Before Tsunami

Aerial view of the Oi Nuclear Power Plant, Fukui Prefecture, Japan. (photo: Japan Ministry of Land, Infrastructure and Transport via Wikipedia)

The massive disaster at the Fukushima Daiichi nuclear facility that began with the March 11, 2011 Tohoku earthquake and tsunami could have been prevented and was likely made worse by the response of government officials and plant owners, so says a lengthy report released today by the Japanese Diet (their parliament).

The official report of The Fukushima Nuclear Accident Independent Investigation Committee [PDF] harshly criticizes the Japanese nuclear industry for avoiding safety upgrades and disaster plans that could have mitigated much of what went wrong after a massive quake struck the northeast of Japan last year. The account also includes direct evidence that Japanese regulatory agencies conspired with TEPCO (Fukushima’s owner-operator) to help them forestall improvements and evade scrutiny:

The TEPCO Fukushima Nuclear Power Plant accident was the result of collusion between the government, the regulators and TEPCO, and the lack of governance by said parties. They effectively betrayed the nation’s right to be safe from nuclear accidents.

. . . .

We found evidence that the regulatory agencies would explicitly ask about the operators’ intentions whenever a new regulation was to be implemented. For example, NISA informed the operators that they did not need to consider a possible station blackout (SBO) because the probability was small and other measures were in place. It then asked the operators to write a report that would give the appropriate rationale for why this consideration was unnecessary.

The report also pointed to Japanese cultural conventions, namely the reluctance to question authority–a common refrain in many post-Fukushima analyses.

But perhaps most damning, and most important to the future of Japan and to the future of nuclear power worldwide, is the Investigation’s finding that parts of the containment and cooling systems at Fukushima Daiichi were almost certainly damaged by the earthquake before the mammoth tsunami caused additional destruction:

We conclude that TEPCO was too quick to cite the tsunami as the cause of the nuclear accident and deny that the earthquake caused any damage.

. . . .

[I]t is impossible to limit the direct cause of the accident to the tsunami without substantive evidence. The Commission believes that this is an attempt to avoid responsibility by putting all the blame on the unexpected (the tsunami), as they wrote in their midterm report, and not on the more foreseeable earthquake.

Through our investigation, we have verified that the people involved were aware of the risk from both earthquakes and tsunami. Further, the damage to Unit 1 was caused not only by the tsunami but also by the earthquake, a conclusion made after considering the facts that: 1) the largest tremor hit after the automatic shutdown (SCRAM); 2) JNES confirmed the possibility of a small-scale LOCA (loss of coolant accident); 3) the Unit 1 operators were concerned about leakage of coolant from the valve, and 4) the safety relief valve (SR) was not operating.

Additionally, there were two causes for the loss of external power, both earthquake-related: there was no diversity or independence in the earthquake-resistant external power systems, and the Shin-Fukushima transformer station was not earthquake resistant.

As has been discussed here many times, the nuclear industry and its boosters in government like to point to the “who could have possibly imagined,” “one-two punch” scenario of quake and tsunami to both vouch for the safety of other nuclear facilities and counter any call for reexamination and upgrades of existing safety systems. Fukushima, however, has always proved the catastrophic case study that actually countered this argument–and now there is an exhaustive study to buttress the point.

First, both the quake and the tsunami were far from unpredictable. The chances of each–as well as the magnitude–were very much part of predictions made by scientists and government bureaucrats. There is documentation that Japanese regulators knew and informed their nuclear industry of these potential disasters, but then looked the other way or actively aided the cause as plant operators consistently avoided improving structures, safety systems and accident protocols.

Second, even if there had not been a tsunami, Fukushima Daiichi would have still been a disaster. While the crisis was no doubt exacerbated by the loss of the diesel generators and the influx of seawater, the evidence continues to mount that reactor containment was breached and cooling systems were damaged by the earthquake first. Further, it was the earthquake that damaged all the electrical systems and backups aside from the diesel generators, and there is no guarantee that all generators would have worked flawlessly for their projected life-spans, that the other external and internal power systems could have been restored quickly, or that enough additional portable power could have been trucked in to the facility in time to prevent further damage. In fact, much points to less than optimal resolution of all of these problems.

To repeat, there was loss of external power, loss of coolant, containment breach, and release of radiation after the quake, but before the tsunami hit the Fukushima nuclear plant.

And now for the bad news. . . .

And yet, as harsh as this new report is (and it is even more critical than was expected, which is actually saying something), on first reading, it still appears to pull a punch.

Though the failure of the nuclear reactors and their safety systems is now even further documented in this report, its focus on industry obstruction and government collusion continues in some ways to perpetuate the “culture of safety” myth. By labeling the Fukushima disaster as “Made in Japan,” “manmade” and “preventable,” the panel–as we are fond of saying here–assumes a can opener. By talking up all that government and industry did wrong in advance of March 11, 2011, by critiquing all the lies and crossed signals after the earthquake and tsunami, and by recommending new protocols and upgrades, the Japanese report fiats a best-case scenario for a technology that has consistently proven that no such perfect plan exists.

The facts were all there before 3/11/11, and all the revelations since just add to the atomic pile. Nuclear fission is a process that has to go flawlessly to consistently provide safe and economical electrical power–but the process is too complex, and relies on too many parts, too many people and too volatile a fuel for that to ever really happen. Add in the costs and hazards of uranium mining, transport, fuel milling, and waste storage, and nuclear again proves itself to be dirty, dangerous, and disgustingly expensive.

* * *

And, as if to put an exclamation point at the end of the Diet’s report (and this column), the Japanese government moved this week to restart the nuclear plant at Oi, bringing the No. 3 reactor online just hours before the release of the new Fukushima findings. The Oi facility rests on a fault line, and seismologists, nuclear experts and activists have warned that this facility is at risk much in the way Fukushima Daiichi proved to be.

Most of Japan’s reactors were taken offline following the Tohoku quake, with the last of them–the Oi plant–shut down earlier this year. In the wake of the disaster, Japan’s then-Prime Minister, Naoto Kan, suggested that it might be time for his country to turn away from nuclear power. Demonstrators across Japan seemed to agree and urged Kansai Electric Power Company and current Prime Minister Yoshihiko Noda to delay the restart of Oi. But the government seemed to be hurrying to get Oi back up, despite many questions and several technical glitches.

Noda insists the rush is because of the need for electricity during the hot summer months, but Japan managed surprisingly well last summer (when more of the country’s infrastructure was still damaged from the quake and tsunami) with better conservation and efficiency measures. Perhaps release of this new report provides a more plausible explanation for the apparent urgency.

Imagine a Nuclear-Free California (You Don’t Have To, It’s Already Here)

We welcome our salp overlords. (A chain of salp in the Red Sea; photo: Lars Plougman via Wikipedia)

California has two nuclear power plants. San Onofre, between Los Angeles and San Diego, has been offline for months as everyone tries to find an excuse for the alarmingly rapid wear on new reactor tubing. (Being shut down, however, did not prevent a fire from breaking out this week when a pipe ruptured and released radioactive steam.)

But as of Thursday, Diablo Canyon, the nuclear plant to the north, is also offline–thanks to. . . uh, salp?

Yes, salp–those loveable, gelatinous, jellyfish-like, plankton-eating sea creatures that multiply like, well, salp–have swarmed Diablo Canyon’s water intake system. D-Can draws in billions of gallons of seawater every day to cool its reactors, and with all that salp clogging the intake pipes, the plant could no longer operate safely.

That may sound like a freak event, but it isn’t. San Onofre had to shut down in 2005 to clear out 11,000 pounds of anchovies that had the bad luck of swimming too close to that plant’s intake filters. . . and in 2004, it shut down, too, but that time it was due to 14,000 pounds of sardines.

And just last year, actual jellyfish (sorry, salps) brought down Florida Power & Light’s St. Lucie nuclear power station. Jellyfish have also previously crippled nuclear facilities in the UK, Israel and Japan.

But back to California, where without nuclear power, the state is heading for a disaster of biblical proportions–we’re talking human sacrifice, dogs and cats living together, mass hysteria!

Actually, no. What will happen is that Pacific Gas & Electric, the owner of Diablo Canyon, and Southern California Edison, San Onofre’s operator, will have to buy electricity (or continue to buy electricity) in order to deliver what they are obligated to deliver. That’s no fun for the big utilities, and maybe it looks biblical to the bean counters, but it is not an energy apocalypse.

Of course, instead of throwing millions after billions to buy surplus electricity elsewhere while also paying to staff, examine and repair its dormant, ancient nuclear facilities, power companies could try to invest more in 21st Century renewable alternatives.

And maybe that would happen if the market were actually, you know, a market. But with tax breaks, loan guarantees, and liability caps, the industry has little motivation to make sound financial or environmental decisions.

But there’s no time like the present to start. And right now, in California, that present is nuclear-free.

A little bit pregnant?

On Thursday, NPR’s Richard Harris delivered a report that regurgitated the nuclear industry’s latest message morph–once “clean, safe, and too cheap to meter,” the 21st Century PR spin has nuclear as the climate-friendly energy option.

The radio piece is ostensibly about how the world’s industrialized nations are failing to meet their climate goals–and this is true (and this is a problem). But Harris does the world and the climate cause no favors when he lazily posits: “Nuclear power produces very little carbon dioxide. . . .”

What does Harris mean by “nuclear power produces very little carbon dioxide”? Is that supposed to be a hedge? If you are isolating the atomic pile generating heat to boil water inside a closed system, then you might as well say “no CO2,” but if you are honest and take into account the whole lifecycle of nuclear fuel–from mining and refining through transportation and storage–then nuclear power produces a prodigious amount of greenhouse gases. Which is it, Richard?

Probably just an oversight

The Washington Post published a self-serving letter to the editor supporting a recent pro-nuclear editorial, but neglected to disclose that the letter was written by the current vice president and president-elect–and sitting member of the board of directors–of the unabashedly pro-nuclear American Nuclear Society.

If only Nixon had apologized!

Fukushima Governor Yuhei Sato apologized Wednesday for prefectural officials who deleted records on the spread of radioactive fallout immediately following the start of the Daiichi nuclear crisis in March of 2011. The data from the country’s System for Prediction of Environmental Emergency Dose Information (SPEEDI) could have better informed citizens on when and where to evacuate during the first days after the Tohoku quake and tsunami destroyed safety systems at the Fukushima Daiichi nuclear power plant, and could have also given those trying to piece together what happened inside the reactors important forensic evidence.

At a news conference, Sato said, “A big problem lies in the fact that we failed to fully share the information soon after the nuclear disaster broke out.”

Well, yeah, that–and that you erased it.

Not to worry though, the government “reprimanded” its supervising officials and also “issued strong warnings” to the two government employees that actually did the deleting. So, citizens of Northern Japan, we’re good?

“Let’s Eat Cesium Beef”

That is (as translated by EXSKF) the name of an event in Iwate, Japan designed to encourage people to eat local beef known to be contaminated with radioactive cesium from Fukushima’s fallout.

No, this did not appear in a Japanese version of The Onion (Tamanegi?); this is a real event as reported by Kyodo News in a series called “New Happiness in Japan.” Apparently, happiness is knowing you’re only poisoning your children a little bit. . . because there were kids at this thing.

The event was, uh, cooked up by the head of a meat-packing company to show a group of his regular customers–including young couples with kids–that beef containing radioactive cesium, but at levels lower than the provisional safety limits, still tasted OK.

According to the source of the translation, this story has people all over Japan shaking their heads, wondering what this meat packer could have been thinking. But there have been several stories over the past year documenting even more official Japanese government efforts to get citizens to consume agricultural products from Fukushima and surrounding regions.

Imagine a nuclear-free Japan

Soon, you won’t have to imagine that, either. The last of Japan’s 50 commercial reactors still online will soon shut down.

Wait? Fifty? Wasn’t it 54? Well, earlier this month, Japan removed the four damaged reactors at Fukushima Daiichi from their official list of the country’s commercial reactors.

Probably wise.

Oh, and, notice, also no mass hysteria. The radiation that has contaminated air, water, and land might have many Japanese very worried, but the country has managed to handle the reduced electrical generating capacity remarkably well. They did this thing called “conservation.” Been doing it for over a year now. Think of all the dogs and cats that have been spared. . . not to mention the salp.

Something Fishy: CRS Report Downplays Fukushima’s Effect on US Marine Environment

(photo: JanneM)

Late Thursday, the United States Coast Guard reported that they had successfully scuttled the Ryou-Un Maru, the Japanese “Ghost Ship” that had drifted into US waters after being torn from its moorings by the tsunami that followed the Tohoku earthquake over a year ago. The 200-foot fishing trawler, which was reportedly headed for scrap before it was swept away, was seen as potentially dangerous as it drifted near busy shipping lanes.

Coincidentally, the “disappearing” of the Ghost Ship came during the same week the Congressional Research Service (CRS) released its report on the effects of the Fukushima Daiichi nuclear disaster on the US marine environment, and, frankly, the metaphor couldn’t be more perfect. The Ryou-Un Maru is now resting at the bottom of the ocean–literally nothing more to see there, thanks to a few rounds from a 25mm Coast Guard gun–and the CRS hopes to dispatch fears of the radioactive contamination of US waters and seafood with the same alacrity.

But while the Ghost Ship was not considered a major ecological threat (though it did go down with around 2,000 gallons of diesel fuel in its tanks), the US government acknowledges that this “good luck ship” (a rough translation of its name) is an early taste of the estimated 1.5 million tons of tsunami debris expected to hit North American shores over the next two or three years. Similarly, the CRS report (titled Effects of Radiation from Fukushima Dai-ichi on the U.S. Marine Environment [PDF]) adopts an overall tone of “no worries here–it’s all under control,” but a closer reading reveals hints of “more to come.”

Indeed, the report feels as if it were put through a political rinse cycle, limited both in the strength of its language and the scope of its investigation. This tension is evident right from the start–take, for example, these three paragraphs from the report’s executive summary:

Both ocean currents and atmospheric winds have the potential to transport radiation over and into marine waters under U.S. jurisdiction. It is unknown whether marine organisms that migrate through or near Japanese waters to locations where they might subsequently be harvested by U.S. fishermen (possibly some albacore tuna or salmon in the North Pacific) might have been exposed to radiation in or near Japanese waters, or might have consumed prey with accumulated radioactive contaminants.

High levels of radioactive iodine-131 (with a half-life of about 8 days), cesium-137 (with a half-life of about 30 years), and cesium-134 (with a half-life of about 2 years) were measured in seawater adjacent to the Fukushima Dai-ichi site after the March 2011 events. EPA rainfall monitors in California, Idaho, and Minnesota detected trace amounts of radioactive iodine, cesium, and tellurium consistent with the Japanese nuclear incident, at concentrations below any level of concern. It is uncertain how precipitation of radioactive elements from the atmosphere may have affected radiation levels in the marine environment.

Scientists have stated that radiation in the ocean very quickly becomes diluted and would not be a problem beyond the coast of Japan. The same is true of radiation carried by winds. Barring another unanticipated release, radioactive contaminants from Fukushima Dai-ichi should be sufficiently dispersed over time that they will not prove to be a serious health threat elsewhere, unless they bioaccumulate in migratory fish or find their way directly to another part of the world through food or other commercial products.
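The half-lives quoted in that summary translate directly into the fraction of each isotope remaining after any elapsed time, via the standard exponential decay law. A minimal sketch (illustrative only; the function name is mine, and the half-lives are the round figures quoted in the report):

```python
def remaining_fraction(half_life_days: float, elapsed_days: float) -> float:
    """Fraction of a radioisotope remaining after `elapsed_days`,
    per the standard decay law N/N0 = 0.5 ** (t / t_half)."""
    return 0.5 ** (elapsed_days / half_life_days)

# Half-lives as quoted in the CRS executive summary
IODINE_131_DAYS = 8.0            # about 8 days
CESIUM_134_DAYS = 2.0 * 365.0    # about 2 years
CESIUM_137_DAYS = 30.0 * 365.0   # about 30 years

one_year = 365.0
print(remaining_fraction(IODINE_131_DAYS, one_year))   # effectively zero
print(remaining_fraction(CESIUM_134_DAYS, one_year))   # roughly 0.71 remains
print(remaining_fraction(CESIUM_137_DAYS, one_year))   # roughly 0.98 remains
```

The arithmetic is the crux of the hedging: iodine-131 vanishes within weeks, but the cesium isotopes persist for years to decades, which is why dispersion and bioaccumulation, not decay, determine their ultimate fate.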

Winds and currents have “the potential” to transport radiation into US waters? Winds–quite measurably–already have, and computer models show that currents, over the next couple of years, most certainly will.

Are there concentrations of radioisotopes that are “below concern”? No reputable scientist would make such a statement. And if monitors in the continental United States detected radioactive iodine, cesium and tellurium in March 2011, then why did they stop the monitoring (or at least stop reporting it) by June?

The third paragraph, however, wins the double-take prize. Radiation would not be a problem beyond the coast? Fish caught hundreds of miles away would beg to differ. “Barring another unanticipated release. . . ?” Over the now almost 13 months since the Fukushima crisis began, there have been a series of releases into the air and into the ocean–some planned, some perhaps unanticipated at the time–but overall, the pattern is clear: radioactivity continues to enter the environment at unprecedented levels.

And radioactive contaminants “should be sufficiently dispersed over time, unless they bioaccumulate?” Unless? Bioaccumulation is not some crazy, unobserved hypothesis; it is a documented biological process. Bioaccumulation will happen–it will happen in migratory fish, and it will happen as under-policed food and commercial products (not to mention that pesky debris) make their way around the globe.

Maybe that is supposed to be read by inquiring minds as the report’s “please ignore the man behind the curtain” moment–an intellectual out clause disguised as an authoritative analgesic–but there is no escaping the intent. Though filled with caveats and counterfactuals, the report is clearly meant to serve as a sop to those alarmed by the spreading ecological catastrophe posed by the ongoing Fukushima disaster.

The devil is in the details–the dangers are in the data

Beyond the wiggle words, perhaps the most damning indictment of the CRS marine radiation report can be found in the footnotes–or, more pointedly, in the dates of the footnotes. Though this report was released over a year after the Tohoku earthquake and tsunami triggered the Fukushima nightmare, the CRS bases the preponderance of its findings on information generated during the disaster’s first month. In fact, of the document’s 29 footnotes, only a handful date from after May 2011–one of those points to a CNN report (authoritative!), one to a status update on the Fukushima reactor structures, one confirms the value of Japanese seafood imports, three are items tracking the tsunami debris, and one directs readers to a government page on FDA radiation screening, the pertinent part of which was last updated on March 28 of last year.

Most crucially, the parts of the CRS paper that downplay the amounts of radiation measured by domestic US sensors all cite data collected within the first few weeks of the crisis. The point about radioisotopes being “below any level of concern” comes from an EPA news release dated March 22, 2011–eleven days after the earthquake, only six days after the last reported reactor explosion, and well before so many radioactive releases into the air and ocean. It is like taking reports of only minor flooding from two hours after Hurricane Katrina passed over New Orleans, and using them as the standard for levee repair and gulf disaster planning (perhaps not the best example, as many have critiqued levee repairs for their failure to incorporate all the lessons learned from Katrina).

It now being April of 2012, much more information is available, and clearly any report that expects to be called serious should have included at least some of it.

By October of last year, scientists were already doubling their estimates of the radiation pushed into the atmosphere by the Daiichi reactors, and in early November, as reported here, France’s Institute for Radiological Protection and Nuclear Safety issued a report showing the amount of cesium-137 released into the ocean was 30 times greater than what was stated by TEPCO in May. Shockingly, the Congressional Research Service does not reference this report.

Or take the early March 2012 revelation that seaweed samples collected off the coast of southern California show levels of radioactive iodine-131 some 500 percent higher than those from anywhere else in the US or Canada. It should be noted that this is the result of airborne fallout–the samples were taken in mid-to-late March 2011, much too soon for water-borne contamination to have reached that area–and so serves to confirm models that showed a plume of radioactive fallout with the greatest contact in central and southern California. (Again, this specific report was released a month before the CRS report, but the data it uses were collected over a year ago.)

Then there are the food samples taken around Japan over the course of the last year showing freshwater and sea fish–some caught over 200 kilometers from Fukushima–with radiation levels topping 100 becquerels per kilogram (one topping 600 Bq/kg).
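Screening such samples against a regulatory ceiling is simple arithmetic. A quick sketch (the sample names and readings below are hypothetical stand-ins echoing the figures in the text, not actual survey data; 100 Bq/kg is Japan’s general-food cesium limit as of April 2012):

```python
# Hypothetical readings in Bq/kg of radioactive cesium -- stand-ins for
# the kind of survey results described above, not real measurements.
samples = {
    "freshwater fish": 604.0,
    "sea fish (200 km out)": 112.0,
    "coastal fish": 98.0,
}

LIMIT_BQ_PER_KG = 100.0  # Japan's limit for general foods from April 2012

# Flag every sample that exceeds the ceiling
over_limit = sorted(name for name, bq in samples.items() if bq > LIMIT_BQ_PER_KG)
print(over_limit)
```

The point is that these are not borderline readings; a 600 Bq/kg fish exceeds the current ceiling sixfold, and no amount of averaging across a catch makes that disappear.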

And the beat goes on

This information, and much similar to it, was all available before the CRS released its document, but the report also operates in a risibly artificial universe that assumes the situation at Fukushima Daiichi has basically stabilized. As a sampling of pretty much any week’s news will tell you, it has not. Take, for example, this week:

About 12 tons of water contaminated with radioactive strontium are feared to have leaked from the Fukushima No. 1 plant into the Pacific Ocean, Tepco said Thursday.

The leak occurred when a pipe broke off from a joint while the water was being filtered for cesium, Tokyo Electric Power Co. said.

The system doesn’t remove strontium, and most of the water apparently entered the sea via a drainage route, Tepco added.

The water contained 16.7 becquerels of cesium per cu. centimeter and tests are under way to determine how much strontium was in it, Tepco said.

This is the second such leak in less than two weeks, and as Kazuhiko Kudo, a professor of nuclear engineering at Kyushu University who visited Fukushima Daiichi twice last year, noted:

There will be similar leaks until Tepco improves equipment. The site had plastic pipes to transfer radioactive water, which Tepco officials said are durable and for industrial use, but it’s not something normally used at nuclear plants. Tepco must replace it with metal equipment, such as steel.

(The plastic tubes–complete with the vinyl and duct tape patch–can be viewed here.)

And would that the good people at the Congressional Research Service could have waited to read a report that came out the same day as theirs:

Radioactive material from the Fukushima nuclear disaster has been found in tiny sea creatures and ocean water some 186 miles (300 kilometers) off the coast of Japan, revealing the extent of the release and the direction pollutants might take in a future environmental disaster.

In some places, the researchers from Woods Hole Oceanographic Institution (WHOI) discovered cesium radiation hundreds to thousands of times higher than would be expected naturally, with ocean eddies and larger currents both guiding the “radioactive debris” and concentrating it.

Or would that the folks at CRS had looked to their fellow government agencies before they went off half-cocked. (The study above was done by researchers at Woods Hole and written up in the journal of the National Academy of Sciences.) In fact, it appears the CRS could have done that. In its report, CRS mentions that “Experts cite [Fukushima] as the largest recorded release of radiation to the ocean,” and the source for that point is a paper by Ken Buesseler–the same Ken Buesseler that was the oceanographer in charge of the WHOI study. Imagine what could have been if the Congressional Research Service had actually contacted the original researcher.

Can openers all around

Or perhaps it wouldn’t have mattered. For if there is one obvious takeaway from the CRS paper–one that reaches beyond its limits of scope and authority, and seeks to absolve it of all other oversights–it is its unfailing confidence in government oversight.

Take a gander at the section under the bolded question “Are there implications for US seafood safety?”:

It does not appear that nuclear contamination of seafood will be a food safety problem for consumers in the United States. Among the main reasons are that:

  • damage from the disaster limited seafood production in the affected areas,
  • radioactive material would be diluted before reaching U.S. fishing grounds, and
  • seafood imports from Japan are being examined before entry into the United States.

According to the U.S. Food and Drug Administration (FDA), because of damage from the earthquake and tsunami to infrastructure, few if any food products are being exported from the affected region. For example, according to the National Federation of Fisheries Cooperative Associations, the region’s fishing industry has stopped landing and selling fish. Furthermore, a fishing ban has been enforced within a 2-kilometer radius around the damaged nuclear facility.

So, the Food and Drug Administration is relying on the word of an industry group and a Japanese government-enforced ban that encompasses a two-kilometer radius–what link of that chain is supposed to be reassuring?

Last things first: two kilometers? Well, perhaps the CRS should hire a few proofreaders. A search of the source materials finds that the ban is supposed to cover a 20-kilometer radius. Indeed, the Japanese government quarantined the land within a 20-kilometer radius. The US suggested evacuation from a 50-mile (80-kilometer) radius. The CRS’s own report notes contaminated fish were collected 30 kilometers from Fukushima. So why is even 20 kilometers suddenly a radius to brag about?

As for a damaged industry not exporting, numerous reports show the Japanese government stepping in to remedy that “problem.” From domestic PR campaigns encouraging the consumption of foodstuffs from Fukushima prefecture, to the Japanese companies selling food from the region to other countries at deep discounts, to the Japanese government setting up internet clearing houses to help move tainted products, all signs point to a power structure that sees exporting possibly radioactive goods as essential to its survival.

The point on dilution, of course, not only ignores the way many large scale fishing operations work, it ignores airborne contamination and runs counter to the report’s own acknowledgment of bioaccumulation.

But maybe the shakiest assertion of all is that the US Food and Drug Administration will stop all contaminated imports at the water’s edge. While imports hardly represent the total picture when evaluating US seafood safety, even the small slice of the problem this covers raises eyebrows.

First there is the oft-referenced point from nuclear engineer Arnie Gundersen, who said last summer that State Department officials told him of a secret agreement between Japan and Secretary of State Hillary Clinton guaranteeing the continued importation of Japanese food. While independent confirmation of this pact is hard to come by, there is the plain fact that, beyond bans on milk, dairy products, fruits and vegetables from the Fukushima region issued in late March 2011, the US has proffered no other restrictions on Japanese food imports (and even those few restrictions were lifted for US military commissaries in September).

And perhaps most damning, there was the statement from an FDA representative last April declaring that North Pacific seafood was so unlikely to be contaminated that “no sampling or monitoring of our fish is necessary.” The FDA said at the time that it would rely on the National Oceanographic and Atmospheric Administration (NOAA) to tell it when they should consider testing seafood, but a NOAA spokesperson said it was the FDA’s call.

Good. Glad that’s been sorted out.

The Congressional Research Service report seems to fall victim to a problem noted often here–it assumes a can opener. As per the joke, the writers stipulate a functioning mechanism before explaining their solution. Just as many nuclear industry-watchers assume a functioning regulatory process (as opposed to a captured Nuclear Regulatory Commission, an industry-friendly Department of Energy, and industry-purchased members of Congress) when speaking of the hypothetical safety of nuclear power, the CRS here assumes an FDA interested first and foremost in protecting the general public, instead of an agency trying to strike some awkward “balance” between health, profit and politics. The can opener story is a joke; the effects of this real-life example are not.

Garbage in, garbage out

The Congressional Research Service, a part of the Library of Congress, is intended to function as the research and analysis wing of the US Congress. It is supposed to be objective, it is supposed to be accurate, and it is supposed to be authoritative. America needs the CRS to be all of those things because the agency’s words are expected to inform federal legislation. When the CRS shirks its responsibility, shapes its words to fit comfortably into the conventional wisdom, or shaves off the sharp corners to curry political favor, the impact is more than academic.

When the CRS limits its scope to avoid inconvenient truths, it bears false witness to the most important events of our time. When the CRS pretends other government agencies are doing their jobs–despite documentable evidence to the contrary–then it is not performing its own. And when the CRS issues a report that ignores the data and the science so that a few industries might profit, it is America that loses.

The authors of this particular report might not be around when the bulk of the cancers and defects tied to the radiation from Fukushima Daiichi present in the general population, but this paper’s integrity today could influence those numbers tomorrow. Bad, biased, or bowdlerized advice could scuttle meaningful efforts to make consequential policy.

If the policy analysts who sign their names to reports like this don’t want their work used for scrap paper, then maybe they should take a lesson from the Ryou-Un Maru. Going where the winds and currents take you makes you at best a curiosity, and more likely a nuisance–just so much flotsam and jetsam getting in the way of actual business. Works of note come with moral rudders, anchored to the best data available; without that, the report might as well just say “good luck.”

Looking Back at Our Nuclear Future


The Los Angeles Times heralds the nuclear age in January 1957. (photo via wikipedia)

On March 11, communities around the world commemorated the first year of the still-evolving Fukushima Daiichi nuclear disaster with rallies, marches, moments of silence, and numerous retrospective reports and essays (including one here). But 17 days later, another anniversary passed with much less fanfare.

It was in the early morning hours of March 28, 1979, that a chain of events at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania caused what is known as a “loss of coolant accident,” resulting in a partial core meltdown, a likely hydrogen explosion, the venting of some amount of radioisotopes into the air and the dumping of 40,000 gallons of radioactive waste water into the Susquehanna River. TMI (as it is sometimes abbreviated) is often called America’s worst commercial nuclear accident, and though the nuclear industry and its acolytes have worked long and hard to downplay any adverse health effects stemming from the mishap, the fact is that what happened in Pennsylvania 33 years ago changed the face and future of nuclear power.

The construction of new nuclear power facilities in the US was already in decline by the mid 1970s, but the Three Mile Island disaster essentially brought all new projects to a halt. There were no construction licenses granted to new nuclear plants from the time of TMI until February of this year, when the NRC gave a hasty go-ahead to two reactors slated for the Vogtle facility in Georgia. And though health and safety concerns certainly played a part in this informal moratorium, cost had at least an equal role. The construction of new plants proved more and more expensive, never coming in on time or on budget, and the cleanup of the damaged unit at Three Mile Island took 14 years and cost over $1 billion. Even with the Price-Anderson Act limiting the industry’s liability, nuclear power plants are considered such bad risks that no financing can be secured without federal loan guarantees.

In spite of that–or because of that–the nuclear industry has pushed steadily over the last three decades to wring every penny out of America’s aging reactors, pumping goodly amounts of their hefty profits into lobbying efforts and campaign contributions designed to capture regulators and elected officials and propagate the age-old myth of an energy source that is clean, safe, and, if not exactly “too cheap to meter,” at least impressively competitive with other options. The result is a fleet of over 100 reactors nearing the end of their design lives–many with documented dangers and potential pitfalls that could rival TMI–now seeking and regularly getting license extensions from the Nuclear Regulatory Commission while that same agency softens and delays requirements for safety upgrades.

And all of that cozy cooperation between government and big business goes on with the nuclear industry pushing the idea of a “nuclear renaissance.” In the wake of Fukushima, the industry has in fact increased its efforts, lobbying the US and British governments to downplay the disaster, and working with its mouthpieces in Congress and on the NRC to try to kill recommended new regulations and force out the slightly more safety-conscious NRC chair. And, just this month, the Nuclear Energy Institute, the chief nuclear trade group, moved to take their message to what might be considered a less friendly cohort, launching a splashy PR campaign by underwriting public radio broadcasts and buying time for a fun and funky 60-second animated ad on The Daily Show.

All of this is done with the kind of confidence that only comes from knowing you have the money to move political practice and, perhaps, public opinion. Three Mile Island is, to the industry, the exception that proves the rule–if not an out-and-out success. “No one died,” you will hear–environmental contamination and the latest surveys now showing increased rates of leukemia some 30 years later be damned–and that TMI is the only major accident in over half a century of domestic nuclear power generation.

Of course, this is not even remotely true–names like Browns Ferry, Cooper, Millstone, Indian Point and Vermont Yankee come to mind–but even if you discount plant fires and tritium leaks, Three Mile Island is not even America’s only meltdown.

There is, of course, the 1966 accident at Michigan’s Enrico Fermi Nuclear Generating Station, chronicled in the John Grant Fuller book We Almost Lost Detroit, but atom-lovers will dismiss this because Fermi 1 was an experimental breeder reactor, so it is not technically a “commercial” nuclear accident.

But go back in time another seven years–a full 20 before TMI–and the annals of nuclear power contain the troubling tale of another criticality accident, one that coincidentally is again in the news this week, almost 53 years later.

The Sodium Reactor Experiment

On July 12, 1957, the Sodium Reactor Experiment (SRE) at the Santa Susana Field Laboratory near Simi Valley, California, became the first US nuclear reactor to produce electricity for a commercial power grid. SRE was a sodium-cooled reactor designed by Atomics International, a division of North American Aviation, a company more often known by the name of its other subsidiary, Rocketdyne. Southern California Edison used the electricity generated by SRE to light the nearby town of Moorpark.

Sometime during July 1959–the exact date is still not entirely clear–a lubricant used to cool the seals on the pump system seeped into the primary coolant, broke down in the heat and formed a compound that clogged cooling channels. Whether out of curiosity or ignorance, operators continued to run the SRE despite wide fluctuations in core temperature and generating capacity.

Following a pattern that is now all too familiar, increased temperatures caused increased pressure, necessitating what was even then called a “controlled venting” of radioactive vapor. How much radioactivity was released into the environment is cause for some debate, for, in 1959, there was less monitoring and even less transparency. Current reconstructions, however, suggest the release may have been as much as 450 times greater than what was vented at Three Mile Island.

When the reactor was finally shut down and the fuel rods were removed (which was a trick in itself, as some were stuck and others broke), it was found that over a quarter of them showed signs of melting.

The SRE was eventually repaired and restarted in 1960, running on and off for another four years. Decommissioning began in 1976, and was finished in 1981, but the story doesn’t end there. Not even close.

Fifty-three years after a partial nuclear meltdown at the Santa Susana Field Laboratory site in the Chatsworth Hills, the U.S. Environmental Protection Agency has just released data finding extensive radioactive contamination still remains at the accident site.

“This confirms what we were worried about,” said Assemblywoman Julia Brownley, D-Oak Park, a long-time leader in the fight for a complete and thorough cleanup of this former Rocketdyne rocket engine testing laboratory. “This begins to answer critical questions about what’s still up there, where, how much, and how bad?”

Well, it sort of begins to answer it.

New soil samples weigh in at up to 1,000 times the radiation trigger levels (RTLs) agreed to when the Department of Energy struck a cleanup deal with the California Department of Toxic Substances in 2010. What’s more, these measurements follow two previous cleanup efforts by the DOE and Boeing, the company that now owns Santa Susana.

In light of the new findings, Assemblywoman Brownley has called on the DOE to comply with the agreement and do a real and thorough cleanup of the site. That means taking radiation levels down to the established natural background readings for the area. But that, as is noted by local reporter Michael Collins, “may be easier said than done”:

This latest U.S. EPA information appears to redefine what cleaning up to background actually is. Publicly available documents show that the levels of radiation in this part of Area IV where the SRE once stood are actually many thousands of times more contaminated than previously thought.

Just as troubling, the EPA’s RTLs, which are supposed to mirror the extensively tested and reported-on backgrounds of the numerous radionuclides at the site, were many times over the background threshold values (BTVs). So instead of cleaning up to background, much more radiation would be left in the ground, saving the government and lab owner Boeing millions in cleanup.

It is a disturbing tale of what Collins calls a kind of environmental “bait and switch” (of which he provides even more detail in an earlier report), but after a year of documenting the mis- and malfeasance of the nuclear industry and its supposed regulators, it is, to us here, anyway, not a surprising one.

To the atom-enamored, it is as if facts have a half-life all their own. The pattern is familiar: swear that an event is no big deal, then come back with revision after revision, each admitting a little bit more, in a seemingly never-ending regression toward what might approximately describe a terrible reality. It would be reminiscent of the “mom’s on the roof” joke if anyone actually believed that nuclear operators and their chummy government minders ever intended to eventually relay the truth.

Fukushima’s latest surprise

Indeed, that unsettling pattern is again visible in the latest news from Japan. This week saw revelations that radiation inside Fukushima Daiichi’s reactor 2 containment vessel clocked in at levels seriously higher than previously thought, while water levels are seriously lower.

An endoscopic camera, thermometer, water gauge and dosimeter were inserted into the number 2 reactor containment, and it documented radiation levels of up to 70 sieverts per hour–not only seven times the previous highest measurement, but 10 times what is generally considered a fatal dose (at 70 Sv/hr, a human would absorb a lethal dose in a matter of minutes).
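The arithmetic behind that fatal-dose comparison is simple enough to sketch. This is a back-of-envelope check, assuming the commonly cited round figure of roughly 7 sieverts as a fatal whole-body dose (a figure not taken from TEPCO’s report itself):

```python
# Back-of-envelope dose arithmetic (assumed figure: a whole-body
# dose of roughly 7 sieverts is generally taken to be fatal).
FATAL_DOSE_SV = 7.0         # approximate fatal whole-body dose
DOSE_RATE_SV_PER_HR = 70.0  # peak reading inside reactor 2 containment

# Time needed to accumulate a fatal dose at the measured rate
minutes_to_fatal = FATAL_DOSE_SV / DOSE_RATE_SV_PER_HR * 60
print(f"A fatal dose accumulates in about {minutes_to_fatal:.0f} minutes")
```

At 70 Sv/hr, that works out to roughly six minutes of exposure.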

The water level inside the containment vessel, estimated to be at 10 meters when the Japanese government declared a “cold shutdown” in December, turns out to be no more than 60 centimeters (about two feet).

This is disquieting news for many reasons. First, the high radiation not only makes it impossible for humans to get near the reactor, it makes current robotic technology impractical, as well. The camera, for instance, would only last 14 hours in those conditions. If the molten core is to be removed, a new class of radiation-resistant robots will have to be developed.

The extremely low water level signals more troubling scenarios. Though some experts believe that the fuel rods have melted down or melted through to such an extent that two feet of water can keep them covered, the shallow level more likely indicates a breach or breaches of the containment vessel. Plant workers, after all, have been pumping water into the reactor constantly for months now (why no one noticed that they kept having to add water to the system, or why no one cared, is plenty disturbing, as is the question of where all that extra water has gone).

Arnie Gundersen of nuclear engineering consultancy Fairewinds Associates believes that the level of water roughly corresponds with the lower lip of the vessel’s suppression pool–further evidence that reactor 2 suffered a hydrogen explosion, as did two other units at Fukushima. Gundersen also believes that the combination of heat, radioactivity and seawater likely degraded the seals on points where tubes and wires penetrated the structure–so even if there were no additional cracks from an explosion or the earthquake, the system is now almost certainly riddled with holes.

The holes pose a couple of problems: not only do they mean more contaminated water leaking into the environment, they also preclude filling the building with water to shield people and equipment from radiation. Combined with the elevated radiation readings, this will most certainly mean a considerably longer and more expensive cleanup.

And reactor 2 was considered the Fukushima unit in the best shape.

(Reactor 2 is also the unit that experienced a rapid rise in temperature and possible re-criticality in early February. TEPCO officials later attributed this finding to a faulty thermometer, but if one were skeptical of that explanation before, the new information about high radiation and low water levels should warrant a re-examination of February’s events.)

What does this all mean? Well, for Japan, it means injecting another $22 billion into Fukushima’s nominal owner, TEPCO–$12 billion just to stay solvent, and $10.2 billion to cover compensation for those injured or displaced by the nuclear crisis. That cash dump comes on top of the $18 billion already coughed up by the Japanese government, and is just a small down payment on what is estimated to be a $137 billion bailout of the power company.

It also means a further erosion of trust in an industry and a government already short on respect.

The same holds true in the US, where poor communication and misinformation left the residents of central Pennsylvania panicked and perturbed some 33 years ago, and the story is duplicated on varying scales almost weekly somewhere near one of America’s 104 aging and increasingly accident-prone nuclear reactors.

And, increasingly, residents and the state and local governments that represent them are saying “enough.” Whether it is the citizens and state officials from California’s Simi Valley demanding the real cleanup of a 53-year-old meltdown, or the people and legislature of Vermont facing off with the federal government on who has ultimate authority to assure that the next nuclear accident doesn’t happen in their backyard, Americans are looking at their future in the context of nuclear’s troubled past.

One year after Fukushima, 33 years after Three Mile Island, and 53 years after the Sodium Reactor Experiment, isn’t it time the US federal government did so, too?

As World Honors Fukushima Victims, NRC Gives Them a One-Fingered Salute

Sign from Fukushima commemoration and anti-nuclear power rally, Union Square Park, NYC, 3/11/12. (photo: G. Levine)

Nearly a week after the first anniversary of the Japanese earthquake and tsunami that started the crisis at the Fukushima Daiichi nuclear power facility, I am still sorting through the dozens of reports, retrospectives and essays commemorating the event. The sheer volume of material has been a little exhausting, but that is, of course, compounded by the weight of the subject. From reviewing the horrors of a year ago–now even more horrific, thanks to many new revelations about the disaster–to contemplating what lies ahead for residents of Japan and, indeed, the world, it is hard just to read about it; living it–then, now, and in the future–is almost impossible for me to fathom.

But while living with the aftermath might be hard to imagine, that such a catastrophe could and likely would happen was not. In fact, if there is a theme (beyond the suffering of the Japanese people) that runs through all the Fukushima look-backs, it is the predictability–the mountains of evidence that said Japan’s nuclear plants were vulnerable, and if nothing were done, a disaster (like the one we have today) should be expected.

I touched on this last week in my own anniversary examination, and now I see that Dawn Stover, contributing editor at The Bulletin of the Atomic Scientists, draws a similar comparison:

Although many politicians have characterized 3/11 and 9/11 as bizarre, near-impossible events that could not have been foreseen, in both cases there were clear but unheeded warnings. . . . In the case of 3/11, the nuclear plant’s operators ignored scientific studies showing that the risks of a tsunami had been dramatically underestimated. Japan’s “safety culture,” which asserted that accidents were impossible, prevented regulators from taking a hard look at whether emergency safety systems would function properly in a tsunami-caused station blackout.

Stover goes on to explain many points where the two nightmare narratives run parallel. She notes that while governments often restrict information, claiming the need to guard against mass panic, it is actually the officials themselves who are revealed to be in disarray. By contrast, in both cases, first responders behaved rationally and professionally, putting themselves at great risk in attempts to save others.

In both cases, communication–or, rather, the terrible lack of it–between sectors of government and between officials and responders exacerbated the crisis and put more lives at risk.

And with both 9/11 and 3/11, the public’s trust in government was shaken. And that crisis of trust was made worse by officials obscuring the facts and covering their tracks to save their own reputations.

But perhaps with that last point I am reading my own observations into hers rather than giving a straight retelling of Stover. Indeed, it is sad to note that Stover concludes her Fukushima think piece with a similar brand of CYA hogwash:

By focusing needed attention on threats to our existence, 3/11 and 9/11 have brought about some positive changes. The nuclear disaster in Japan has alerted nuclear regulators and operators around the world to the vulnerabilities of nuclear power plant cooling systems and will inevitably lead to better standards for safety and siting — and perhaps even lend a new urgency to the problem of spent fuel. Likewise, 9/11 resulted in new security measures and intelligence reforms that have thus far prevented another major terrorist attack in the United States and have created additional safeguards for nuclear materials.

When it comes to post-9/11 “security” and “intelligence reforms,” Stover is clearly out of her depth, and using the Bush-Cheney “no new attacks” fallacy frankly undermines the credibility of the entire essay. But I reference it here because it sets up a more important point.

If only Stover had taken a lesson from her own story. The Fukushima disaster has not alerted nuclear regulators and operators to vulnerabilities–as has been made clear here and in several of the post-Fukushima reports, those vulnerabilities were all well known, and known well in advance of 3/11/11.

But even if this were some great and grand revelation, some signal moment, some clarion call, what in the annals of nuclear power makes Stover or any other commentator think that call will be heard? “Inevitably lead to better standards”–inevitably? We’d all exit laughing if we weren’t running for our lives.

Look no further than the “coincidental” late-Friday, pre-anniversary news dump from the US Nuclear Regulatory Commission.

Late on March 9, 2012, two days before the earthquake and tsunami would be a year in the rear-view mirror, the NRC put on a big splashy show. . . uh, strike that. . . released a weirdly underplayed written announcement that the commission had approved a set of new rules drawing on lessons learned from the Fukushima crisis:

The Nuclear Regulatory Commission ordered major safety changes for U.S. nuclear power plants Friday. . . .

The orders require U.S. nuclear plants to install or improve venting systems to limit core damage in a serious accident and to install sophisticated equipment to monitor water levels in pools of spent nuclear fuel.

The plants also must improve protection of safety equipment installed after the 2001 terrorist attacks and make sure it can handle damage to multiple reactors at the same time.

Awwwrighty then, that sounds good, right? New rules, more safety, responsive to the Japanese disaster at last–but the timing instantly raised questions.

It didn’t take long to discover these were not the rules you were looking for.

First off, these are only some of the recommendations put before the commission by their Near-Term Task Force some ten months ago, and while better monitoring of water levels in spent fuel pools and plans to handle multiple disasters are good ideas, it has been noted that the focus on hardening the vents in Mark I and Mark II boiling water reactors actually misdiagnoses what really went wrong in two of the Fukushima Daiichi reactors.

It should also be noted that this represents less than half the recommendations in last summer’s report. Nor does it mandate a migration of spent fuel from pools to dry casks, an additional precaution not explicitly in the report, but stressed by NRC chief Gregory Jaczko, as well as by many industry watchdogs.

But most important–and glaring–of all, the language under which these rules passed could mean that almost none of them will ever be enforced.

This is a little technical, so let me turn to one of the few members of Congress who actually spends time worrying about this, Rep. Ed Markey (D MA-7):

While I am encouraged that the Commission supports moving forward with three of the most straightforward and quickly-issued nuclear safety Orders recommended by their own expert staff, I am disappointed that several Commissioners once again have rejected the regulatory justification that they are necessary for the adequate protection of nuclear reactors in this country. . . .

After the terrorist attacks of September 11, 2001, the NRC determined that some nuclear security upgrades were required to be implemented for the “adequate protection” of all U.S. nuclear reactors. This meant that nuclear reactors would not be considered to be sufficiently secure without these new measures, and that an additional cost-benefit “backfit” analysis would not be required to justify their implementation. The “adequate protection” concept is derived from the Atomic Energy Act of 1954, and is reflected in NRC’s “Backfit Rule” which specifies that new regulations for existing nuclear reactors are not required to include this extra cost-benefit “backfit” analysis when the new regulations are “necessary to ensure that the facility provides adequate protection to the health and safety of the public.”

Both the NRC Fukushima Task Force and the NRC staff who reviewed the Task Force report concluded that the new post-Fukushima safety recommendations, including the Orders issued today, were also necessary for the “adequate protection” of existing U.S. nuclear power plants, and that additional cost-benefit analysis should not be required to justify their implementation.

While Chairman Jaczko’s vote re-affirmed his support of all the Near-Term Task Force’s recommendations, including the need to mandate them all on the basis that they are necessary for the adequate protection of all U.S. nuclear power plants, Commissioner Svinicki did not do so for any of the Orders, Commissioner Magwood did not do so for two of the three Orders, and Commissioners Apostolakis and Ostendorff rejected that basis for one of the three. As a result, the Order requiring technologies to monitor conditions in spent nuclear fuel pools during emergencies will proceed using a different regulatory basis. More importantly, the inability of the Commission to unanimously accept its own staff’s recommendations on these most straightforward safety measures presents an ominous signal of the manner in which the more complicated next sets of safety measures will be considered.

In other words, last Friday’s move was regulatory kabuki. Because the commission declined to invoke the strictest language for the spent fuel pool order, plant operators will be allowed to delay compliance for years, if not excuse themselves from it completely, on the argument that the safety upgrade is too costly.

The other two rules are also on shaky ground, as it were. And even if by some miracle, the industry chose not to fight them, and the four uber-pro-nuclear commissioners didn’t throw up additional roadblocks, nothing is required of the nuclear facilities until December 31, 2016.

So, rather than it being a salutary moment, a tribute of sorts to the victims in Japan on the anniversary of their disaster, the announcement by the NRC stands more as an insult. It’s as if the US government is saying, “Sure, there are lessons to be learned here, but the profits of private energy conglomerates are more important than any citizen’s quaint notions of health and safety.”

As if any more examples were needed, these RINOs (rules in name only) demonstrate again that in America, as in Japan, the government is too close to the nuclear industry it is supposed to police.

And, for the bigger picture, as if any more examples were needed, be it before or after March 11, it really hasn’t been that hard to imagine the unimaginable. When an industry argues it has to forgo a margin of safety because of cost, there’s a good chance it was too dangerous and too expensive to begin with.

* * *

By way of contrast, take a look at some of the heartfelt expressions of commemoration and protest from New York’s Fukushima memorial and anti-nuclear rally, held last Sunday in Union Square Park.