Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC judges more likely than the 2011 tsunami was thought to be–the flood at Oconee would trigger failures at all three reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.
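The linear, no-threshold model Dr. Sakiyama described can be stated in a few lines. The sketch below is purely illustrative: the 5-percent-per-sievert slope is a commonly cited round figure, assumed here for demonstration, not a number presented at the symposium.

```python
# Minimal sketch of the linear, no-threshold (LNT) model: excess risk
# scales linearly with dose, with no "safe" floor below which risk
# drops to zero. The slope is an assumed illustrative figure.

RISK_PER_SV = 0.05  # assumed excess lifetime cancer risk per sievert

def excess_risk(dose_sv: float) -> float:
    """LNT: any nonzero dose carries a proportional, nonzero excess risk."""
    if dose_sv < 0:
        raise ValueError("dose cannot be negative")
    return RISK_PER_SV * dose_sv

# No threshold: halving the dose halves the risk, but never zeroes it.
for dose in (0.001, 0.01, 0.1, 1.0):  # 1 mSv up to 1 Sv
    print(f"{dose * 1000:7.1f} mSv -> excess risk {excess_risk(dose):.5f}")
```

The point of the model is visible in the loop: there is no dose at which the computed risk snaps to zero, which is exactly the claim that exposure limits set by governments cannot be justified on purely scientific grounds.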

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously increased what it called acceptable exposure quite soon after the start of the Fukushima crisis, and, as global background radiation levels increase as a result of the disaster, it is feared this will ratchet up what is considered “safe” in the United States, as the US tends to discuss limits in terms of exposure beyond annual average background radiation. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant and radiation biologist Ian Fairlie observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is lots of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what was–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.
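The difference in persistence is easy to quantify from the isotopes’ published half-lives (about 8 days for I-131, about 30 years for Cs-137) using the standard exponential-decay arithmetic:

```python
# Sketch of why iodine-131 fades within months while cesium-137 lingers
# for generations. Half-lives are the standard published values:
# I-131 ~8.02 days, Cs-137 ~30.17 years.

def fraction_remaining(t_days: float, half_life_days: float) -> float:
    """Fraction of a radioisotope left after t_days of decay."""
    return 0.5 ** (t_days / half_life_days)

I131_HALF_LIFE = 8.02               # days
CS137_HALF_LIFE = 30.17 * 365.25    # ~30.17 years, in days

one_year = 365.25
print(f"I-131 left after one year:  {fraction_remaining(one_year, I131_HALF_LIFE):.2e}")
print(f"Cs-137 left after one year: {fraction_remaining(one_year, CS137_HALF_LIFE):.3f}")
```

After a single year, essentially none of the original I-131 remains, while well over 95 percent of the Cs-137 is still there–which is why cesium, not iodine, defines the long-term monitoring problem.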

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also outpacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add an additional 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137–or, to put it another way, even the low-end estimate has Fukushima releasing more than 700 times as much radioactive cesium as the Hiroshima bomb.
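Those unit prefixes are easy to trip over, so here is a minimal sketch that pins down the conversions between the becquerel figures quoted above (the quantities are the article’s own round numbers):

```python
# Pinning down the becquerel prefixes used in the Cs-137 comparison.
# All quantities below are the round figures quoted in the text.

BQ = 1
TBQ = 10**12 * BQ   # terabecquerel
PBQ = 10**15 * BQ   # petabecquerel

chernobyl_cs137 = 84 * PBQ
fukushima_low, fukushima_high = 64 * PBQ, 114 * PBQ
hiroshima_cs137 = 89 * TBQ
daily_release = 240 * 10**6 * BQ    # 240 million Bq per day

print(f"1 PBq = {PBQ // TBQ} TBq")
print(f"Hiroshima's Cs-137 expressed in PBq: {hiroshima_cs137 / PBQ}")
print(f"Fukushima estimate range in TBq: {fukushima_low // TBQ} to {fukushima_high // TBQ}")
```

Expressed on a common scale, the Hiroshima bomb’s cesium release is a small fraction of a single petabecquerel, which is what makes the comparison to the multi-petabecquerel reactor releases so stark.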

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at the Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” and that the questions should proceed from the assumed necessity of atomic energy rather than from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it isn’t a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without a real temporary storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear power does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

The Long, Long Con: Seventy Years of Nuclear Fission; Thousands of Centuries of Nuclear Waste

From here to eternity: a small plaque on the campus of the University of Chicago commemorates the site of Fermi’s first atomic pile–and the start of the world’s nuclear waste problem. (Photo: Nathan Guy via Flickr)

On December 2, 1942, a small group of physicists under the direction of Enrico Fermi gathered on an old squash court beneath the stands of Stagg Field at the University of Chicago to make and witness history. Uranium pellets and graphite blocks had been stacked around cadmium-coated rods as part of an experiment crucial to the Manhattan Project–the program tasked with building an atom bomb for the Allied forces in WWII. The experiment was successful, and for 28 minutes, the scientists and dignitaries present observed the world’s first manmade, self-sustaining nuclear fission reaction. They called it an atomic pile–Chicago Pile 1 (CP-1), to be exact–but what Fermi and his team had actually done was build the world’s first nuclear reactor.

The Manhattan Project’s goal was a bomb, but soon after the end of the war, scientists, politicians, the military and private industry looked for ways to harness the power of the atom for civilian use, or, perhaps more to the point, for commercial profit. Fifteen years to the day after CP-1 achieved criticality, President Dwight Eisenhower threw a ceremonial switch to start the reactor at Shippingport, PA, which was billed as the first full-scale nuclear power plant built expressly for civilian electrical generation.

Shippingport was, in reality, little more than a submarine engine on blocks, but the nuclear industry and its acolytes will say that it was the beginning of billions of kilowatts of power, promoted (without a hint of irony) as “clean, safe, and too cheap to meter.” It was also, however, the beginning of what is now a, shall we say, weightier legacy: 72,000 tons of nuclear waste.

Atoms for peace, problems forever

News of Fermi’s initial success was communicated by physicist Arthur Compton to the head of the National Defense Research Committee, James Conant, with artistically coded flair:

Compton: The Italian navigator has landed in the New World.
Conant: How were the natives?
Compton: Very friendly.

But soon after that initial success, CP-1 was disassembled and reassembled a short drive away, in Red Gate Woods. The optimism of the physicists notwithstanding, it was thought best to continue the experiments with better radiation shielding–and slightly removed from the center of a heavily populated campus. The move was perhaps the first necessitated by the uneasy relationship between fissile material and the health and safety of those around it, but if it was understood as a broader cautionary tale, no one let that get in the way of “progress.”

A stamp of approval: the US Postal Service commemorated Eisenhower’s initiative in 1955.

By the time the Shippingport reactor went critical, North America already had a nuclear waste problem. The detritus from manufacturing atomic weapons was poisoning surrounding communities at several sites around the continent (not that most civilians knew it at the time). Meltdowns at Chalk River in Canada and the Experimental Breeder Reactor in Idaho had required fevered cleanups, the former of which included the help of a young Navy officer named Jimmy Carter. And the dangers of errant radioisotopes were increasing with the acceleration of above-ground atomic weapons testing. But as President Eisenhower extolled “Atoms for Peace,” and the US Atomic Energy Commission promoted civilian nuclear power at home and abroad, no plan was made for the “spent fuel” (as used nuclear fuel rods are termed) and other highly radioactive leftovers. The program went no further than extracting some of the plutonium produced by the fission reaction for bomb production, plus a promise that the waste generated by US-built reactors overseas could at some point be marked “return to sender” and repatriated to the United States for disposal.

Attempts at what was called “reprocessing”–the re-refining of used uranium into new reactor fuel–quickly proved expensive, inefficient and dangerous, creating as much radioactive waste as they hoped to reuse. Reprocessing also provided an obvious avenue for nuclear weapons proliferation because of the resulting production of plutonium. The threat of proliferation (made flesh by India’s test of an atomic bomb in 1974) led President Jimmy Carter to cancel the US reprocessing program in 1977. Attempts by the Department of Energy to push mixed-oxide (MOX) fuel fabrication (combining uranium and plutonium) over the last dozen years have not produced any results, either, despite over $5 billion in government investments.

In fact, there was no official federal policy for the management of used but still highly radioactive nuclear fuel until passage of The Nuclear Waste Policy Act of 1982. And while that law acknowledged the problem of thousands of tons of spent fuel accumulating at US nuclear plants, it didn’t exactly solve it. Instead, the NWPA started a generation of political horse trading, with goals and standards defined more by market exigencies than by science, that leaves America today with what amounts to more than five dozen nominally temporary repositories for high-level radioactive waste–and no defined plan to change that situation anytime soon.

When you assume…

When a US Court of Appeals ruled in June that the Nuclear Regulatory Commission acted improperly when it failed to consider all the risks of storing spent radioactive fuel onsite at the nation’s nuclear power facilities, it made specific reference to the lack of any real answers to the generations-old question of waste storage:

[The Nuclear Regulatory Commission] apparently has no long-term plan other than hoping for a geologic repository. . . . If the government continues to fail in its quest to establish one, then SNF (spent nuclear fuel) will seemingly be stored on site at nuclear plants on a permanent basis. The Commission can and must assess the potential environmental effects of such a failure.

The court concluded the current situation–where spent fuel is stored across the country in what were supposed to be temporary configurations–“poses a dangerous long-term health and environmental risk.”

The decision also harshly criticized regulators for evaluating plant relicensing with the assumption that spent nuclear fuel would be moved to a central long-term waste repository.

A mountain of risks

The Nuclear Waste Policy Act set in motion an elaborate process that was supposed to give the US a number of possible waste sites, but, in the end, the only option seriously explored was the Yucca Mountain site in Nevada. After years of preliminary construction and billions of dollars spent, Yucca was determined to be a bad choice for the waste:

[Yucca Mountain’s] volcanic formation is more porous and less isolated than originally believed–there is evidence that water can seep in, there are seismic concerns, worries about the possibility of new volcanic activity, and a disturbing proximity to underground aquifers. In addition, Yucca Mountain has deep spiritual significance for the Shoshone and Paiute peoples.

Every major Nevada politician on both sides of the aisle has opposed the Yucca repository since its inception. Senate Majority Leader Harry Reid has worked most of his political life to block the facility. And with the previous NRC head, Gregory Jaczko, (and now his replacement, Allison Macfarlane, as well) recommending against it, the Obama administration’s Department of Energy moved to end the project.

Even if it were an active option, Yucca Mountain would still be many years and maybe as much as $100 billion away from completion. And yet, the nuclear industry (through recipients of its largesse in Congress) has challenged the administration to spend any remaining money in a desperate attempt to keep alive the fantasy of a solution to their waste crisis.

Such fevered dreams, however, do not qualify as an actual plan, according to the courts.

The judges also chastised the NRC for its generic assessment of spent fuel pools, currently packed well beyond their projected capacity at nuclear plants across the United States. Rather than examine each facility and the potential risks specific to its particular storage situation, the NRC had only evaluated the safety risks of onsite storage by looking at a composite of past events. The court ruled that the NRC must appraise each plant individually and account for potential future dangers. Those dangers include leaks, loss of coolant, and failures in the cooling systems, any of which might result in contamination of surrounding areas, overheating and melting of stored rods, and the potential of burning radioactive fuel–risks heightened by the large amounts of fuel now densely packed in the storage pools and underscored by the ongoing disaster at Japan’s Fukushima Daiichi plant.

Indeed, plants were neither designed nor built to house nuclear waste long-term. The design life of most reactors in the US was originally 40 years. Discussions of the spent fuel pools usually gave them a 60-year lifespan. That limit seemed to double almost magically as nuclear operators fought to postpone the expense of moving cooler fuel to dry casks and of the final decommissioning of retired reactors.

Everyone out of the pool

As disasters as far afield as the 2011 Tohoku earthquake and last October’s Hurricane Sandy have demonstrated, the storage of spent nuclear fuel in pools requires steady supplies of power and cool water. Any problem that prevents the active circulation of liquid through the spent fuel pools–be it a loss of electricity, the failure of a back-up pump, the clogging of a valve or a leak in the system–means the temperature in the pools will start to rise. If the cooling circuit is out long enough, the water in the pools will start to boil. If the water level dips (due to boiling or a leak) enough to expose hot fuel rods to the air, the metal cladding on the rods will start to burn, in turn heating the fuel even more, resulting in plumes of smoke carrying radioactive isotopes into the atmosphere.
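The heat-up described above is simple to estimate with a back-of-envelope energy balance. Every number in this sketch is an assumption chosen for illustration–a generic pool holding about 1,000 tonnes of water and absorbing 1 MW of decay heat–not data from any specific plant:

```python
# Back-of-envelope estimate of how fast an uncooled spent fuel pool heats
# up. All parameters are illustrative assumptions, not plant data.

WATER_MASS_KG = 1_000_000    # assumed pool inventory: ~1,000 tonnes of water
SPECIFIC_HEAT = 4186         # J/(kg*K), specific heat of water
DECAY_HEAT_W = 1_000_000     # assumed decay heat load: 1 MW
START_C, BOIL_C = 30.0, 100.0

def hours_to_boil(mass_kg: float, heat_w: float, start_c: float = START_C) -> float:
    """Hours for stagnant water to reach boiling, ignoring heat losses."""
    energy_needed = mass_kg * SPECIFIC_HEAT * (BOIL_C - start_c)  # joules
    return energy_needed / heat_w / 3600

print(f"~{hours_to_boil(WATER_MASS_KG, DECAY_HEAT_W):.0f} hours to reach boiling at 1 MW")
# A denser, hotter pool produces more decay heat and boils proportionally sooner:
print(f"~{hours_to_boil(WATER_MASS_KG, 4 * DECAY_HEAT_W):.0f} hours at an assumed 4 MW")
```

Under these assumptions the margin is measured in days, not weeks–and, as the next paragraph notes, densely packed pools shrink that margin further because the same water inventory absorbs several times the decay heat.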

And because these spent fuel pools are so full–containing as much as five times more fuel than they were originally designed to hold, and at densities that come close to those in reactor cores–they both heat stagnant water more quickly and reach volatile temperatures faster when exposed to air.

A spent fuel pool and dry casks. (Both photos courtesy of the US Nuclear Regulatory Commission)

After spent uranium has been in a pool for at least five years (considerably longer than most fuel is productive as an energy source inside the reactor), fuel rods are deemed cool enough to be moved to dry casks. Dry casks are sealed steel cylinders filled with spent fuel and inert gas, which are themselves encased in another layer of steel and concrete. These massive fuel “coffins” are then placed outside, spaced on concrete pads, so that air can circulate and continue to disperse heat.

While the long-term safety of dry casks is still in question, the fact that they require no active cooling system gives them an advantage, in the eyes of many experts, over pool storage. As if to highlight that difference, spent fuel pools at Fukushima Daiichi have posed some of the greatest challenges since the March 2011 earthquake and tsunami, whereas, to date, no quake or flood-related problems have been reported with any of Japan’s dry casks. The disparity was so obvious that the NRC’s own staff review actually added a proposal to the post-Fukushima taskforce report, recommending that US plants take more fuel out of spent fuel pools and move it to dry casks. (A year-and-a-half later, however, there is still no regulation–or even a draft–requiring such a move.)

But current dry cask storage poses its own set of problems. Moving fuel rods from pools to casks is slow and costly–about $1.5 million per cask, or roughly $7 billion to move all of the nation’s spent fuel (a process, it is estimated, that would take no less than five to ten years). That is expensive enough to have many nuclear plant operators lobbying overtime to avoid doing it.

Further, though not as seemingly vulnerable as fuel pools, dry casks are not impervious to natural disaster. In 2011, a moderate earthquake centered about 20 miles from the North Anna, Virginia, nuclear plant caused most of its vertical dry casks–each weighing 115 tons–to shift, some by more than four inches. The facility’s horizontal casks didn’t move, but some showed what was termed “cosmetic damage.”

Dry casks at Michigan’s Palisades plant sit on a pad atop a sand dune just 100 yards from Lake Michigan. An earthquake there could plunge the casks into the water. And the casks at Palisades are so poorly designed and maintained, submersion could result in water contacting the fuel, contaminating the lake and possibly triggering a nuclear chain reaction.

And though each cask contains far less fissile material than one spent fuel pool, casks are still considered possible targets for terrorism. A TOW anti-tank missile would breach even the best dry cask (PDF), and with 25 percent of the nation’s spent fuel now stored in hundreds of casks across the country, all above ground, the result is a rich target environment.

Confidence game

Two months after the Appeals Court found fault with the Nuclear Regulatory Commission’s imaginary waste mitigation scenario, the NRC announced it would suspend the issuing of new reactor operating licenses, license renewals and construction licenses until the agency could craft a new plan for dealing with the nation’s growing spent nuclear fuel crisis. In drafting its new nuclear “Waste Confidence Decision” (NWCD)–the methodology used to assess the hazards of nuclear waste storage–the Commission said it would evaluate all possible options for resolving the issue.

At first, the NRC said this could include both generic and site-specific actions (remember, the court criticized the NRC’s generic appraisals of pool safety), but as the prescribed process now progresses, it appears any new rule will be designed to give the agency, and so, the industry, as much wiggle room as possible. At a public hearing in November, and later at a pair of web conferences in early December, the regulator’s Waste Confidence Directorate (yes, that’s what it is called) outlined three scenarios (PDF) for any future rulemaking:

  • Storage until a repository becomes available at the middle of the century
  • Storage until a repository becomes available at the end of the century
  • Continued storage in the event a repository is not available

And while, given the current state of affairs, the first option seems optimistic, the fact that their best scenario now projects a repository to be ready by about 2050 is a story in itself.

When the Nuclear Waste Policy Act was signed into law by President Reagan early in 1983, it was expected the process it set in motion would produce at least one (and preferably a second) long-term repository by the late 1990s. But by the time the “Screw Nevada Bill” (as it is affectionately known in the Silver State) locked in Yucca Mountain as the only option for permanent nuclear waste storage, the projected opening was pushed back to 2007.

But Yucca encountered problems from its earliest days, so a mid-’90s revision of the timeline postponed the official start, this time to 2010. By 2006, the Department of Energy was pegging Yucca’s opening at 2017. And, when the NWPA was again revised in 2010–after Yucca was deemed a non-option–it conveniently avoided setting a date for the opening of a national long-term waste repository altogether.

It was that 2010 revision that was thrown out by the courts in June.

“Interim storage” and “likely reactors”

So, the waste panel now has three scenarios–but what are the underlying assumptions for those scenarios? Not, obviously, any particular site for a centralized, permanent home for the nation’s nuclear garbage–no new site has been chosen, and it can’t even be said there is an active process at work that will choose one.

There are the recommendations of a Blue Ribbon Commission (BRC) convened by the president after Yucca Mountain was off the table. Most notable there was a recommendation for interim waste storage, consolidated at a handful of locations across the country. But consolidated intermediate waste storage has its own difficulties, not the least of which is that no sites have yet been chosen for any such endeavor. (In fact, plans for the Skull Valley repository, thought to be the interim facility closest to approval, were abandoned by its sponsors just days before Christmas.)

Just-retired New Mexico Senator Jeff Bingaman (D), the last chair of the Energy and Natural Resources Committee, tried to turn the BRC recommendations into law. When he introduced his bill in August, however, he had to do so without any cosponsors. Hearings on the Nuclear Waste Administration Act of 2012 were held in September, but the gavel came down on the 112th Congress without any further action.

In spite of the underdeveloped state of intermediate storage, however, when the waste confidence panel was questioned on the possibility, interim waste repositories seemed to emerge, almost on the fly, as an integral part of any revised waste policy rule.

“Will any of your scenarios include interim centralized above-ground storage?” we asked during the last public session. Paul Michalak, who heads the Environmental Impact Statement branch of the Waste Confidence Directorate, first said temporary sites would be considered in the second and third options. Then, after a short pause, Mr. Michalak added (PDF p40), “First one, too. All right. Right. That’s right. So we’re considering an interim consolidated storage facility [in] all three scenarios.”

The lack of certainty on any site or sites is, however, not the only fuzzy part of the picture. As mentioned earlier, the amount of high-level radioactive waste currently on hand in the US and in need of a final resting place is upwards of 70,000 tons–already at the amount that was set as the initial limit for the Yucca Mountain repository. Given that there are still over 100 domestic commercial nuclear reactors more or less in operation, producing something like an additional 2,000 tons of spent fuel every year, what happens to the Waste Confidence Directorate’s scenarios as the years and waste pile up? How much waste were regulators projecting they would have to deal with–how much spent fuel would a waste confidence decision assume the system could confidently handle?

There was initial confusion on what amount of waste–and at what point in time–was informing the process. Pressed for clarification on the last day of hearings, NRC officials finally posited that it was assumed there would be 150,000 metric tons of spent fuel–all deriving from the commercial reactor fleet–by 2050. By the end of the century, the NRC expects to face a mountain of waste weighing 270,000 metric tons (PDF pp38-41) (though this figure was perplexingly termed both a “conservative number” and an “overestimate”).

How did the panel arrive at these numbers? Were they assuming all 104 (soon to be 103–Wisconsin’s Kewaunee Power Station will shut down by mid-2013 for reasons its owner, Dominion Resources, says are based “purely on economics”) commercial reactors nominally in operation would continue to function for that entire time frame–even though many are nearing the end of their design life and none are licensed to continue operation beyond the 2030s? Were they counting reactors like those at San Onofre, which have been offline for almost a year, and are not expected to restart anytime soon? Or the troubled reactors at Ft. Calhoun in Nebraska and Florida’s Crystal River? Neither facility has been functional in recent years, and both have many hurdles to overcome if they are ever to produce power again. Were they factoring in the projected AP1000 reactors in the early stages of construction in Georgia, or the ones slated for South Carolina? Did the NRC expect more or fewer reactors generating waste over the course of the next 88 years?

The response: waste estimates include all existing facilities, plus “likely reactors”–but the NRC cannot say exactly how many reactors that is (PDF p41).

Jamming it through

Answers like those from the Waste Confidence Directorate do not inspire (pardon the expression) confidence for a country looking at a mountain of eternally toxic waste. Just what would the waste confidence decision (and the environmental impact survey that should result from it) actually cover? What would it mandate, and what would change as a result?

How long is it? Does this NRC chart provide a justification for the narrow scope of the waste confidence process? (US Nuclear Regulatory PDF, p12)

In past relicensing hearings–where the public could comment on proposed license extensions on plants already reaching the end of their 40-year design life–objections based on the mounting waste problem and already packed spent fuel pools were waved off by the NRC, which referenced the waste confidence decision as the basis of its rationale. Yet, when discussing the parameters of the process for the latest, court-ordered revision to the NWCD, Dr. Keith McConnell, Director of the Waste Confidence Directorate, asserted that waste confidence was not connected to the site-specific licensed life of operations (PDF p42), but only to a period defined as “Post-Licensed Life Storage” (which appears, if a chart in the directorate’s presentation (PDF p12) is to be taken literally, to extend from 60 years after the initial creation of waste, to 120 years–at which point a phase labeled “Disposal” begins). Issues of spent fuel pool and dry cask safety are the concerns of a specific plant’s relicensing process, said regulators in the latest hearings.

“It’s like dealing with the Mad Hatter,” commented Kevin Kamps, a radioactive waste specialist for industry watchdog Beyond Nuclear. “Jam yesterday, jam tomorrow, but never jam today.”

The edict originated with the White Queen in Lewis Carroll’s Through the Looking-Glass, but it is all too appropriate–and no less maddening–when trying to motivate meaningful change at the Nuclear Regulatory Commission. The NRC has used the nuclear waste confidence decision in licensing inquiries, but in these latest scoping hearings, we are told the NWCD does not apply to on-site waste storage. The Appeals Court criticized the lack of site-specificity in the waste storage rules, but the directorate says they are now only working on a generic guideline. The court disapproved of the NRC’s continued relicensing of nuclear facilities based on the assumption of a long-term geologic repository that in reality did not exist–and the NRC said it was suspending licensing pending a new rule–but now regulators say they don’t anticipate the denial or even the delay of any reactor license application while they await the new waste confidence decision (PDF pp49-50).

In fact, the NRC has continued the review process on pending applications, even though there is now no working NWCD–something deemed essential by the courts–against which to evaluate new licenses.

The period for public comment on the scope of the waste confidence decision ended January 2, and no more scoping hearings are planned. There will be other periods for civic involvement–during the environmental impact survey and rulemaking phases–but, with each step, the areas open to input diminish. And the current schedule has the entire process greatly accelerated over previous revisions.

On January 3, a coalition of 24 grassroots environmental groups filed documents with the Nuclear Regulatory Commission (PDF) protesting “the ‘hurry up’ two-year timeframe” for this assessment, noting the time allotted for environmental review falls far short of the 2019 estimate set by the NRC’s own technical staff. The coalition observed that two years was also not enough time to integrate post-Fukushima recommendations, and that the NRC was narrowing the scope of the decision–ignoring specific instructions from the Appeals Court–in order to accelerate the drafting of a new waste storage rule.

Speed might seem a valuable asset if the NRC were shepherding a Manhattan Project-style push for a solution to the ever-growing waste problem–the one that began with the original Manhattan Project–but that is not what is at work here. Instead, the NRC, under court order, is trying to set the rules for determining the risk of all that high-level radioactive waste if there is no new, feasible solution. The NRC is looking for a way to permit the continued operation of the US nuclear fleet–and so the continued manufacture of nuclear waste–without an answer to the bigger, pressing question.

A plan called HOSS

While there is much to debate about what a true permanent solution to the nuclear waste problem might look like, there is little question that the status quo is unacceptable. Spent fuel pools were never intended to be used as they are now–re-racked and densely packed with over a generation of fuel assemblies. Both the short- and long-term safety and security of the pools have now been questioned by the courts and laid bare by reality. Pools at numerous US facilities have leaked radioactive waste (PDF) into rivers, groundwater and soil. Sudden “drain downs” have come perilously close to triggering major accidents in plants shockingly close to major population centers. Recent hurricanes have knocked out power to cooling systems and flooded backup generators, and last fall’s superstorm came within inches of overwhelming the coolant intake structure at Oyster Creek in New Jersey.

The crisis at Japan’s Fukushima Daiichi facility was so dangerous and remains dangerous to this day in part because of the large amounts of spent fuel stored in pools next to the reactors but outside of containment–a design shared by 35 US nuclear reactors. A number of these GE Mark 1 Boiling Water Reactors–such as Oyster Creek and Vermont Yankee–have more spent fuel packed into their individual pools than all the waste in Fukushima Daiichi Units 1, 2, 3, and 4 combined.

Dry casks, the obvious next “less-bad” option for high-level radioactive waste, were also never intended as a permanent solution. The design requirements and manufacturing regulations of casks–especially the earliest generations–do not guarantee their reliability anywhere near the 100 to 300 years now being casually tossed around by NRC officials. Some of the nation’s older dry casks (which in this case means 15 to 25 years) have already shown seal failures and structural wear (PDF). Yet, the government does not require direct monitoring of casks for excessive heat or radioactive leaks–only periodic “walkthroughs.”

Add in the reluctance of plant operators to spend money on dry cask transfer and the lack of any workable plan to quickly remove radioactive fuel from failed casks, and dry cask storage also appears to fail to attain any court-ordered level of confidence.

Interim plans, such as regional consolidated above-ground storage, remain just that–plans. There are no sites selected and no designs for such a facility up for public scrutiny. What is readily apparent, though, is that the frequent transport of nuclear waste increases the risk of nuclear accidents. There does not, as of now, exist a transfer container that is wholly leak proof, accident proof, and impervious to terrorist attack. Moving high-level radioactive waste across the nation’s highways, rail lines and waterways has raised fears of “Mobile Chernobyls” and “Floating Fukushimas.”

More troubling still, if past (and present) is prologue, is the tendency of options designed as “interim” to morph into a default “permanent.” Can the nation afford to kick the can once more, spending tens (if not hundreds) of millions of dollars on a “solution” that will only add a collection of new challenges to the existing roster of problems? What will the interim facilities become beyond the next problem, the next site for costly mountains of poorly stored, dangerous waste?

Hardened: The more robust HOSS option as proposed in 2003. (From “Robust Storage of Spent Nuclear Fuel: A Neglected Issue of Homeland Security” courtesy of the Nuclear Information and Resource Service)

If there is an interim option favored by many nuclear experts, engineers and environmentalists (PDF), it is something called HOSS–Hardened On-Site Storage (PDF). HOSS is a version of dry cask storage that is designed and manufactured to last longer, is better protected against leaks and better shielded from potential attacks. Proposals (PDF) involve steel, concrete and earthen barriers incorporating proper ventilation and direct monitoring for heat and radiation.

But not all reactor sites are good candidates for HOSS. Some are too close to rivers that regularly flood, some are vulnerable to the rising seas and increasingly severe storms brought on by climate change, and others are close to active geologic fault zones. For facilities where hardened on-site storage would be an option, nuclear operators will no doubt fight the requirements because of the increased costs above and beyond the price of standard dry cask storage, which most plant owners already try to avoid or delay.

The first rule of holes

Mixed messages: A simple stone marker in Red Gate Woods, just outside Chicago, tries to both warn and reassure visitors to this public park. (Photo: Kevin Kamps, Beyond Nuclear. Used by permission.)

In a wooded park just outside Chicago sits a dirt mound, near a bike path, that contains parts of the still-highly radioactive remains of CP-1, the world’s first atomic pile. Seven decades after that nuclear fuel was first buried, many health experts would not recommend that spot (PDF) for a long, languorous picnic, nor would they recommend drinking from nearby water fountains. To look at it in terms Arthur Compton might favor, when it comes to the products of nuclear chain reactions, the natives are restless. . . and will remain so for millennia to come.

One can perhaps forgive those working in the pressure cooker of the Manhattan Project and in the middle of a world war for ignoring the forest for the trees–for not considering waste disposal while pursuing a self-sustaining nuclear chain reaction. Perhaps. But, as the burial mound in Red Gate Woods reminds us, ignoring a problem does not make it go away.

And if that small pile, or the mountains of spent fuel precariously stored around the nation are not enough of a prompt, the roughly $960 million that the federal government has had to pay private nuclear operators should be. For every year that the Department of Energy does not provide a permanent waste repository–or at least some option that takes the burden of storing spent nuclear fuel off the hands (and off the books) of power companies–the government is obligated to reimburse the industry for the costs of onsite waste storage. By 2020, it is estimated that $11 billion in public money will have been transferred into the pockets of private nuclear companies. By law, these payments cannot be drawn from the ratepayer-fed fund that is earmarked for a permanent geologic repository, and so, these liabilities must be paid out of the federal budget. Legal fees for defending the DoE against these claims will add another 20 to 30 percent to settlement costs.

The Federal Appeals Court, too, has sent a clear message that the buck needs to stop somewhere at some point–and that such a time and place should be both explicit and realistic. The nuclear waste confidence scoping process, however, is already giving the impression that the NRC’s next move will be generic and improbable.

The late, great Texas journalist Molly Ivins once remarked, “The first rule of holes” is “when you’re in one, stop digging.” For high-level radioactive waste, that hole is now a mountain, over 70 years in the making and over 70,000 tons high. If the history of the atomic age is not evidence enough, the implications of the waste confidence decision process put the current crisis in stark relief. There is, right now, no good option for dealing with the nuclear detritus currently on hand, and there is not even a plan to develop a good option in the near future. Without a way to safely store the mountain of waste already created, under what rationale can a responsible government permit the manufacture of so much more?

The federal government spends billions to perpetuate and protect the nuclear industry–and plans to spend billions more to expand the number of commercial reactors. Dozens of facilities already are past, or are fast approaching, the end of their design lives, but the Nuclear Regulatory Commission has yet to reject any request for an operating license extension–and it is poised to approve many more, nuclear waste confidence decision notwithstanding. Plant operators continue to balk at any additional regulations that would require better waste management.

The lesson of the first 70 years of fission is that we cannot endure more of the same. The government–from the DoE to the NRC–should reorient its priorities from creating more nuclear waste to safely and securely containing what is now here. Money slated for subsidizing current reactors and building new ones would be better spent on shuttering aging plants, designing better storage options for their waste, modernizing the electrical grid, and developing sustainable energy alternatives. (And reducing demand through conservation programs should always be part of the conversation.)

Enrico Fermi might not have foreseen (or cared about) the mountain of waste that began with his first atomic pile, but current scientists, regulators and elected officials have the benefit of hindsight. If the first rule of holes says stop digging, then the dictum here should be that when you’re trying to summit a mountain, you don’t keep shoveling more garbage on top.

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

LIPA’s Nuclear Legacy Leaves Sandy’s Survivors in the Dark

Head of Long Island Power Authority Steps Aside as Governor Convenes Special Commission, But Problems Have Deep Roots

The decommissioned Shoreham Nuclear Power Plant still occupies a 58-acre site on Long Island Sound. (photo: Paul Searing via Wikipedia)

As the sun set on Veterans Day, 2012, tens of thousands of homes on New York’s Long Island prepared to spend another night in darkness. The lack of light was not part of any particular memorial or observance; instead, it was the infuriating and needless culmination of decades of mismanagement and malfeasance by a power company still struggling to pay for a now-moldering nuclear plant that never delivered a single usable kilowatt-hour to the region’s utility customers.

The enterprise in charge of all that darkness bears little resemblance to the sorts of power companies that provide electricity to most Americans–it is not a private energy conglomerate, nor is it really a state- or municipality-owned public utility–but the pain and frustration felt by Long Island residents should be familiar to many. And the tale of how an agency mandated by law to provide “a safer, more efficient, reliable and economical supply of electric energy” failed to deliver any of that is at its very least cautionary, and can likely serve as an object lesson for the entire country.

Very soon, the United States will be faced with tough choices about how to generate and deliver electrical power. Those choices are defined not just by demand but by a warming climate and an infrastructure already threatened by the changes that climate brings. When one choice, made by a private concern nearly 50 years ago, means weeks of power outages and billions of dollars in repair costs today, it suggests new decisions about America’s energy strategy should be handled with care.

A stormy history

Two weeks after Hurricane-cum-Superstorm Sandy battered the eastern coast of the United States, upwards of 76,000 customers of the Long Island Power Authority (LIPA) were still without power. That number is down markedly from the one million LIPA customers (91 percent of LIPA’s total customer base) that lost power as Sandy’s fierce winds, heavy rains and massive storm surge came up the Atlantic Coast on Monday, October 29, and down, too, from the over 300,000 still without service on election day, but at each step of the process, consumers and outside observers alike agreed it was too many waiting too long.

And paying too much. LIPA customers suffer some of the highest utility rates in the country, and yet, the power outages that came with last month’s storm–and a subsequent snowstorm nine days later–while disgraceful, were far from unexpected. The Long Island Power Authority and its corporate predecessor, the Long Island Lighting Company (LILCO), have a long track record of service failures and glacial disaster response times dating back to Hurricane Gloria, which hit the region in the autumn of 1985.

After Gloria, when many Long Island homes lost power for two weeks, and again after widespread outages resulted from 2011’s Hurricane Irene, the companies responsible for providing electricity to the residents of most of Nassau and Suffolk Counties, along with parts of the Borough of Queens in New York City, were told to make infrastructure improvements. In 2006, it was reported that LIPA had pledged $20 million annually in grid improvements. But the reality proved to be substantially less–around $12.5 million–while LIPA also cut back on transmission line inspections.

Amidst the current turmoil, New York Governor Andrew Cuomo has been highly critical of LIPA, calling for the “removal of management” for the “colossal misjudgments” that led to the utility’s failures. Cuomo made similar statements about LIPA and its private, for-profit subcontractor, National Grid, last year after Hurricane Irene. But as another day mercifully dawned on tens of thousands of homes still without electricity over two weeks after Sandy moved inland, the dysfunctional structure in charge of the dysfunctional infrastructure remains largely unchanged.

Which, it must be noted, is especially vexing because Governor Cuomo should not be powerless when it comes to making changes to the Long Island Power Authority.

It was Andrew’s father, Governor Mario Cuomo, who oversaw the creation of LIPA in 1985 to clean up the fiscal and physical failures of the Long Island Lighting Company. LILCO’s inability to quickly restore power to hundreds of thousands of customers after Hurricane Gloria was met with calls for change quite similar to the contemporary outrage. But it was LILCO’s crushing debt that perhaps exacerbated problems with post-Gloria cleanup and absolutely precipitated the government takeover.

The best-laid schemes

It was April 1965 when LILCO’s president announced plans for Long Island’s first commercial nuclear power facility to be built on Long Island Sound near the town of Brookhaven, already home to a complex of research reactors. The 540-megawatt General Electric boiling water reactor (similar in design to those that failed last year in Japan) was estimated to cost $65 million and come online in 1973.

The price of the Shoreham nuclear project quickly ballooned, first, as LILCO proposed additional reactors across Long Island–none of which were ever built–and then as the utility up-rated the original design to 820 megawatts. Further design changes, some mandated by the Nuclear Regulatory Commission, and construction delays pushed the price tag to $2 billion by the late 1970s.

Because of Shoreham’s proximity to the Brookhaven reactors, LILCO expected little public activism against the new plant, but local opposition steadily grew throughout the ’70s. The Sierra Club, the Audubon Society and environmentalist Barry Commoner all raised early objections. After a meltdown at Pennsylvania’s Three Mile Island nuclear plant in 1979, Shoreham saw 15,000 gather in protest outside its gates, an action that resulted in 600 arrests.

Three Mile Island led to new NRC rules requiring nuclear facilities to coordinate with civic authorities on emergency plans for accidents necessitating the evacuation of surrounding communities. LILCO was forced to confront the fact that, for Shoreham, the only routes for evacuation were already clogged highways that bottlenecked at a few bridges into Manhattan, 60 miles to the west.

In 1983, the government of Suffolk County, where Shoreham was located, determined that there was no valid plan for evacuating its population. New York Governor Mario Cuomo followed suit, ordering state regulators not to approve the LILCO-endorsed evacuation plan.

Still, as Shoreham’s reactor was finally completed in 1984, 11 years late and nearly 100 times over budget, the NRC granted LILCO permission for a low-power test.

But 1985’s Hurricane Gloria further eroded trust in LILCO, and the next year, the Chernobyl disaster galvanized opposition to a nuclear plant. As LILCO’s debts mounted, it became apparent that a new structure was needed to deliver dependable power to Long Island residents.

The power to act

The Long Island Power Act of 1985 created LIPA to assume LILCO’s assets, and a subsidiary of this municipal authority, also known as LIPA, acquired LILCO’s electric transmission and distribution systems in 1998. The radioactive but moribund Shoreham plant was purchased by the state for one dollar, and later decommissioned at a cost of $186 million. (Shoreham’s turbines were sent to Ohio’s Davis-Besse nuclear facility; its nuclear fuel was sent to the Limerick Nuclear Power Plant, with LILCO/LIPA paying Philadelphia Electric Company $50 million to take the fuel off its hands.)

The $6 billion Shoreham folly was passed on to consumers in the form of a three-percent surcharge on utility bills, to be charged for 30 years. Service on LILCO’s $7 billion in debt, half of which is a direct result of Shoreham, makes up 16 percent of every LIPA bill.

But LIPA itself is not really a utility company. Its roughly 100 employees are low on public utilities experience and, as the New York Times reports, high on political patronage. The majority of LILCO’s non-nuclear power plants were sold to a newly created company called KeySpan (itself a product of a merger of LILCO holdings with Brooklyn Union Gas), and maintenance of LIPA’s grid was subcontracted to KeySpan.

KeySpan was in turn purchased by British power company National Grid in 2006.

The situation is now further complicated, as National Grid lost out to Public Service Enterprise Group, New Jersey’s largest electricity provider, in a bid to continue its maintenance contract. PSEG takes over the upkeep of LIPA’s grid in 2014.

A commission on transmission

On Tuesday, Governor Cuomo the Younger announced formation of a Moreland Commission–a century-old New York State provision that allows for an investigative body with subpoena power–to explore ways of reforming or restructuring LIPA, including the possibility of integrating with the state’s New York Power Authority. And just hours later, LIPA’s COO and acting CEO, Michael Hervey, announced he would leave the utility at the end of the year.

But the problems look more systemic than one ouster or one commission, no matter how august, can correct. Andrew Cuomo’s inability to appoint new LIPA board members could owe as much to entrenched patronage practices as to political pre-positioning. State Republicans, overwhelmingly the beneficiaries of LIPA posts, are engaged in a behind-the-scenes standoff with the Democratic Governor over whom to name as a permanent LIPA director. Others see Cuomo as too willing to accept the political cover conveyed by not having his appointees take control of the LIPA board.

Still, no matter who runs LIPA, the elaborate public-private Russian-doll management structure makes accountability, not to mention real progress, hard to fathom. Perhaps it would be crazy to expect anything but regular disasters from a grid maintained by a foreign-owned, lame-duck, for-profit corporation under the theoretical direction of a leaderless board of political appointees, funded by some of the highest electricity rates in the country. And those rates, by the way, are not subject to the same public utilities commission oversight that would regulate a private utility, nor do they seem sensitive to any democratic checks and balances.

And all of this was created to bail out a utility destabilized by the money pit that is nuclear power.

The truth has consequences

As nighttime temperatures dip below freezing, it will be cold comfort, indeed, for those still without the power to light or heat their homes to learn that money they have personally contributed to help LIPA with its nuclear debt could have instead paid for the burying of vulnerable transmission lines and the storm-proofing of electrical transformers. But the unfortunate results of that trade-off hold a message for the entire country.

It was true (if not obvious) in 1965, it was true in 1985, and it is still true today: nuclear power, beyond being dirty and dangerous, is an absurdly expensive way to generate electricity. This is especially apropos now, in the wake of a superstorm thought to be a harbinger of things to come as the climate continues to warm.

In recent years, the nuclear industry has latched on to global warming as its latest raison d’être, claiming, quite inaccurately, that nuclear is a low-greenhouse gas answer to growing electrical needs. While the entire lifecycle of nuclear power is decidedly not climate friendly, it is perhaps equally important to consider that nuclear plants take too long and cost too much to build. The time, as well as the federal and consumer dollars, would be better spent on efficiency, conservation, and truly renewable, truly climate-neutral energy projects.

That is not a hypothetical; that is the lesson of LIPA, and the unfortunate reality–still–for far too many New York residents.

Superstorm Sandy Shows Nuclear Plants Who’s Boss

Oyster Creek Nuclear Power Station as seen in drier times. (photo via wikipedia)

Once there was an ocean liner; its builders said it was unsinkable. Nature had other ideas.

On Monday evening, as Hurricane Sandy was becoming Post-Tropical Cyclone Sandy, pushing record amounts of water on to Atlantic shores from the Carolinas to Connecticut, the Nuclear Regulatory Commission issued a statement. Oyster Creek, the nation’s oldest operating nuclear reactor, was under an Alert. . . and under a good deal of water.

An Alert is the second rung on the NRC’s four-point emergency classification scale. It indicates “events are in process or have occurred which involve an actual or potential substantial degradation in the level of safety of the plant.” (By way of reference, the fourth level–a General Emergency–indicates substantial core damage and a potential loss of containment.)
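For readers keeping score at home, the NRC’s four-point scale can be sketched as a simple ordered lookup. (This is an illustrative structure only, not any official NRC data format; the level names and order are as the commission defines them.)

```python
from enum import IntEnum

class NRCEmergencyClass(IntEnum):
    """NRC emergency classifications, least to most severe."""
    UNUSUAL_EVENT = 1        # events indicating potential degradation of plant safety
    ALERT = 2                # actual/potential substantial degradation of plant safety
    SITE_AREA_EMERGENCY = 3  # actual or likely major failures of plant safety functions
    GENERAL_EMERGENCY = 4    # substantial core damage, potential loss of containment

# Oyster Creek's declaration during Sandy was the second rung:
sandy_declaration = NRCEmergencyClass.ALERT
print(int(sandy_declaration))  # 2
```

The point of the ordering is the gap it implies: an Alert is serious, but there were still two rungs of escalation left above it.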

As reported earlier, Oyster Creek’s coolant intake structure was surrounded by floodwaters that arrived with Sandy. Oyster Creek’s 47-year-old design requires massive amounts of external water that must be actively pumped through the plant to keep it cool. Even when the reactor is offline, as was the case on Monday, water must circulate through the spent fuel pools to keep them from overheating, risking fire and airborne radioactive contamination.

With the reactor shut down, the facility is dependent on external power to keep water circulating. But even if the grid holds up, rising waters could trigger a troubling scenario:

The water level was more than six feet above normal. At seven feet, the plant would lose the ability to cool its spent fuel pool in the normal fashion, according to Neil Sheehan, a spokesman for the Nuclear Regulatory Commission.

The plant would probably have to switch to using fire hoses to pump in extra water to make up for evaporation, Mr. Sheehan said, because it could no longer pull water out of Barnegat Bay and circulate it through a heat exchanger, to cool the water in the pool.

If hoses desperately pouring water on endangered spent fuel pools remind you of Fukushima, it should. Oyster Creek is the same model of GE boiling water reactor that failed so catastrophically in Japan.

The NRC press release (PDF) made a point–echoed in most traditional media reports–of noting that Oyster Creek’s reactor was shut down, as if to indicate that this made the situation less urgent. While not having to scram a hot reactor is usually a plus, this fact does little to lessen the potential problem here. As nuclear engineer Arnie Gundersen told Democracy Now! before the Alert was declared:

[Oyster Creek is] in a refueling outage. That means that all the nuclear fuel is not in the nuclear reactor, but it’s over in the spent fuel pool. And in that condition, there’s no backup power for the spent fuel pools. So, if Oyster Creek were to lose its offsite power—and, frankly, that’s really likely—there would be no way to cool that nuclear fuel that’s in the fuel pool until they get the power reestablished. Nuclear fuel pools don’t have to be cooled by diesels per the old Nuclear Regulatory Commission regulations.

A station blackout (SBO) or a loss of coolant issue at Oyster Creek puts all of the nuclear fuel and high-level radioactive waste at risk. The plant being offline does not change that, though it does, in this case, increase the risk of an SBO.

But in the statement from the NRC, there was also another point they wanted to underscore (or one could even say “brag on”): “As of 9 p.m. EDT Monday, no plants had to shut down as a result of the storm.”

If only regulators had held on to that release just one more minute. . . .

SCRIBA, NY – On October 29 at 9 p.m., Nine Mile Point Unit 1 experienced an automatic reactor shutdown.

The shutdown was caused by an electrical grid disturbance that caused the unit’s output breakers to open. When the unit’s electrical output breakers open, there is nowhere to “push” or transmit the power and the unit is appropriately designed to shut down under these conditions.

“Our preliminary investigation identified a lighting pole in the Scriba switchyard that had fallen onto an electrical component. This is believed to have caused the grid disturbance. We continue to evaluate conditions in the switchyard,” said Jill Lyon, company spokesperson.

Nine Mile Point Nuclear Station consists of two GE boiling water reactors, one of which would be the oldest operating in the US were it not for Oyster Creek. They are located just outside Oswego, NY, on the shores of Lake Ontario. Just one week ago, Unit 1–the older reactor–declared an “unusual event” as the result of a fire in an electrical panel. Then, on Monday, the reactor scrammed because of a grid disturbance, likely caused by a lighting pole knocked over by Sandy’s high winds.

An hour and forty-five minutes later, and 250 miles southeast, another of the nation’s ancient reactors also scrammed because of an interruption in offsite power. Indian Point, the very old and very contentious nuclear facility less than an hour’s drive north of New York City, shut down because of “external grid issues.” And Superstorm Sandy has given Metropolitan New York’s grid a lot of issues.

While neither of these shutdowns is considered catastrophic, they are not as trivial as the plant operators and federal regulators would have you believe. First, emergency shutdowns–scrams–are not stress-free events, even for the most robust of reactors. As discussed here before, a scram is akin to slamming the brakes on a speeding locomotive. These scrams cause wear and tear that aging reactors can ill afford.

Second, scrams produce pressure that usually leads to the venting of some radioactive vapor. Operators and the NRC will tell you that these releases are well within “permissible” levels–what they can’t tell you is that “permissible” is the same as “safe,” because it isn’t.

If these plants were offline, or running at reduced power, the scrams would not have been as hard on the reactors or the environment. Hitting the brakes at 25 mph is easier on a car than slamming them while going 65. But the NRC does not have a policy of ordering shutdowns or reductions in capacity in advance of a massive storm. In fact, the NRC has no blanket protocol for these situations, period. By Monday morning, regulators agreed to dispatch extra inspectors to nuclear plants in harm’s way (and they gave them sat phones, too!), but they left it to private nuclear utility operators to decide what would be done in advance to prepare for the predicted natural disaster.

Operators and the Nuclear Regulatory Commission spokes-folks like to remind all who will listen (or, at least, all who will transcribe) that nuclear reactors are the proverbial house of bricks–a hurricane might huff and puff, but the reinforced concrete that makes up a typical containment building will not blow in. But that’s not the issue, and the NRC, at least, should know it.

Station blackouts (SBOs) and loss of coolant accidents (LOCAs) are what nuclear watchdogs were warning about in advance of Sandy, and they are exactly the problems that presented themselves in New York and New Jersey when the storm hit.

The engineers of the Titanic claimed that they had built the unsinkable ship, but human error, corners cut on construction, and a big chunk of ice cast such hubris asunder. Nuclear engineers, regulators and operators love to talk of four-inch thick walls and “defense-in-depth” backup systems, but the planet is literally littered with the fallout of their folly. Nuclear power systems are too complex and too dangerous for the best of times and the best laid plans. How are they supposed to survive the worst of times and no plans at all?

As World Honors Fukushima Victims, NRC Gives Them a One-Fingered Salute

Sign from Fukushima commemoration and anti-nuclear power rally, Union Square Park, NYC, 3/11/12. (photo: G. Levine)

Nearly a week after the first anniversary of the Japanese earthquake and tsunami that started the crisis at the Fukushima Daiichi nuclear power facility, I am still sorting through the dozens of reports, retrospectives and essays commemorating the event. The sheer volume of material has been a little exhausting, but that is, of course, compounded by the weight of the subject. From reviewing the horrors of a year ago–now even more horrific, thanks to many new revelations about the disaster–to contemplating what lies ahead for residents of Japan and, indeed, the world, it is hard just to read about it; living it–then, now, and in the future–is almost impossible for me to fathom.

But while living with the aftermath might be hard to imagine, that such a catastrophe could and likely would happen was not. In fact, if there is a theme (beyond the suffering of the Japanese people) that runs through all the Fukushima look-backs, it is the predictability–the mountains of evidence that said Japan’s nuclear plants were vulnerable, and if nothing were done, a disaster (like the one we have today) should be expected.

I touched on this last week in my own anniversary examination, and now I see that Dawn Stover, contributing editor at The Bulletin of the Atomic Scientists, draws a similar comparison:

Although many politicians have characterized 3/11 and 9/11 as bizarre, near-impossible events that could not have been foreseen, in both cases there were clear but unheeded warnings. . . . In the case of 3/11, the nuclear plant’s operators ignored scientific studies showing that the risks of a tsunami had been dramatically underestimated. Japan’s “safety culture,” which asserted that accidents were impossible, prevented regulators from taking a hard look at whether emergency safety systems would function properly in a tsunami-caused station blackout.

Stover goes on to explain many points where the two nightmare narratives run parallel. She notes that while governments often restrict information, claiming the need to guard against mass panic, it is actually the officials who are revealed to be in disarray. By contrast, in both cases, first responders behaved rationally and professionally, putting themselves at great risk in attempts to save others.

In both cases, communication–or, rather, the terrible lack of it–between sectors of government and between officials and responders exacerbated the crisis and put more lives at risk.

And with both 9/11 and 3/11, the public’s trust in government was shaken. And that crisis of trust was made worse by officials obscuring the facts and covering their tracks to save their own reputations.

But perhaps with that last point, I am reading my own observations into hers more than giving a straight retelling of Stover. Indeed, it is sad to note that Stover concludes her Fukushima think piece with a similar brand of CYA hogwash:

By focusing needed attention on threats to our existence, 3/11 and 9/11 have brought about some positive changes. The nuclear disaster in Japan has alerted nuclear regulators and operators around the world to the vulnerabilities of nuclear power plant cooling systems and will inevitably lead to better standards for safety and siting — and perhaps even lend a new urgency to the problem of spent fuel. Likewise, 9/11 resulted in new security measures and intelligence reforms that have thus far prevented another major terrorist attack in the United States and have created additional safeguards for nuclear materials.

When it comes to post-9/11 “security” and “intelligence reforms,” Stover is clearly out of her depth, and using the Bush-Cheney “no new attacks” fallacy frankly undermines the credibility of the entire essay. But I reference it here because it sets up a more important point.

If only Stover had taken a lesson from her own story. The Fukushima disaster has not alerted nuclear regulators and operators to vulnerabilities–as has been made clear here and in several of the post-Fukushima reports, those vulnerabilities were all well known, and known well in advance of 3/11/11.

But even if this were some great and grand revelation, some signal moment, some clarion call, what in the annals of nuclear power makes Stover or any other commentator think that call will be heard? “Inevitably lead to better standards”–inevitably? We’d all exit laughing if we weren’t running for our lives.

Look no further than the “coincidental” late-Friday, pre-anniversary news dump from the US Nuclear Regulatory Commission.

Late on March 9, 2012, two days before the earthquake and tsunami would be a year in the rear-view mirror, the NRC put on a big splashy show. . . uh, strike that. . . released a weirdly underplayed written announcement that the commission had approved a set of new rules drawing on lessons learned from the Fukushima crisis:

The Nuclear Regulatory Commission ordered major safety changes for U.S. nuclear power plants Friday. . . .

The orders require U.S. nuclear plants to install or improve venting systems to limit core damage in a serious accident and to install sophisticated equipment to monitor water levels in pools of spent nuclear fuel.

The plants also must improve protection of safety equipment installed after the 2001 terrorist attacks and make sure it can handle damage to multiple reactors at the same time.

Awwwrighty then, that sounds good, right? New rules, more safety, responsive to the Japanese disaster at last–but the timing instantly raised questions.

It didn’t take long to discover these were not the rules you were looking for.

First off, these are only some of the recommendations put before the commission by their Near-Term Task Force some ten months ago, and while better monitoring of water levels in spent fuel pools and plans to handle multiple disasters are good ideas, it has been noted that the focus on hardening the vents in Mark I and Mark II boiling water reactors actually misdiagnoses what really went wrong in two of the Fukushima Daiichi reactors.

Also, it should be noted this represents less than half the recommendations in last summer’s report. It also does not mandate a migration of spent fuel from pools to dry casks, an additional precaution not explicitly in the report, but stressed by NRC chief Gregory Jaczko, as well as many industry watchdogs.

But most important–and glaring–of all, the language under which these rules passed could mean that almost none of them will ever be enforced.

This is a little technical, so let me turn to one of the few members of Congress who actually spends time worrying about this, Rep. Ed Markey (D MA-7):

While I am encouraged that the Commission supports moving forward with three of the most straightforward and quickly-issued nuclear safety Orders recommended by their own expert staff, I am disappointed that several Commissioners once again have rejected the regulatory justification that they are necessary for the adequate protection of nuclear reactors in this country. . . .

After the terrorist attacks of September 11, 2001, the NRC determined that some nuclear security upgrades were required to be implemented for the “adequate protection” of all U.S. nuclear reactors. This meant that nuclear reactors would not be considered to be sufficiently secure without these new measures, and that an additional cost-benefit “backfit” analysis would not be required to justify their implementation. The “adequate protection” concept is derived from the Atomic Energy Act of 1954, and is reflected in NRC’s “Backfit Rule” which specifies that new regulations for existing nuclear reactors are not required to include this extra cost-benefit “backfit” analysis when the new regulations are “necessary to ensure that the facility provides adequate protection to the health and safety of the public.”

Both the NRC Fukushima Task Force and the NRC staff who reviewed the Task Force report concluded that the new post-Fukushima safety recommendations, including the Orders issued today, were also necessary for the “adequate protection” of existing U.S. nuclear power plants, and that additional cost-benefit analysis should not be required to justify their implementation.

While Chairman Jaczko’s vote re-affirmed his support of all the Near-Term Task Force’s recommendations, including the need to mandate them all on the basis that they are necessary for the adequate protection of all U.S. nuclear power plants, Commissioner Svinicki did not do so for any of the Orders, Commissioner Magwood did not do so for two of the three Orders, and Commissioners Apostolakis and Ostendorff rejected that basis for one of the three. As a result, the Order requiring technologies to monitor conditions in spent nuclear fuel pools during emergencies will proceed using a different regulatory basis. More importantly, the inability of the Commission to unanimously accept its own staff’s recommendations on these most straightforward safety measures presents an ominous signal of the manner in which the more complicated next sets of safety measures will be considered.

In other words, last Friday’s move was regulatory kabuki. By failing to use the strictest language for fuel pools, plant operators will be allowed to delay compliance for years, if not completely excuse themselves from it, based on the argument that the safety upgrade is too costly.

The other two rules are also on shaky ground, as it were. And even if by some miracle, the industry chose not to fight them, and the four uber-pro-nuclear commissioners didn’t throw up additional roadblocks, nothing is required of the nuclear facilities until December 31, 2016.

So, rather than it being a salutary moment, a tribute of sorts to the victims in Japan on the anniversary of their disaster, the announcement by the NRC stands more as an insult. It’s as if the US government is saying, “Sure, there are lessons to be learned here, but the profits of private energy conglomerates are more important than any citizen’s quaint notions of health and safety.”

As if any more examples were needed, these RINOs (rules in name only) demonstrate again that in America, as in Japan, the government is too close to the nuclear industry it is supposed to police.

And, for the bigger picture, as if any more examples were needed, be it before or after March 11, it really hasn’t been that hard to imagine the unimaginable. When an industry argues it has to forgo a margin of safety because of cost, there’s a good chance it was too dangerous and too expensive to begin with.

* * *

By way of contrast, take a look at some of the heartfelt expressions of commemoration and protest from New York’s Fukushima memorial and anti-nuclear rally, held last Sunday in Union Square Park.

Nuclear “Renaissance” Meets Economic Reality, But Who Gets the Bill?

Crystal River Nuclear Generating Plant, Unit 3, 80 miles north of Tampa, FL. (photo: U.S. NRC)

Crystal River is back in the news. Regular readers will recall when last we visited Progress Energy Florida’s (PEF) troubled nuclear reactor it was, shall we say, hooked on crack:

The Crystal River story is long and sordid. The containment building cracked first during its construction in 1976. That crack was in the dome, and was linked to a lack of steel reinforcement. Most nuclear plants use four layers of steel reinforcement; Crystal River used only one. The walls were built as shoddily as the dome.

The latest problems started when Crystal River needed to replace the steam generator inside the containment building. Rather than use an engineering firm like Bechtel or SGT–the companies that had done the previous 34 such replacements in the US–Progress decided it would save a few bucks and do the job itself.

Over the objections of on-site workers, Progress used a different method than the industry standard to cut into the containment building. . . and that’s when this new cracking began. It appears that every attempt since to repair the cracks has only led to new “delamination” (as the industry calls it).

Sara Barczak of CleanEnergy Footprints provides more detail on the last couple of years:

The Crystal River reactor has been plagued with problems ever since PEF self-managed a steam generation replacement project in September 2009. The replacement project was intended to last 3 months, until PEF informed the Commission that it had cracked the containment structure during the detensioning phase of the project. PEF subsequently announced that the CR3 reactor would be repaired and back in service by the 3rd quarter of 2010…then by the 4th quarter of 2010…and then by the first quarter of 2011. On March 15, 2011 PEF informed the Commission that it had cracked the reactor again during the retensioning process and subsequently told the Commission that it estimated repair costs of $1.3 billion and a return to service in 2014. Shortly thereafter, the Humpty Dumpty Crystal River reactor suffered yet another crack on July 26, 2011.

That July crack was later revealed to be 12 feet long and 4 feet wide–and here, at least when it came to notifying the Nuclear Regulatory Commission, “later” means much later. . . like four months later.

The issue, of course–as anyone with a lifetime crack habit will tell you–is that this all gets very expensive. Not only is there the cost of the repairs. . . and the repairs to the repairs. . . and the repairs to the repairs to the repairs. . . there is the cost of replacing the energy that was supposed to be supplied to PEF customers by the crippled reactor.

And then there is the cost of the new reactors. . . .

Wait, what?

Yes, based on the amazing success they have had managing Crystal River–and something called a “determination of need,” which was granted in 2008–Progress Energy holds out hope of someday building two of those trendy new AP1000 nuclear reactors at another Florida site, this one in Levy County.

And who is expected to pick up the tab? Who is on the hook, not just for repairs and replacement energy at Crystal River, but for PEF keeping its options open at Levy? Well, not surprisingly in “privatize profits, socialize risk” America, the plan was to stick Florida ratepayers with the bill (again Footprints provides the numbers):

Customer bills for instance, were expected to increase by $16/mo. in 2016; $26/mo. in 2017 and a whopping $49/mo. in 2020. Initially, Progress expected the proposed reactors to cost $4-6 billion each, coming online beginning in 2016. Just a few years later, the estimated costs have skyrocketed to over $22 billion and the online date, if the reactors ever even come online, has bumped back to 2021 and 2022. And the Office of Public Counsel believes that PEF may not intend to complete the reactors until 2027, if at all. The company has spent over $1 billion dollars on the Levy nuclear reactors and has yet to commit to build them. And the company is entitled to recover all its preconstruction and carrying costs from its customers before even a kilowatt of electricity is produced. In fact, even if the project is never completed PEF can recover all its construction costs from customers courtesy of the 2006 anti-consumer “early cost recovery” state law…essentially a nuclear tax scheme.

But now, as of this week, there is a new plan. . . stick Florida ratepayers with the bill:

The state Public Service Commission on Wednesday unanimously approved an agreement that will increase the power bills of Progress Energy Florida customers — who already pay among the highest rates in the state.

It is supposed to be a win for consumers.

The deal includes a $288 million “refund” of money customers were to pay to replace power from the crippled Crystal River nuclear plant, which has been offline since fall 2009 and might never return to service.

PSC staff concluded that customer rates still would increase. The average Progress customer’s bill on Jan. 1 is expected to increase $4.93 a month per 1,000 kilowatt hours of usage, from $123.19 to $128.12, subject to adjustments for fuel costs.
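The PSC staff’s quoted figures at least check out arithmetically; a quick sanity pass on the numbers above (using only the values as reported):

```python
# Sanity check of the quoted Progress Energy Florida rate figures
old_bill = 123.19   # $/month per 1,000 kWh before the increase
increase = 4.93     # approved monthly increase per 1,000 kWh
new_bill = round(old_bill + increase, 2)

print(new_bill)                              # 128.12, matching the PSC staff figure
print(round(increase / old_bill * 100, 1))   # roughly a 4.0 percent increase
```

That roughly four percent bump, remember, lands on customers who already pay among the highest rates in the state, before any fuel-cost adjustments.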

That’s a “win” for Floridians, it seems, because they are paying out something less for Progress Energy’s mistakes–at least in the near term. But even that caveat is subject to scrutiny:

While the agreement provides a replacement power cost refund over 3 years of $288 million to PEF customers (due to the CR3 outage) – it comes packaged with a base rate increase of $150 million and it precludes the parties from challenging up to $1.9 billion (yes, billion) fuel and replacement power costs from 2009 to 2016.

And that’s not all. Also in the agreement is a requirement that PEF start (yes, that is start) the latest repairs on Crystal River by the end of 2012; if they do not, Progress has to “refund” an additional $100 million to consumers. Missing, however, from the agreement is any new estimate (given the latest revelations, not to mention any post-Fukushima upgrades required) of the cost should PEF actually try to remedy all of Crystal River’s problems–and perhaps even more glaring, questions remain as to who will pay (and how much it will cost) should PEF decide to stop throwing good money after bad and decommission Crystal River reactor 3.

Also missing from the calculation is any determination of what PEF’s insurance will cover–Crystal River’s insurer stopped paying out in early 2011, and they have yet to decide if they will pay anything more. . . at all.

The agreement also fails to put an end to what is now becoming a regular part of the nuclear power finance scam–collecting public money for plants that will never be built. As the Southern Alliance for Clean Energy (SACE, which is affiliated with CleanEnergy Footprints) observed when it opted not to sign on to the Florida rate agreement:

PEF hasn’t committed to actually building the Levy Co. reactors. Having customers pay for the company just to maintain the “option” at a later date to build reactors is unfair to today’s customers – and runs counter to the Commission’s “intent to build” standard. The agreement allows the company to collect another $350 million from customers, presumably for pursuing their Nuclear Regulatory Commission license (without any prudency review) for reactors it hasn’t committed to build? In fact, the agreement contemplates that the company will cancel its engineering and procurement contracts as well, further demonstrating the unlikelihood of project completion.

If something sounds familiar here, it should. Southern Company has been using heaping helpings of Georgia ratepayer money to do all kinds of preliminary work on their Vogtle site, purportedly the future home of two new AP1000 reactors, just granted a combined construction and operating license by the NRC in January.

The big difference so far between Levy and Vogtle has been Southern’s ability to line up some financing for its Georgia construction–thanks to $8.33 billion in federal loan guarantees granted the project by the Obama administration almost two years in advance of the NRC approval.

PEF does not have this kind of guarantee, but that did not stop them from trading on the possibility:

Progress Energy Florida officials said Thursday that President Obama’s plan to offer federal loan guarantees to encourage investment in nuclear power plant construction will be a strong incentive to move forward with the company’s proposed Levy County plant.

The project, however, is facing delays of between 20 to 36 months due to economic and regulatory problems, making the plant’s future uncertain despite the company’s insistence the project isn’t cancelled.

“It (the loan guarantee program) will definitely play a role in that decision (whether to continue with the project). It is one of many, but a very important one,” said Progress Energy spokesman Mike Hughes.

That was in 2010, right after President Obama announced the new Department of Energy loan program–but two years later, PEF has not secured a federal guarantee, and so has not secured any financing. . . and thus has also not committed to ever building the Levy plant. But none of that has stopped Progress from collecting money from Florida consumers just to keep hope alive, as it were. And none of that has apparently stopped any of Florida’s public service commissioners from telling PEF that this practice is just jake with them.

Even with NRC approval and some federally guaranteed money, it is still not a sure bet that the Vogtle AP1000 reactors will ever come on line. PEF’s Levy project has no license and no loan guarantee.

The folks at Progress Energy are not stupid–at least not when it comes to short-term financial gain. They know how very slim their chances are of ever pushing even a single kilowatt-hour out of Levy County, but they also know where the profit is in the nuclear power game. It is not, quite obviously, in the construction of nuclear power plants–rife as that process is with lengthy delays and massive cost overruns–and it is not, some might be surprised to learn, so much in electricity generation, given that US plants now suffer “unusual events” that force one or more of them offline pretty much every week. Unusual events cost money–in parts and labor, and in time lost to repairs and inspections–and, as has been demonstrated at Crystal River, in replacement energy.

No, the real profits in the nuclear racket come from the ability to collect on services not rendered and a product not delivered, or at least not delivered reliably. Because the system backstops the financing of nuclear facilities while also allowing plant operators to pass both real and anticipated costs on to ratepayers, many Americans are poised to pay twice–once as taxpayers and again as ratepayers–for nuclear power plants that don’t produce power.

And it would be remiss to close without adding a few more points.

Much has been made of the failure of solar panel manufacturer Solyndra, which also received aid from the federal government in the form of loan guarantees. Solyndra ultimately got $527 million from the government; contrast that with the $8.33 billion guaranteed to Southern for Vogtle. Or, starker still, look at the entire alternative energy loan program, now projected to cost out at under $3 billion, and then look back to 2010, when Barack Obama pledged $54.5 billion to the DOE loan guarantee program designed to foster investment in nuclear power.

In addition, while the government will actually recoup most of the money lost on Solyndra when the factory and inventory are auctioned off, the “leftovers” from a failed nuclear plant–even the parts that are not contaminated with radioactivity–are much harder (if not impossible) to move.

The focus of this story has been on the costs–because the case of Progress Energy Florida is such a glaring example of how nuclear operators fleece America–but the fact that a company so focused on the bottom line, regardless of its effect on public safety, is still allowed to play with something as dangerous as a damaged nuclear power plant should not be overlooked. Alas, as was exposed last year, nuclear regulators and the nuclear industry seem to agree that safety should be addressed with an eye toward cost. So, while Crystal River is a scary mess, the reactor in question is actually offline right now. The same cannot be said, for example, about Ohio’s Davis-Besse plant, which has cracking problems of its own, but was allowed by the NRC to restart in January–over the vociferous objections of industry watchdogs, engineers, and Rep. Dennis Kucinich (D-OH).

And then there is Palisades, on the shores of Lake Michigan, where numerous events and releases of radioactivity over the last year led the Nuclear Regulatory Commission to downgrade the plant’s safety rating–but the NRC did not order the plant to shut down. Palisades is owned by Entergy Nuclear, which was recently cited for “buying reactors cheap, then running them into the ground.” In addition to Palisades, Entergy owns nine other plants: Arkansas Nuclear One; Nebraska’s Cooper Nuclear Station; Fitzpatrick, in upstate New York; Grand Gulf, in Mississippi; Indian Point, just north of New York City; Pilgrim, outside of Boston; River Bend and Waterford, both in Louisiana; and Vermont Yankee.

The case of Vermont Yankee is especially upsetting. Yankee is a GE boiling water reactor similar to the ones that failed so catastrophically at Fukushima–but the NRC voted to extend its operating license just days after the Tohoku quake. The state of Vermont had a better idea, declaring that the nuclear plant should shut down by March 21, 2012. In January, however, federal district court judge J. Garvan Murtha ruled that Entergy could ignore Vermont’s order and continue operating. The state is appealing the ruling, but in the meantime, Yankee continues to operate. . . and continues to leak tritium into the groundwater, and into the Connecticut River.

It is not clear who will be paying for any attempt to clean up the Vermont Yankee leak–though one can guess–nor is it clear what will happen to new nuclear waste produced after March 21, since the Vermont statehouse has forbidden any new waste storage on the site. Indeed, storing used nuclear fuel is a nationwide problem that poses real dangers in the near term, and will likely cost billions of public dollars in the long term.

And that’s the bottom line–the real bottom line–for the industry’s oft-ballyhooed “nuclear renaissance.” Plant operators and captured regulators can try to obscure the safety concerns with diversionary dustups and magical thinking, but economic realities, like facts, are stubborn. Without huge injections of public money, nuclear power simply cannot continue to function–and the public is in no mood for another multi-billion dollar government bailout.