Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, that the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the odds that were given for the 2011 tsunami–the flood at Oconee would trigger failures at all three of its reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters, as well as what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.
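For readers who want the model itself in one line: under the linear, no-threshold assumption, excess risk is simply proportional to cumulative dose, with no dose below which the added risk is zero. A minimal sketch in Python–the 5-percent-per-sievert coefficient is an illustrative, roughly ICRP-style assumption, not a figure presented at the symposium:

```python
# Minimal sketch of the linear, no-threshold (LNT) model described above.
# The risk coefficient is an assumption for illustration (roughly the
# nominal "5 percent per sievert" figure often quoted); the point is the
# shape of the model: risk scales with dose, and no dose is risk-free.

def lnt_excess_risk(dose_sv: float, risk_per_sv: float = 0.05) -> float:
    """Excess lifetime cancer risk under LNT: proportional to dose, no threshold."""
    return risk_per_sv * dose_sv

if __name__ == "__main__":
    for dose in (0.001, 0.01, 0.1, 1.0):  # doses in sieverts
        print(f"{dose:5.3f} Sv -> excess risk ~ {lnt_excess_risk(dose):.3%}")
```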

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously raised what it called acceptable exposure soon after the start of the Fukushima crisis, and, because the US tends to define its limits as exposure above the annual average background, it is feared that rising background radiation levels from the disaster will ratchet up what is considered “safe” in the United States as well. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown an increased incidence of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant and radiation biologist Ian Fairlie observed, people can criticize Yablokov’s advocacy, but the data is the data, and the Chernobyl book contains a great deal of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what has been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life (about eight days); radioactive isotopes of cesium, such as cesium-137 with its half-life of roughly 30 years, will have to be tracked much longer.

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this figure includes the first week of aerosol release and the first four months of ocean contamination), and the damaged Daiichi reactors continue to add roughly 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137, or, to put it another way, even the low-end estimate means Fukushima has already released more than 700 times as much radioactive cesium as the Hiroshima bomb.
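For those keeping score at home, the unit arithmetic and the half-life contrast above can be checked with a few lines of Python. This is only a back-of-envelope sketch: the half-lives are standard physical constants, and the release figures are simply the estimates quoted above restated in plain becquerels.

```python
# Back-of-envelope sketch of the units and half-lives discussed above.
# Half-lives (I-131 ~ 8 days, Cs-137 ~ 30 years) are standard physical
# constants; the release figures are the estimates quoted in the text,
# restated here on a common scale.

BQ_PER_TBQ = 1e12  # becquerels per terabecquerel
BQ_PER_PBQ = 1e15  # becquerels per petabecquerel (1 PBq = 1,000 TBq)

def remaining_fraction(days: float, half_life_days: float) -> float:
    """Fraction of a radionuclide still present after `days`."""
    return 0.5 ** (days / half_life_days)

if __name__ == "__main__":
    # Why iodine fades from the monitoring picture while cesium does not:
    print(f"I-131 left after one year:  {remaining_fraction(365, 8.02):.1e}")            # effectively zero
    print(f"Cs-137 left after one year: {remaining_fraction(365, 30.17 * 365.25):.1%}")  # ~98%

    # The quoted release estimates, expressed in plain becquerels:
    fukushima_low, fukushima_high = 64 * BQ_PER_PBQ, 114 * BQ_PER_PBQ
    chernobyl = 84 * BQ_PER_PBQ
    hiroshima = 89 * BQ_PER_TBQ
    print(f"Fukushima Cs-137, first months: {fukushima_low:.2e} to {fukushima_high:.2e} Bq")
    print(f"Chernobyl Cs-137:               {chernobyl:.2e} Bq")
    print(f"Hiroshima Cs-137:               {hiroshima:.2e} Bq "
          f"(low-end ratio ~{fukushima_low / hiroshima:,.0f}x)")
```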

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at the Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.
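One way to make sense of fish that “work cesium out of their flesh” is the notion of an effective half-life, which combines radioactive decay with biological elimination. A rough sketch follows; the 100-day biological half-life is an assumed, purely illustrative figure, since real values vary widely by species, size and water chemistry.

```python
# Rough illustration of the effective half-life of Cs-137 in fish tissue.
# The ~100-day biological half-life is an assumed, illustrative figure;
# the physical half-life of Cs-137 (~30 years) is a standard constant.

CS137_PHYSICAL_HALF_LIFE_DAYS = 30.17 * 365.25

def effective_half_life(biological_days: float,
                        physical_days: float = CS137_PHYSICAL_HALF_LIFE_DAYS) -> float:
    """Combine decay and elimination: 1/T_eff = 1/T_bio + 1/T_phys."""
    return 1.0 / (1.0 / biological_days + 1.0 / physical_days)

if __name__ == "__main__":
    t_eff = effective_half_life(100.0)
    print(f"Effective half-life in this hypothetical fish: {t_eff:.0f} days")
    # Because 100 days is so much shorter than 30 years, elimination, not
    # decay, dominates -- which is why a migratory fish can shed cesium
    # while contaminated sediment on the seabed cannot.
```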

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” and that the questions should start from the presumed necessity of atomic energy rather than from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent tracking the health and environmental effects of nuclear power, it amounts to a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without even a real interim storage program, let alone a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear does not exist in a vacuum, and neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask, and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

For Nuclear Power This Summer, It’s Too Darn Hot


You know that expression, “Hotter than July”? Well, this year, July was hotter than July. Depending on what part of the country you live in, it was upwards of three degrees hotter this July than the 20th-century average. Chicago, Denver, Detroit, Indianapolis and St. Louis are each “on a pace to shatter their all-time monthly heat records.” And “when the thermometer goes way up and the weather is sizzling hot,” as the Cole Porter song goes, demand for electricity goes way up, too.

During this peak period, wouldn’t it be great to know that you can depend on the expensive infrastructure your government and, frankly, you as ratepayers and taxpayers have been backstopping all these years? Yeah, that would be great. . . so would an energy source that was truly clean, safe, and too cheap to meter. Alas, to the surprise of no one (at least no one who watches this space), nuclear power, the origin of that catchy if not quite Porter-esque tripartite promise, cannot be counted on for any of it.

Take, for example, Braidwood, the nuclear facility that supplies much of Chicago with electricity:

It was so hot last week, a twin-unit nuclear plant in northeastern Illinois had to get special permission to continue operating after the temperature of the water in its cooling pond rose to 102 degrees.

It was the second such request from the plant, Braidwood, which opened 26 years ago. When it was new, the plant had permission to run as long as the temperature of its cooling water pond, a 2,500-acre lake in a former strip mine, remained below 98 degrees; in 2000 it got permission to raise the limit to 100 degrees.

The problem, said Craig Nesbit, a spokesman for Exelon, which owns the plant, is not only the hot days, but the hot nights. In normal weather, the water in the lake heats up during the day but cools down at night; lately, nighttime temperatures have been in the 90s, so the water does not cool.

But simply getting permission to suck in hotter water does not make the problem go away. When any thermoelectric plant (that includes nuclear, coal and some gas) has to use water warmer than design parameters, the cooling is less effective, and that loss of cooling potential means that plants need to dial down their output to keep from overheating and damaging core components. Exelon said it needed special dispensation to keep Braidwood running because of the increased demand for electricity during heat waves such as the one seen this July, but missing from the statement is that the very design of Braidwood means that it will run less efficiently and supply less power during hot weather.
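The physics behind that derating can be sketched with the ideal Carnot limit. Real plants operate well below this limit and derate according to site- and design-specific curves, so the steam temperature and the roughly two-point efficiency loss shown here are assumptions for illustration only.

```python
# Idealized illustration of why warmer cooling water means less output.
# Real plants run far below the Carnot limit and derate according to
# site-specific curves; the 285 C steam temperature (roughly that of a
# BWR) and the pond temperatures below are assumptions for illustration.

def carnot_efficiency(steam_temp_c: float, cooling_water_temp_c: float) -> float:
    """Ideal (Carnot) efficiency limit for the given hot- and cold-side temperatures."""
    t_hot = steam_temp_c + 273.15    # kelvin
    t_cold = cooling_water_temp_c + 273.15
    return 1.0 - t_cold / t_hot

if __name__ == "__main__":
    steam_c = 285.0
    for pond_c in (25.0, 38.0):  # roughly 77 F vs. the 100 F limit at Braidwood
        eff = carnot_efficiency(steam_c, pond_c)
        print(f"Cooling water at {pond_c:4.1f} C -> ideal efficiency limit {eff:.1%}")
```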

Also missing from Exelon’s rationale is that they failed to meet one of the basic criteria for their exception:

At the Union of Concerned Scientists, a group that is generally critical of nuclear power safety, David Lochbaum, a nuclear engineer, said the commission was supposed to grant exemptions from its rules if there was no increase or only a minor increase in risk, and if the situation could not have been foreseen.

The safety argument “is likely solid and justified,” he wrote in an e-mail, but “it is tough to argue (rationally) that warming water conditions are unforeseen.” That is a predictable consequence of global warming, he said.

Quite. Lochbaum cites two instances from the hot summer of 2010–New Jersey’s Hope Creek nuclear station and Limerick in Pennsylvania each had to reduce output due to intake water that was too warm. In fact, cooling water problems at US thermoelectric generators were widespread along the Mississippi River during the hot, dry summer of 1988.

And the problem is clearly growing. Two months ago, a study published in Nature Climate Change predicted continued warming and spreading drought conditions will significantly reduce thermoelectric output in coming decades:

Higher water temperatures and reduced river flows in Europe and the United States in recent years have resulted in reduced production, or temporary shutdown, of several thermoelectric power plants, resulting in increased electricity prices and raising concerns about future energy security in a changing climate.

. . . .

[The Nature Climate Change study] projects further disruption to supply, with a likely decrease in thermoelectric power generating capacity of between 6-19% in Europe and 4-16% in the United States for the period 2031-2060, due to lack of cooling-water. The likelihood of extreme (>90%) reductions in thermoelectric power generation will, on average, increase by a factor of three.

Compared to other water use sectors (e.g. industry, agriculture, domestic use), the thermoelectric power sector is one of the largest water users in the US (at 40%) and in Europe (43% of total surface water withdrawals). While much of this water is ‘recycled’ the power plants rely on consistent volumes of water, at a particular temperature, to prevent overheating of power plants. Reduced water availability and higher water temperatures – caused by increasing ambient air temperatures associated with climate change – are therefore significant issues for electricity supply.

That study is of course considering all thermoelectric sources, not just nuclear, but the decrease in efficiency applies across the board. And, when it comes to nuclear power, as global temperatures continue to rise and water levels in rivers and lakes continue to drop, an even more disconcerting threat emerges.

When a coal plant is forced to shut down because of a lack of cool intake water, it can, in short order, basically get turned off. With no coal burning, the cooling needs of the facility quickly downgrade to zero.

A nuclear reactor, however, is never really “off.”

When a boiling water reactor or pressurized water reactor (BWR and PWR respectively, the two types that make up the entire US commercial reactor fleet) is “shut down” (be it in an orderly fashion or an abrupt “scram”), control rods are inserted amongst the fuel rods inside the reactor. The control rods absorb free neutrons, decreasing the number of heavy atoms getting hit and split in the fuel rods. It is that split, that fission, that provides the energy that heats the water in the reactor and produces the steam that drives the electricity-generating turbines. Generally, the more collisions, the more heat generated. An increase in heat means more steam to spin a turbine; fewer reactions mean less heat, less steam and less electrical output. But it doesn’t mean no heat.

The water that drives the turbines also cools the fuel rods. It needs to circulate and somehow get cooled down when it is away from the reactor core. Even with the control rods inserted, the radioactive decay of fission products keeps generating heat, and that heat needs to be extracted from the reactor or all kinds of trouble ensues–from too-high pressure breaching containment to melting the cladding on fuel rods, fires, and hydrogen explosions. This is why the term LOCA–a loss-of-coolant accident–is a scary one to nuclear watchdogs (and, theoretically, to nuclear regulators, too).
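How much heat are we talking about? One common back-of-envelope fit is the Way-Wigner approximation for decay heat after shutdown. The formula and the 3,300-megawatt-thermal plant size below are illustrative assumptions, not figures from this article; the point is that megawatts of heat persist for days after the chain reaction stops.

```python
# Sketch of decay heat after shutdown using the Way-Wigner approximation.
# The coefficient and exponent are the standard textbook fit; the plant
# size and the assumed year at full power are illustrative assumptions.

def decay_heat_fraction(seconds_after_shutdown: float,
                        seconds_at_power: float = 3.15e7) -> float:
    """Decay heat as a fraction of pre-shutdown thermal power (Way-Wigner fit)."""
    t, t0 = seconds_after_shutdown, seconds_at_power
    return 0.0622 * (t ** -0.2 - (t + t0) ** -0.2)

if __name__ == "__main__":
    thermal_mw = 3300.0  # roughly a large US power reactor (assumption)
    for label, seconds in (("10 seconds", 10), ("1 hour", 3600),
                           ("1 day", 86400), ("1 week", 604800)):
        frac = decay_heat_fraction(seconds)
        print(f"{label:>10} after shutdown: ~{frac:.2%} of full power "
              f"= ~{frac * thermal_mw:.0f} MW of heat to remove")
```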

So, even when they are not producing electricity, nuclear reactors still need cooling. They still need a power source to make that cooling happen, and they still need a coolant, which, all across the United States and most of the rest of the world, means water.

Water that is increasingly growing too warm or too scarce. . . at least in the summer. . . you know, when it’s hot. . . and demand for electricity increases.

In fact, Braidwood is not the only US plant that has encountered problems this sultry season:

[A] spokeswoman for the Midwest Independent System Operator, which operates the regional grid, said that another plant had shut down because its water intake pipes were now above the water level of the body from which it draws its cooling water. Another is “partially curtailed.”

That spokeswoman can’t, it seems, tell us which plants she is talking about because that information “is considered competitive.” (Good to know that the Midwest Independent System Operator has its priorities straight. . . . Hey, that sounds like a hint! Anyone in the Midwest notice a nearby power plant curtailing operations?)

So, not isolated. . . and also not a surprise–not to the Nature Climate Change people this year, and not to the industry itself. . . 17 years ago. The Electric Power Research Institute (EPRI), a non-profit group of scientists and engineers funded by the good folks who generate electricity (a group that has a noticeable overlap with the folks who own nuclear plants), released a study in 1995 that specifically warned of the threat a warming climate posed to electrical generation. The EPRI study predicted that rising levels of atmospheric carbon dioxide would make power production less efficient and more expensive, while at the same time increasing demand.

And climate predictions have only grown more dire since then.

Add to that mix one more complicating factor: when the intake water is warmer, the water expelled by the plant is warmer, too. And there are environmental protections in many areas that limit how hot that “waste” water can be. There have been instances in the past where thermoelectric plants have had to curtail production because their exhaust water exceeded allowable temperatures.

And yet, despite a myriad of potential problems and two decades of climate warnings, it is sobering to note that none of the US reactors were built to account for any of this. . . because all American nuclear reactors predate these revelations. That is not to say nuclear operators haven’t had 20 years (give or take) to plan for these exigencies, but it is to say that, by-and-large, they haven’t. (Beyond, that is, as described above, simply lobbying for higher water temperature limits. That’s a behavior all too recognizable when it comes to nuclear operators and regulators–when nuclear plants can’t meet requirements, don’t upgrade the procedures or equipment, just “upgrade” the requirements.)

But, rather than using all this knowledge to motivate a transition away from nuclear power, rather than using the time to begin decommissioning these dinosaurs, nuclear operators have instead pushed for license extensions–an additional 20 years beyond the original 40-year design. And, to date, the Nuclear Regulatory Commission has yet to reject a single extension request.

And now the nuclear industry–with the full faith and credit of the federal government–is looking to double down on this self-imposed ignorance. The “Advanced Passive” AP1000 reactors approved earlier this year for Georgia’s Plant Vogtle (and on track for South Carolina, too) may be called “advanced,” but they are still PWRs and they still require a large reserve of cool, circulating water to keep them operating and nominally safe.

The government is offering $8.3 billion of financing for the Georgia reactors at rock-bottom rates, and with very little cash up front from the plant owners. There have already been numerous concerns about the safety of the AP1000 design and the economic viability of the venture; factor in the impact of climate change, and the new Vogtle reactors are pretty much the definition of “boondoggle”–a wasteful, pointless project that gives the appearance of value while in reality delivering none. It is practically designed to fail, leaving the government (read: taxpayers and ratepayers) holding the bag.

But as a too-darn-hot July ends, that’s the woo being pitched by the nuclear industry and its government sweethearts. Rather than invest the money in technologies that actually thrive during the long, hot days of summer, rather than invest in improved efficiency and conservation programs that would both create jobs and decrease electrical demand (and carbon emissions), rather than seizing the moment, making, as it were, hay while the sun shines, it seems the US will choose to bury its head in the sand and call it shade.

Nuclear power was already understood to be dirty, dangerous and absurdly expensive, even without the pressures of climate change. Far from being the answer to growing greenhouse gas emissions, the lifecycle of nuclear power–from mining and milling to transport and disposal–has turned out to be a significant contributor to the problem. And now, the global weirding brought on by that problem has made nuclear even more precarious–more perilous and more pricy–and so an even more pernicious bet.

According to the Kinsey Report, every average man you know would prefer to play his favorite sport when the temperature is low. But when the thermometer goes way up and the weather is sizzling hot, a gob for his squab, a marine for his beauty queen, a GI for his cutie-pie–and now it turns out–the hour for nuclear power is not.

‘Cause it’s too darn hot.
It’s too. Darn. Hot.