Blogiversary: Sweet 16

Welp… another year in the books–though I won’t say it was one for the books. Not for writing them or reading them… or for any metaphysical personal ledger, really. But it has really been a year.

Reviewing what I wrote one year ago, I still must get some thrill out of shouting into the void, and I still, for better or worse, agree with Ms. Parker, as some of the best days this past year were best not because they were spent writing, but because they finished with the sense of having written.

I’m looking forward to more days like that, even if you, dear reader, might not know it for a while. I am not looking forward to another year like this last one, with just too darn many realizations I didn’t want to realize, and fewer and fewer who want to realize it all with me….

Which reminds me of a great line from a less-than-great film. It’s delivered by all-time great Gloria Grahame to the equally great Robert Mitchum: “They always warn you about solitary drinking,” she laments, “but they never tell you how to get people to stay up and drink with you.”

In case there’s any question, I’m Gloria here.

I could go on (and, somewhere else, I’m sure I will), but as that quote affirms, we are here for the drinks! I might only just crack open a beer myself, but I will share the recipe I concocted for Christmas, when I wanted another film reference—the “flaming rum punch” Clarence orders before getting tossed out of the Pottersville Martini’s in It’s a Wonderful Life—but I didn’t have oranges or overproof rum. I drank this instead:

Cherry Poppin’ Punch

1 oz White Rum
1 oz Dark Rum
3/4 oz Calvados
1/2 oz Napoleon Brandy
1/2 oz Port
Dash Almond Extract
1/4 oz Simple Syrup
1/2 oz Sour Cherry Syrup
1/2 oz Tart Cherry Juice
1/2 oz Black Cherry Juice

Place all ingredients in a shaker with ice & shake vigorously. Decant into two chilled Nick & Nora glasses. Garnish each with a Luxardo cherry and a sprinkle of cinnamon powder.

OK, no more links, just more drinks! Feel free to google me, if you dare; follow me, if you please. Celebrate, if you can.

Bob’s your uncle!

Blogiversary 15

If there’s one thing you can be sure of, as a writer, it is, after having written, the feeling that what you just wrote pretty much completely sucks. It is the anticipation of that feeling that serves as one of the primary deterrents to starting anything in the first place. I mean, who wants to go through all that effort just to disappoint yourself?

But I woke up today realizing it was my “blogiversary” (for lack of a better word—seriously, is there a better word?), and a big one, at that. Fifteen years ago, after much urging, and more frustration, I joined the madding crowd, and, though I didn’t realize it at the time, began one of my bigger life transitions—from god-knows-what I was, to journalist.

For the last half of these last 15 years, I haven’t done much actual blogging, feeling that, as I have often commented, the web is now lousy with hot takes. But I still defend the blog as a valid journalistic format, and, as much as I want to spend most of my time deep diving into dark waters, I still often miss the low-stakes thrill of an almost daily shout into the void.

Since I last posted at this place (or this place, or this place), I have had gigs at slightly more trafficked websites and magazines, and I have written a thing or two or three or four hundred. (Maybe I’ll post a few links.) And now, taking stock after a year that is pretty much defined by taking stock, I have decided that there is maybe a second thing writers can be sure of: after the certain initial disappointment, there is the Dorothy Parker-ian sense of satisfaction, and the conclusion, five, or 10, or 15 years later, that maybe all of that work didn’t suck quite so completely after all.

I won’t toast to the next 15 years (like, crap, 15 years is a long fuckin’ time), but, in the grand tradition of my original blog (“a journal of politics, popular culture, and mixed drinks”), I will toast. And, in the tradition of previous blogiversaries, I will offer a cocktail recipe—this time one I crafted myself for drinking during those agonizing presidential debates. Cheers? Cheers!

The Black Helicopter

1 ½ oz Dark Rum
½ oz Rhum Agricole
1 ½ oz Amaro Nonino
1 ½ oz Aperol
1 oz lime juice

Mix all ingredients in a beaker, stir with ice, and strain into double old fashioned glasses each containing one large-format ice cube. Garnish with a lime twist.

Makes two.

Oh, yes—some of the last seven years… 

7 Years on, Sailors Exposed to Fukushima Radiation Seek Their Day in Court

Pilgrim’s Progress: Inside the American Nuclear-Waste Crisis

Key safety system not installed at site of deadly Amtrak derailment

Older safety technology could have prevented Amtrak tragedy

The Amtrak Tragedy Has Roots in the Swamp

Amtrak crash: state-of-the art safety gear was operational at time of fatal collision

States Are Using Taxpayer Money to Greenwash Dirty Nuclear Power 

Psychologists worked with CIA, Bush administration to justify torture

New ‘bomb train’ rules welcomed with a bang

The Brief Wondrous Life (and Long Dangerous Half-Life) of Strontium-90

At roughly 5:30 in the morning on July 16, 1945, an implosion-design plutonium device, codenamed “the gadget,” exploded over the Jornada del Muerto desert in south-central New Mexico with a force equivalent to about 20,000 tons of TNT. It was the world’s first test of an atomic bomb, and as witnesses at base camp some ten miles away would soon relay to US President Harry Truman, the results were “satisfactory” and exceeded expectations. Within weeks, the United States would use a uranium bomb of a different design on the Japanese city of Hiroshima, and three days after that, a plutonium device similar to the gadget was dropped on Nagasaki, about 200 miles to the southwest.

Though Hiroshima and Nagasaki are the only instances where atomic weapons were used against a wartime enemy, between 1945 and 1963, the world experienced hundreds upon hundreds of nuclear weapons tests, the great majority of which were above ground or in the sea–in other words, in the atmosphere. The US tested atom and hydrogen bombs at the Nevada Test Site and in the Pacific Ocean, on and around the Marshall Islands, in an area known as the Pacific Proving Grounds. After the Soviet Union developed its own atomic weapon in 1949, it carried out hundreds of similar explosions, primarily in Kazakhstan, and the UK performed more than 20 of its own atmospheric nuclear tests, mostly in Australia and the South Pacific, between 1952 and 1958.

Though military authorities and officials with the US Atomic Energy Commission initially downplayed the dispersal and dangers of fallout from these atmospheric tests, by the early 1950s, scientists in nuclear and non-nuclear countries alike began to raise concerns. Fallout from atmospheric tests was not confined to the blast radius or the region near the explosion; instead, the products of fission and un-fissioned nuclear residue were essentially vaporized by the heat and carried up into the stratosphere, where they swept across the globe and eventually returned to earth in precipitation. A host of radioactive isotopes contaminated land and surface water, entering the food chain through farms and dairies.

The tale of the teeth

In order to demonstrate that fallout was widespread and had worked its way into the population, a group of researchers, headed by Dr. Barry Commoner and Drs. Louise and Eric Reiss, founded the Baby Tooth Survey under the auspices of Washington University (where Commoner then taught) and the St. Louis Citizens’ Committee for Nuclear Information. The tooth survey sought to track strontium-90 (Sr-90), a radioactive isotope of the alkaline earth metal strontium, which occurs as a result–and only as a result–of nuclear fission. Sr-90 is structurally similar to calcium, and so, once in the body, works its way into bones and teeth.

While harvesting human bones was impractical, researchers realized that baby teeth should be readily available. Most strontium in baby teeth would transfer from mother to fetus during pregnancy, and so birth records would provide accurate data about where and when those teeth were formed. The tooth survey collected baby teeth, initially from the St. Louis area, eventually from around the globe, and analyzed them for strontium.

By the early ’60s, the program had collected well over a quarter-million teeth, and ultimately found that children born in St. Louis in 1963 had 50 times more Sr-90 in their teeth than children born in 1950. Armed with preliminary results from this survey and a petition signed by thousands of scientists worldwide, Dr. Commoner successfully lobbied President John F. Kennedy to negotiate and sign the Partial Test Ban Treaty, halting atmospheric nuclear tests by the US, UK and USSR. By the end of the decade, strontium-90 levels in newly collected baby teeth were substantially lower than the ’63 samples.

The initial survey, which ended in 1970, continues to have relevance today. Some 85,000 teeth not used in the original project were turned over to researchers at the Radiation and Public Health Project (RPHP) in 2001. The RPHP study, released in 2010, found that donors from the Baby Tooth Survey who had died of cancer before age 50 averaged over twice the Sr-90 in their samples compared with those who had lived past their 50th birthday.

But the perils of strontium-90–or, indeed, a host of radioactive isotopes that are strontium’s travel companions–did not cease with the ban on atmospheric nuclear tests. Many of the hazards of fallout could also be associated with the radiological pollution that is part-and-parcel of nuclear power generation. The controlled fission in a nuclear reactor produces all of the elements created in the uncontrolled fission of a nuclear explosion. This point was brought home by the RPHP work, when it found strontium-90 was 30- to 50-percent higher in baby teeth collected from children born in “nuclear counties” (PDF), the roughly 40 percent of US counties situated within 100 miles of a nuclear power plant or weapons lab.

Similar baby teeth research has been conducted over the last 30 years in Denmark, Japan and Germany, with measurably similar results. While Sr-90 levels continued to decrease in babies born through the mid-1970s, that trend flattened as the use of nuclear power spread worldwide. Of particular note, a study conducted by the German section of the International Physicians for the Prevention of Nuclear War (winner of the 1985 Nobel Peace Prize) found ten times more strontium-90 in the teeth of children born after the 1986 Chernobyl nuclear disaster than in samples from 1983.

While radioactive strontium itself can be linked to several diseases, including leukemia and bone cancers, Sr-90, as mentioned above, is but one of the most measurable of many dangerous isotopes released into the environment by the normal, everyday operation of nuclear reactors, even without the catastrophic discharges that come with accidents and meltdowns. Tritium, along with radioactive variants of iodine, cesium and xenon (to name just a few) can often be detected in elevated levels in areas around nuclear facilities.

Epidemiological studies have shown higher risks of breast and prostate cancers for those living in US nuclear counties. But while the Environmental Protection Agency collects sporadic data on the presence of radioactive isotopes such as Sr-90, the exact locations of the sampling sites are not part of the data made available to the general public. Further, while “unusual” venting of radioactive vapor or the dumping of contaminated water from a nuclear plant has to be reported to the Nuclear Regulatory Commission (and even then, it is the event that is reported, not the exact composition of the discharge), the radio-isotopes that are introduced into the environment by the typical operation of a reactor meet with far less scrutiny. In the absence of better EPA data and more stringent NRC oversight, studies like the Baby Tooth Survey and its contemporary brethren are central to the public understanding of the dangers posed by the nuclear power industry.

June and Sr-90: busting out all over

As if to underscore the point, strontium-90 served as the marker for troubling developments on both sides of the Pacific just this June.

In Japan, TEPCO–still the official operator of Fukushima Daiichi–revealed it had found Sr-90 in groundwater surrounding the crippled nuclear plant at “very high” levels. Between December 2012 and May 2013, levels of strontium-90 increased over 100-fold, to 1,000 becquerels per liter–33 times the Japanese limit for the radioactive isotope.

The samples were taken less than 100 feet from the coast. From that point, reports say, the water usually flows out to the Pacific Ocean.

Beyond the concerns raised by the effects of the strontium-90 (and the dangerously high amounts of tritium detected along with it) when the radioactive contamination enters the food chain, the rising levels of Sr-90 likely indicate other serious problems at Fukushima. Most obviously, there is now little doubt that TEPCO has failed to contain contaminated water leaking from the damaged reactor buildings–contrary to the narrative preferred by company officials.

But skyrocketing levels of strontium-90 could also suggest that the isotope is still being produced–that nuclear fission is still occurring in one or more of the damaged reactor cores. Or even, perhaps, outside the reactors, as the corium (the term for the molten, lava-like nuclear fuel after a meltdown) in as many as three units is believed to have melted through the steel reactor containment and possibly eroded the concrete floor, as well.

An ocean away, in Washington state, radiological waste, some of which dates back to the manufacture of those first atom bombs, sits in aging storage tanks at the Hanford Nuclear Reservation–and some of those tanks are leaking.

In truth, tanks at Hanford, considered by many the United States’ most contaminated nuclear site, have been leaking for some time. But the high-level radioactive waste in some of the old, single-wall tanks had been transferred to newer, double-walled storage, which was supposed to provide better containment. On June 20, however, the US Department of Energy reported that workers at Hanford detected radioactive contamination–specifically Sr-90–outside one of the double-walled tanks, possibly suggesting a breach. The predominant radionuclides in the 850,000-gallon tank are reported to be strontium-90 and cesium-137.

The tank, along with hundreds of others, sits about five miles from the Columbia River, the water source for much of the region. Once contamination leaks from the tanks, it mixes with groundwater, and, in time, should make its way to the river. “I view this as a crisis,” said Tom Carpenter, executive director of the watchdog group Hanford Challenge. “These tanks are not supposed to fail for 50 years.”

Destroyer of worlds

In a 1965 interview, J. Robert Oppenheimer, the Manhattan Project’s science director who was in charge of the Los Alamos facility that developed the first atomic bombs, looked back twenty years to that July New Mexico morning:

We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, “Now I am become Death, the destroyer of worlds.” I suppose we all thought that, one way or another.

“We knew the world would not be the same.” Oppenheimer was most likely speaking figuratively, but, as it turns out, he also reported a literal truth. Before July 16, 1945, there was no strontium-90 or cesium-137 in the atmosphere–it simply did not exist in nature. But ever since that first atomic explosion, these anthropogenic radioactive isotopes have been part of earth’s every turn.

Strontium-90–like cesium-137 and a catalog of other hazardous byproducts of nuclear fission–takes a long time to decay. The detritus of past detonations and other nuclear disasters will be quite literally with us–in our water and soil, in our tissue and bone–for generations. These radioactive isotopes have already been linked to significant suffering, disease and death. Their danger was acknowledged by the United States when JFK signed the 1963 Test Ban Treaty. Now would be a good time to acknowledge the perspicacity of that president, phase out today’s largest contributors of atmospheric Sr-90, nuclear reactors, and let the sun set on this toxic metal’s life.

 

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Two Years On, Fukushima Raises Many Questions, Provides One Clear Answer

Fukushima’s threats to health and the environment continue. (graphic: Surian Soosay via flickr)

You can’t say you have all the answers if you haven’t asked all the questions. So, at a conference on the medical and ecological consequences of the Fukushima nuclear disaster, held to commemorate the second anniversary of the earthquake and tsunami that struck northern Japan, there were lots of questions. Questions about what actually happened at Fukushima Daiichi in the first days after the quake, and how that differed from the official report; questions about what radionuclides were in the fallout and runoff, at what concentrations, and how far they have spread; and questions about what near- and long-term effects this disaster will have on people and the planet, and how we will measure and recognize those effects.

A distinguished list of epidemiologists, oncologists, nuclear engineers, former government officials, Fukushima survivors, anti-nuclear activists and public health advocates gathered at the invitation of The Helen Caldicott Foundation and Physicians for Social Responsibility to, if not answer all these questions, at least make sure they got asked. Over two long days, it was clear there is much still to be learned, but it was equally clear that we already know that the downsides of nuclear power are real, and what’s more, the risks are unnecessary. Relying on this dirty, dangerous and expensive technology is not mandatory–it’s a choice. And when cleaner, safer, and more affordable options are available, the one answer we already have is that nuclear is a choice we should stop making and a risk we should stop taking.

“No one died from the accident at Fukushima.” This refrain, as familiar as multiplication tables and sounding about as rote when recited by acolytes of atomic power, is a close mirror to versions used to downplay earlier nuclear disasters, like Chernobyl and Three Mile Island (as well as many less infamous events), and is somehow meant to be the discussion-ender, the very bottom-line of the bottom-line analysis that is used to grade global energy options. “No one died” equals “safe” or, at least, “safer.” Q.E.D.

But beyond the intentional blurring of the differences between an “accident” and the probable results of technical constraints and willful negligence, the argument (if this saw can be called such) cynically exploits the space between solid science and the simple sound bite.

“Do not confuse narrowly constructed research hypotheses with discussions of policy,” warned Steve Wing, Associate Professor of Epidemiology at the University of North Carolina’s Gillings School of Public Health. Good research is an exploration of good data, but, Wing contrasted, “Energy generation is a public decision made by politicians.”

Surprisingly unsurprising

A public decision, but not necessarily one made in the public interest. Energy policy could be informed by health and environmental studies, such as the ones discussed at the Fukushima symposium, but it is more likely the research is spun or ignored once policy is actually drafted by the politicians who, as Wing noted, often sport ties to the nuclear industry.

The link between politicians and the nuclear industry they are supposed to regulate came into clear focus in the wake of the March 11, 2011 Tohoku earthquake and tsunami–in Japan and the United States.

The boiling water reactors (BWRs) that failed so catastrophically at Fukushima Daiichi were designed and sold by General Electric in the 1960s; the general contractor on the project was Ebasco, a US engineering company that, back then, was still tied to GE. General Electric had bet heavily on nuclear and worked hand-in-hand with the US Atomic Energy Commission (AEC–the precursor to the NRC, the Nuclear Regulatory Commission) to promote civilian nuclear plants at home and abroad. According to nuclear engineer Arnie Gundersen, GE told US regulators in 1965 that without quick approval of multiple BWR projects, the giant energy conglomerate would go out of business.

It was under the guidance of GE and Ebasco that the rocky bluffs where Daiichi would be built were actually trimmed by 10 meters to bring the power plant closer to the sea, the water source for the reactors’ cooling systems–but it was under Japanese government supervision that serious and repeated warnings about the environmental and technological threats to Fukushima were ignored for another generation.

Failures at Daiichi were completely predictable, observed David Lochbaum, the director of the Nuclear Safety Project at the Union of Concerned Scientists, and numerous upgrades were recommended over the years by scientists and engineers. “The only surprising thing about Fukushima,” said Lochbaum, “is that no steps were taken.”

The surprise, it seems, should cross the Pacific. Twenty-two US plants mirror the design of Fukushima Daiichi, and many stand where they could be subject to earthquakes or tsunamis. Even without those seismic events, some US plants are still at risk of Fukushima-like catastrophic flooding. Prior to the start of the current Japanese crisis, the Nuclear Regulatory Commission learned that the Oconee Nuclear Plant in Seneca, South Carolina, was at risk of a major flood from a dam failure upstream. In the event of a dam breach–an event the NRC deems more likely than the 2011 tsunami was thought to be–the flood at Oconee would trigger failures at all three reactors. Beyond hiding its own report, the NRC has taken no action–not before Fukushima, not since.

The missing link

But it was the health consequences of nuclear power–both from high-profile disasters and from what is considered normal operation–that dominated the two days of presentations at the New York Academy of Medicine. Here, too, researchers and scientists attempted to pose questions that governments, the nuclear industry and its captured regulators prefer to ignore, or, perhaps more to the point, omit.

Dr. Hisako Sakiyama, a member of the Fukushima Nuclear Accident Independent Investigation Commission, has been studying the effects of low-dose radiation. Like others at the symposium, Dr. Sakiyama documented the linear, no-threshold risk model drawn from data across many nuclear incidents. In essence, there is no point at which it can be said, “Below this amount of radiation exposure, there is no risk.” And the greater the exposure, the greater the risk of health problems, be they cancers or non-cancer diseases.

Dr. Sakiyama contrasted this with the radiation exposure limits set by governments. Japan famously increased what it called acceptable exposure quite soon after the start of the Fukushima crisis, and, as global background radiation levels increase as a result of the disaster, it is feared this will ratchet up what is considered “safe” in the United States, as the US tends to discuss limits in terms of exposure beyond annual average background radiation. Both approaches lack credibility and expose an ugly truth. “Debate on low-dose radiation risk is not scientific,” explained Sakiyama, “but political.”

And the politics are posing health and security risks in Japan and the US.

Akio Matsumura, who spoke at the Fukushima conference in his role as founder of the Global Forum of Spiritual and Parliamentary Leaders for Human Survival, described a situation at the crippled Japanese nuclear plant that is much more perilous, even today, than leaders are willing to acknowledge. Beyond the precarious state of the spent fuel pool above reactor four, Matsumura also cited the continued melt-throughs of reactor cores (which could lead to a steam explosion), the high levels of radiation at reactors one and three (making any repairs impossible), and the unprotected pipes retrofitted to help cool reactors and spent fuel. “Probability of another disaster,” Matsumura warned, “is higher than you think.”

Matsumura lamented that investigations of both the technical failures and the health effects of the disaster are not well organized. “There is no longer a link between scientists and politicians,” said Matsumura, adding, “This link is essential.”

The Union of Concerned Scientists’ Lochbaum took it further. “We are losing the no-brainers with the NRC,” he said, implying that what should be accepted as basic regulatory responsibility is now subject to political debate. With government agencies staffed by industry insiders, “the deck is stacked against citizens.”

Both Lochbaum and Arnie Gundersen criticized the nuclear industry’s lack of compliance, even with pre-Fukushima safety requirements. And the industry’s resistance undermines nuclear’s claims of being competitive on price. “If you made nuclear power plants meet existing law,” said Gundersen, “they would have to shut because of cost.”

But without stronger safety rules and stricter enforcement, the cost is borne by people instead.

Determinate data, indeterminate risk

While the two-day symposium was filled with detailed discussions of chemical and epidemiologic data collected throughout the nuclear age–from Hiroshima through Fukushima–a cry for more and better information was a recurring theme. In a sort of wily corollary to “garbage in, garbage out,” experts bemoaned what seem like deliberate holes in the research.

Even the long-term tracking study of those exposed to the radiation and fallout in Japan after the atomic blasts at Hiroshima and Nagasaki–considered by many the gold-standard in radiation exposure research because of the large sample size and the long period of time over which data was collected–raises as many questions as it answers.

The Hiroshima-Nagasaki data was referenced heavily by Dr. David Brenner of the Center for Radiological Research, Columbia University College of Physicians and Surgeons. Dr. Brenner praised the study while using it to buttress his opinion that, while harm from any nuclear event is unfortunate, the Fukushima crisis will result in relatively few excess cancer deaths–something like 500 in Japan, and an extra 2,000 worldwide.

“There is an imbalance of individual risk versus overall anxiety,” said Brenner.

But Dr. Wing, the epidemiologist from the UNC School of Public Health, questioned the reliance on the atom bomb research, and the relatively rosy conclusions those like Dr. Brenner draw from it.

“The Hiroshima and Nagasaki study didn’t begin till five years after the bombs were dropped,” cautioned Wing. “Many people died before research even started.” The examination of cancer incidence in the survey, Wing continued, didn’t begin until 1958–it misses the first 13 years of data. Research on “Black Rain” survivors (those who lived through the heavy fallout after the Hiroshima and Nagasaki bombings) excludes important populations from the exposed group, despite those populations’ high excess mortality, thus driving down reported cancer rates for those counted.

The paucity of data is even more striking in the aftermath of the Three Mile Island accident, and examinations of populations around American nuclear power plants that haven’t experienced high-profile emergencies are even scarcer. “Studies like those done in Europe have never been done in the US,” said Wing with noticeable regret. Wing observed that a German study has shown increased incidences of childhood leukemia near operating nuclear plants.

There is relatively more data on populations exposed to radioactive contamination in the wake of the Chernobyl nuclear accident. Yet, even in this catastrophic case, the fact that the data has been collected and studied owes much to the persistence of Alexey Yablokov of the Russian Academy of Sciences. Yablokov has been examining Chernobyl outcomes since the early days of the crisis. His landmark collection of medical records and the scientific literature, Chernobyl: Consequences of the Catastrophe for People and the Environment, has its critics, who fault its strong warnings about the long-term dangers of radiation exposure, but it is that strident tone that Yablokov himself said was crucial to the evolution of global thinking about nuclear accidents.

Because of pressure from the scientific community and, as Yablokov stressed at the New York conference, pressure from the general public, as well, reaction to accidents since Chernobyl has evolved from “no immediate risk,” to small numbers who are endangered, to what is now called “indeterminate risk.”

Calling risk “indeterminate,” believe it or not, actually represents a victory for science, because it means more questions are asked–and asking more questions can lead to more and better answers.

Yablokov made it clear that it is difficult to estimate the real individual radiation dose–too much data is not collected early in a disaster, fallout patterns are patchy and different groups are exposed to different combinations of particles–but he drew strength from the volumes and variety of data he’s examined.

Indeed, as fellow conference participant, radiation biologist Ian Fairlie, observed, people can criticize Yablokov’s advocacy, but the data is the data, and in the Chernobyl book, there is lots of data.

Complex and consequential

Data presented at the Fukushima symposium also included much on what might have been–and continues to be–released by the failing nuclear plant in Japan, and how that contamination is already affecting populations on both sides of the Pacific.

Several of those present emphasized the need to better track releases of noble gases, such as xenon-133, from the earliest days of a nuclear accident–both because of the dangers these elements pose to the public and because gas releases can provide clues to what is unfolding inside a damaged reactor. But more is known about the high levels of radioactive iodine and cesium contamination that have resulted from the Fukushima crisis.

In the US, since the beginning of the disaster, five west coast states have measured elevated levels of iodine-131 in air, water and kelp samples, with the highest airborne concentrations detected from mid-March through the end of April 2011. Iodine concentrates in the thyroid, and, as noted by Joseph Mangano, director of the Radiation and Public Health Project, fetal thyroids are especially sensitive. In the 15 weeks after fallout from Fukushima crossed the Pacific, the western states reported a 28-percent increase in newborn (congenital) hypothyroidism (underactive thyroid), according to the Open Journal of Pediatrics. Mangano contrasted this with a three-percent drop in the rest of the country during the same period.

The most recent data from Fukushima prefecture shows over 44 percent of children examined there have thyroid abnormalities.

Of course, I-131 has a relatively short half-life; radioactive isotopes of cesium will have to be tracked much longer.

With four reactors and densely packed spent fuel pools involved, Fukushima Daiichi’s “inventory” (as it is called) of cesium-137 dwarfed Chernobyl’s at the time of its catastrophe. Consequently, and contrary to some of the spin out there, the Cs-137 emanating from the Fukushima plant is also out-pacing what happened in Ukraine.

Estimates put the release of Cs-137 in the first months of the Fukushima crisis at between 64 and 114 petabecquerels (this number includes the first week of aerosol release and the first four months of ocean contamination). And the damaged Daiichi reactors continue to add 240 million becquerels of radioactive cesium to the environment every single day. Chernobyl’s cesium-137 release is pegged at about 84 petabecquerels. (One petabecquerel equals 1,000,000,000,000,000 becquerels.) By way of comparison, the nuclear “device” dropped on Hiroshima released 89 terabecquerels (1,000 terabecquerels equal one petabecquerel) of Cs-137, or, to put it another way, Fukushima has already released more than 6,400 times as much radioactive cesium as the Hiroshima bomb.
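For anyone trying to keep the prefixes straight, the unit relationships already given in the parentheses above can be restated as a simple ladder (this is just a restatement of the quoted figures, not a new measurement), which also puts the Hiroshima figure in the same units as the Chernobyl and Fukushima estimates:

$$1\ \text{PBq} = 1{,}000\ \text{TBq} = 10^{15}\ \text{Bq}, \qquad \text{so}\quad 89\ \text{TBq} = 0.089\ \text{PBq}$$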

The effects of elevated levels of radioactive cesium are documented in several studies across post-Chernobyl Europe, but while the implications for public health are significant, they are also hard to contain in a sound bite. As medical genetics expert Wladimir Wertelecki explained during the conference, a number of cancers and other serious diseases emerged over the first decade after Chernobyl, but the cycles of farming, consuming, burning and then fertilizing with contaminated organic matter will produce illness and genetic abnormalities for many decades to come. Epidemiological studies are only descriptive, Wertelecki noted, but they can serve as a “foundation for cause and effect.” The issues ahead for all of those hoping to understand the Fukushima disaster and the repercussions of the continued use of nuclear power are, as Wertelecki pointed out, “Where you study and what you ask.”

One of the places that will need some of the most intensive study is the Pacific Ocean. Because Japan is an island, most of Fukushima’s fallout plume drifted out to sea. Perhaps more critically, millions of gallons of water have been pumped into and over the damaged reactors and spent fuel pools at Daiichi, and because of still-unplugged leaks, some of that water flows into the ocean every day. (And even if those leaks are plugged and the nuclear fuel is stabilized someday, mountain runoff from the area will continue to discharge radionuclides into the water.) Fukushima’s fisheries are closed and will remain so as far into the future as anyone can anticipate. Bottom feeders and freshwater fish exhibit the worst levels of cesium, but they are only part of the picture. Ken Buesseler, a marine scientist at the Woods Hole Oceanographic Institution, described a complex ecosystem of ocean currents, food chains and migratory fish, some of which carry contamination with them, some of which actually work cesium out of their flesh over time. The seabed and some beaches will see increases in radio-contamination. “You can’t keep just measuring fish,” warned Buesseler, implying that the entire Pacific Rim has involuntarily joined a multidimensional long-term radiation study.

For what it’s worth

Did anyone die as a result of the nuclear disaster that started at Fukushima Daiichi two years ago? Dr. Sakiyama, the Japanese investigator, told those assembled at the New York symposium that 60 patients died while being moved from hospitals inside the radiation evacuation zone–does that count? Joseph Mangano has reported on increases in infant deaths in the US following the arrival of Fukushima fallout–does that count? Will cancer deaths or future genetic abnormalities, be they at the low or high end of the estimates, count against this crisis?

It is hard to judge these answers when the question is so very flawed.

As discussed by many of the participants throughout the Fukushima conference, a country’s energy decisions are rooted in politics. Nuclear advocates would have you believe that their favorite fuel should be evaluated inside an extremely limited universe, that there is some level of nuclear-influenced harm that can be deemed “acceptable,” that questions stem from the necessity of atomic energy instead of from whether civilian nuclear power is necessary at all.

The nuclear industry would have you do a cost-benefit analysis, but they’d get to choose which costs and benefits you analyze.

While all this time has been and will continue to be spent on tracking the health and environmental effects of nuclear power, it isn’t a fraction of a fraction of the time that the world will be saddled with fission’s dangerous high-level radioactive trash (a problem without a real temporary storage program, forget a permanent disposal solution). And for all the money that has been and will continue to be spent compiling the health and environmental data, it is a mere pittance when compared with the government subsidies, liability waivers and loan guarantees lavished upon the owners and operators of nuclear plants.

Many individual details will continue to emerge, but a basic fact is already clear: nuclear power is not the world’s only energy option. Nor are the choices limited to just fossil and fissile fuels. Nuclear lobbyists would love to frame the debate–as would advocates for natural gas, oil or coal–as cold calculations made with old math. But that is not where the debate really resides.

If nuclear reactors were the only way to generate electricity, would 500 excess cancer deaths be acceptable? How about 5,000? How about 50,000? If nuclear’s projected mortality rate comes in under coal’s, does that make the deaths–or the high energy bills, for that matter–more palatable?

As the onetime head of the Tennessee Valley Authority, David Freeman, pointed out toward the end of the symposium, every investment in a new nuclear, gas or coal plant is a fresh 40-, 50-, or 60-year commitment to a dirty, dangerous and outdated technology. Every favor the government grants to nuclear power triggers an intense lobbying effort on behalf of coal or gas, asking for equal treatment. Money spent bailing out the past could be spent building a safer and more sustainable future.

Nuclear does not exist in a vacuum; so neither do its effects. There is much more to be learned about the medical and ecological consequences of the Fukushima nuclear disaster–but that knowledge should be used to minimize and mitigate the harm. These studies do not ask and are not meant to answer, “Is nuclear worth it?” When the world already has multiple alternatives–not just in renewable technologies, but also in conservation strategies and improvements in energy efficiency–the answer is already “No.”

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Fukushima Plus Two: Still the Beginning?

An IAEA inspector examines the remains of reactor 3 at Fukushima Daiichi (5/27/11) (photo: Greg Webb/IAEA imagebank)

I was up working in what were, in my part of the world, the early morning hours of March 11, 2011, when I heard over the radio that a massive earthquake had struck northeastern Japan. I turned on the TV just in time to see the earliest pictures of the tsunami that followed what became known as the Tohoku quake. The devastation was instantly apparent, and reports of high numbers of casualties seemed inevitable, but it wasn’t until a few hours later, when news of the destruction and loss of power at the Fukushima Daiichi nuclear plant hit the English-language airwaves, that I was gripped by a real sense of despair.

I was far from a nuclear expert at the time, but I knew enough to know that without intact cooling systems, or the power to keep them running, and with the added threat of a containment breach, some amount of environmental contamination was certain, and the potential for something truly terrifying was high.

What started as a weekend of watching newswires and live streams, virtually around the clock, and posting basic tech and health questions on email lists, expanded as the Fukushima crisis itself grew. Two years later, I have written tens of thousands of words, and read hundreds of thousands more. I have learned much, but I think I have only scratched the surface.

We all might be a little closer to understanding what happened in those first days and weeks after the earthquake, but what has happened since is still, sadly, a story much of which remains to be written. What the Daiichi plant workers really went through in those early days is just now coming to light, and the tales of intrigue and cover-up, of corruption and captured government, grow more complex and more sinister with each revelation. But what has happened to the environment, not just in the government-cordoned evacuation zone, but also throughout Japan, across the Pacific, and around the world, will likely prove the most chilling narrative.

Radiation levels in the quarantined parts of Japan are still far too high to permit any kind of human re-habitation, but exposure rates in areas far outside that radius are also well above what would have been considered acceptable before this disaster. And water, used to cool the molten cores and damaged spent fuel pools at Fukushima Daiichi, now dangerously radioactive itself, continues to leak into the ground and into the ocean at unprecedented rates.

Alas, the efforts of the Japanese government seem more focused on limiting information, quieting dissent, and sharing the pain (by shipping radioactive detritus across the country for disposal and incineration) than on stopping the leaks, cleaning up the contamination, and eliminating future risks. Though Japan originally pledged to quickly turn away from all nuclear power, a change of government has revived the incestuous relationship between the nuclear industry and the bureaucrats and politicians who are supposed to police it.

Across the Pacific, the United States has not exactly bathed itself in glory, either. Within days of the news of the explosions at Fukushima, President Barack Obama was the rare world leader who made a point of publicly assuring the nuclear industry that America’s commitment to this dangerous energy source was still strong. Just months after the start of the crisis, information on airborne radiation samples from across the country became less accessible to the public. And while industrialized countries like Germany work to phase out their nuclear plants, the US Nuclear Regulatory Commission actually approved construction of new reactors, and the federal government is poised to backstop the baldly risky investment to the tune of $8.3 billion.

But most disturbing of all, of course, will be the stories of the people. First, the stories we will hear from the families in Japan exposed to the toxic fallout in the immediate aftermath of the initial containment breaches and explosions–stories we are already hearing of children with severe thyroid abnormalities. But soon, and likely for decades to come, the stories of cancers and immune disorders, of birth defects and health challenges, elevated not only in northern Japan, but perhaps across the northern hemisphere.

Two years after the earthquake and tsunami, it is not the beginning of the end of this disaster, and, with apologies to Winston Churchill, it may not even be the end of the beginning. The spent fuel pool at Daiichi reactor 4 remains in precarious shape, and the state of the three molten cores is still shrouded in mystery. Radioactive dust and grime blanket large parts of Japan with no serious plan to remove it, and the waters off the northeast coast continue to absorb irradiated runoff, putting an entire aquatic food chain in peril.

On this second anniversary of the start of the Fukushima crisis, let us honor those who have suffered so far, review what we have learned to date, and endeavor to understand what is likely to come. But, most of all, let us renew our commitment to breaking with this dirty, dangerous and expensive technology.

* * *

To this end, on March 11 and 12, I will be attending a symposium at the New York Academy of Medicine, “The Medical and Ecological Consequences of the Fukushima Nuclear Accident,” sponsored by the Helen Caldicott Foundation and Physicians for Social Responsibility. If you are in the New York area, there is still space available; if you want to watch online, the organizers have promised a live stream. More information can be found on the Caldicott Foundation website.

The Long, Long Con: Seventy Years of Nuclear Fission; Thousands of Centuries of Nuclear Waste

From here to eternity: a small plaque on the campus of the University of Chicago commemorates the site of Fermi’s first atomic pile–and the start of the world’s nuclear waste problem. (Photo: Nathan Guy via Flickr)

On December 2, 1942, a small group of physicists under the direction of Enrico Fermi gathered on an old squash court beneath the stands of Stagg Field on the campus of the University of Chicago to make and witness history. Uranium pellets and graphite blocks had been stacked around cadmium-coated rods as part of an experiment crucial to the Manhattan Project–the program tasked with building an atom bomb for the Allied forces in WWII. The experiment was successful, and for 28 minutes, the scientists and dignitaries present observed the world’s first manmade, self-sustaining nuclear fission reaction. They called it an atomic pile–Chicago Pile 1 (CP-1), to be exact–but what Fermi and his team had actually done was build the world’s first nuclear reactor.

The Manhattan Project’s goal was a bomb, but soon after the end of the war, scientists, politicians, the military and private industry looked for ways to harness the power of the atom for civilian use, or, perhaps more to the point, for commercial profit. Fifteen years to the day after CP-1 achieved criticality, President Dwight Eisenhower threw a ceremonial switch to start the reactor at Shippingport, PA, which was billed as the first full-scale nuclear power plant built expressly for civilian electrical generation.

Shippingport was, in reality, little more than a submarine engine on blocks, but the nuclear industry and its acolytes will say that it was the beginning of billions of kilowatts of power, promoted (without a hint of irony) as “clean, safe, and too cheap to meter.” It was also, however, the beginning of what is now a, shall we say, weightier legacy: 72,000 tons of nuclear waste.

Atoms for peace, problems forever

News of Fermi’s initial success was communicated by physicist Arthur Compton to the head of the National Defense Research Committee, James Conant, with artistically coded flair:

Compton: The Italian navigator has landed in the New World.
Conant: How were the natives?
Compton: Very friendly.

But soon after that initial success, CP-1 was disassembled and reassembled a short drive away, in Red Gate Woods. The optimism of the physicists notwithstanding, it was thought best to continue the experiments with better radiation shielding–and slightly removed from the center of a heavily populated campus. The move was perhaps the first necessitated by the uneasy relationship between fissile material and the health and safety of those around it, but if it was understood as a broader cautionary tale, no one let that get in the way of “progress.”

A stamp of approval: the US Postal Service commemorated Eisenhower’s initiative in 1955.

By the time the Shippingport reactor went critical, North America already had a nuclear waste problem. The detritus from manufacturing atomic weapons was poisoning surrounding communities at several sites around the continent (not that most civilians knew it at the time). Meltdowns at Chalk River in Canada and the Experimental Breeder Reactor in Idaho had required fevered cleanups, the former of which included the help of a young Navy officer named Jimmy Carter. And the dangers of errant radioisotopes were increasing with the acceleration of above-ground atomic weapons testing. But as President Eisenhower extolled “Atoms for Peace,” and the US Atomic Energy Commission promoted civilian nuclear power at home and abroad, a plan to deal with the “spent fuel” (as used nuclear fuel rods are termed) and other highly radioactive leftovers was not part of the program (beyond, of course, extracting some of the plutonium produced by the fission reaction for bomb production, and the promise that the waste generated by US-built reactors overseas could at some point be marked “return to sender” and repatriated to the United States for disposal).

Attempts at what was called “reprocessing”–the re-refining of used uranium into new reactor fuel–quickly proved expensive, inefficient and dangerous, and created as much radioactive waste as it was meant to reuse. Reprocessing also provided an obvious avenue for nuclear weapons proliferation because of the resulting production of plutonium. The threat of proliferation (made flesh by India’s test of an atomic bomb in 1974) led President Jimmy Carter to cancel the US reprocessing program in 1977. Attempts by the Department of Energy to push mixed-oxide (MOX) fuel fabrication (combining uranium and plutonium) over the last dozen years have not produced any results, either, despite over $5 billion in government investments.

In fact, there was no official federal policy for the management of used but still highly radioactive nuclear fuel until passage of The Nuclear Waste Policy Act of 1982. And while that law acknowledged the problem of thousands of tons of spent fuel accumulating at US nuclear plants, it didn’t exactly solve it. Instead, the NWPA started a generation of political horse trading, with goals and standards defined more by market exigencies than by science, that leaves America today with what amounts to over five-dozen nominally temporary repositories for high-level radioactive waste–and no defined plan to change that situation anytime soon.

When you assume…

When a US Court of Appeals ruled in June that the Nuclear Regulatory Commission acted improperly when it failed to consider all the risks of storing spent radioactive fuel onsite at the nation’s nuclear power facilities, it made specific reference to the lack of any real answers to the generations-old question of waste storage:

[The Nuclear Regulatory Commission] apparently has no long-term plan other than hoping for a geologic repository. . . . If the government continues to fail in its quest to establish one, then SNF (spent nuclear fuel) will seemingly be stored on site at nuclear plants on a permanent basis. The Commission can and must assess the potential environmental effects of such a failure.

The court concluded the current situation–where spent fuel is stored across the country in what were supposed to be temporary configurations–“poses a dangerous long-term health and environmental risk.”

The decision also harshly criticized regulators for evaluating plant relicensing with the assumption that spent nuclear fuel would be moved to a central long-term waste repository.

A mountain of risks

The Nuclear Waste Policy Act set in motion an elaborate process that was supposed to give the US a number of possible waste sites, but, in the end, the only option seriously explored was the Yucca Mountain site in Nevada. After years of preliminary construction and tens of millions of dollars spent, Yucca was determined to be a bad choice for the waste:

[Yucca Mountain’s] volcanic formation is more porous and less isolated than originally believed–there is evidence that water can seep in, there are seismic concerns, worries about the possibility of new volcanic activity, and a disturbing proximity to underground aquifers. In addition, Yucca mountain has deep spiritual significance for the Shoshone and Paiute peoples.

Every major Nevada politician on both sides of the aisle has opposed the Yucca repository since its inception. Senate Majority Leader Harry Reid has worked most of his political life to block the facility. And with the previous NRC head, Gregory Jaczko (and now his replacement, Allison Macfarlane, as well), recommending against it, the Obama administration’s Department of Energy moved to end the project.

Even if it were an active option, Yucca Mountain would still be many years and maybe as much as $100 million away from completion. And yet, the nuclear industry (through recipients of its largesse in Congress) has challenged the administration to spend any remaining money in a desperate attempt to keep alive the fantasy of a solution to their waste crisis.

Such fevered dreams, however, do not qualify as an actual plan, according to the courts.

The judges also chastised the NRC for its generic assessment of spent fuel pools, currently packed well beyond their projected capacity at nuclear plants across the United States. Rather than examine each facility and the potential risks specific to its particular storage situation, the NRC had only evaluated the safety risks of onsite storage by looking at a composite of past events. The court ruled that the NRC must appraise each plant individually and account for potential future dangers. Those dangers include leaks, loss of coolant, and failures in the cooling systems, any of which might result in contamination of surrounding areas, overheating and melting of stored rods, and the potential of burning radioactive fuel–risks heightened by the large amounts of fuel now densely packed in the storage pools and underscored by the ongoing disaster at Japan’s Fukushima Daiichi plant.

Indeed, plants were neither designed nor built to house nuclear waste long-term. The design life of most reactors in the US was originally 40 years. Discussions of the spent fuel pools usually gave them a 60-year lifespan. That limit seemed to double almost magically as nuclear operators fought to postpone the expense of moving cooler fuel to dry casks and of the final decommissioning of retired reactors.

Everyone out of the pool

As disasters as far afield as the 2011 Tohoku earthquake and last October’s Hurricane Sandy have demonstrated, the storage of spent nuclear fuel in pools requires steady supplies of power and cool water. Any problem that prevents the active circulation of liquid through the spent fuel pools–be it a loss of electricity, the failure of a back-up pump, the clogging of a valve or a leak in the system–means the temperature in the pools will start to rise. If the cooling circuit is out long enough, the water in the pools will start to boil. If the water level dips (due to boiling or a leak) enough to expose hot fuel rods to the air, the metal cladding on the rods will start to burn, in turn heating the fuel even more, resulting in plumes of smoke carrying radioactive isotopes into the atmosphere.

And because these spent fuel pools are so full–containing as much as five times more fuel than they were originally designed to hold, and at densities that come close to those in reactor cores–they both heat stagnant water more quickly and reach volatile temperatures faster when exposed to air.

A spent fuel pool and dry casks. (Both photos courtesy of the US Nuclear Regulatory Commission)

After spent uranium has been in a pool for at least five years (considerably longer than most fuel is productive as an energy source inside the reactor), fuel rods are deemed cool enough to be moved to dry casks. Dry casks are sealed steel cylinders filled with spent fuel and inert gas, which are themselves encased in another layer of steel and concrete. These massive fuel “coffins” are then placed outside, spaced on concrete pads, so that air can circulate and continue to disperse heat.

While the long-term safety of dry casks is still in question, the fact that they require no active cooling system gives them an advantage, in the eyes of many experts, over pool storage. As if to highlight that difference, spent fuel pools at Fukushima Daiichi have posed some of the greatest challenges since the March 2011 earthquake and tsunami, whereas, to date, no quake or flood-related problems have been reported with any of Japan’s dry casks. The disparity was so obvious that the NRC’s own staff review actually added a proposal to the post-Fukushima taskforce report, recommending that US plants take more fuel out of spent fuel pools and move it to dry casks. (A year-and-a-half later, however, there is still no regulation–or even a draft–requiring such a move.)

But current dry cask storage poses its own set of problems. Moving fuel rods from pools to casks is slow and costly–about $1.5 million per cask, or roughly $7 billion to move all of the nation’s spent fuel (a process, it is estimated, that would take no less than five to ten years). That is expensive enough to have many nuclear plant operators lobbying overtime to avoid doing it.
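
For a sense of scale, here is a minimal back-of-envelope sketch of what those estimates imply. The per-cask and total cost figures are the ones cited above; the implied cask count and yearly pace are my own rough arithmetic, not industry or NRC numbers.

    # Rough arithmetic from the cost estimates cited above; the implied
    # cask count and yearly pace are illustrative, not official figures.
    cost_per_cask = 1.5e6   # dollars per loaded dry cask (estimate)
    total_cost = 7e9        # dollars to move all US spent fuel (estimate)
    implied_casks = total_cost / cost_per_cask
    for years in (5, 10):   # the estimated five-to-ten-year timeline
        pace = implied_casks / years
        print(f"~{implied_casks:,.0f} casks total; ~{pace:,.0f} per year over {years} years")

Even on the shorter timeline, that works out to several hundred cask loadings a year spread across dozens of sites.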

Further, though not as seemingly vulnerable as fuel pools, dry casks are not impervious to natural disaster. In 2011, a moderate earthquake centered about 20 miles from the North Anna, Virginia, nuclear plant caused most of its vertical dry casks–each weighing 115 tons–to shift, some by more than four inches. The facility’s horizontal casks didn’t move, but some showed what was termed “cosmetic damage.”

Dry casks at Michigan’s Palisades plant sit on a pad atop a sand dune just 100 yards from Lake Michigan. An earthquake there could plunge the casks into the water. And the casks at Palisades are so poorly designed and maintained that submersion could result in water contacting the fuel, contaminating the lake and possibly triggering a nuclear chain reaction.

And though each cask contains far less fissile material than one spent fuel pool, casks are still considered possible targets for terrorism. A TOW anti-tank missile would breach even the best dry cask (PDF), and with 25 percent of the nation’s spent fuel now stored in hundreds of casks across the country, all above ground, they present a rich target environment.

Confidence game

Two months after the Appeals Court found fault with the Nuclear Regulatory Commission’s imaginary waste mitigation scenario, the NRC announced it would suspend the issuing of new reactor operating licenses, license renewals and construction licenses until the agency could craft a new plan for dealing with the nation’s growing spent nuclear fuel crisis. In drafting its new nuclear “Waste Confidence Decision” (NWCD)–the methodology used to assess the hazards of nuclear waste storage–the Commission said it would evaluate all possible options for resolving the issue.

At first, the NRC said this could include both generic and site-specific actions (remember, the court criticized the NRC’s generic appraisals of pool safety), but as the prescribed process now progresses, it appears any new rule will be designed to give the agency, and so, the industry, as much wiggle room as possible. At a public hearing in November, and later at a pair of web conferences in early December, the regulator’s Waste Confidence Directorate (yes, that’s what it is called) outlined three scenarios (PDF) for any future rulemaking:

  • Storage until a repository becomes available at the middle of the century
  • Storage until a repository becomes available at the end of the century
  • Continued storage in the event a repository is not available

And while, given the current state of affairs, the first option seems optimistic, the fact that their best scenario now projects a repository to be ready by about 2050 is a story in itself.

When the Nuclear Waste Policy Act was signed into law by President Reagan early in 1983, it was expected the process it set in motion would present at least one (and preferably another) long-term repository by the late 1990s. But by the time the “Screw Nevada Bill” (as it is affectionately known in the Silver State) locked in Yucca Mountain as the only option for permanent nuclear waste storage, the projected opening was pushed back to 2007.

But Yucca encountered problems from its earliest days, so a mid-’90s revision of the timeline postponed the official start, this time to 2010. By 2006, the Department of Energy was pegging Yucca’s opening at 2017. And, when the NWPA was again revised in 2010–after Yucca was deemed a non-option–it conveniently avoided setting a date for the opening of a national long-term waste repository altogether.

It was that 2010 revision that was thrown out by the courts in June.

“Interim storage” and “likely reactors”

So, the waste panel now has three scenarios–but what are the underlying assumptions for those scenarios? Not, obviously, any particular site for a centralized, permanent home for the nation’s nuclear garbage–no new site has been chosen, and it can’t even be said there is an active process at work that will choose one.

There are the recommendations of a Blue Ribbon Commission (BRC) convened by the president after Yucca Mountain was off the table. Most notable there was a recommendation for interim waste storage, consolidated at a handful of locations across the country. But consolidated intermediate waste storage has its own difficulties, not the least of which is that no sites have yet been chosen for any such endeavor. (In fact, plans for the Skull Valley repository, thought to be the interim facility closest to approval, were abandoned by its sponsors just days before Christmas.)

Just-retired New Mexico Senator Jeff Bingaman (D), the last chair of the Energy and Natural Resources Committee, tried to turn the BRC recommendations into law. When he introduced his bill in August, however, he had to do so without any cosponsors. Hearings on the Nuclear Waste Administration Act of 2012 were held in September, but the gavel came down on the 112th Congress without any further action.

In spite of the underdeveloped state of intermediate storage, however, when the waste confidence panel was questioned on the possibility, interim waste repositories seemed to emerge, almost on the fly, as an integral part of any revised waste policy rule.

“Will any of your scenarios include interim centralized above-ground storage?” we asked during the last public session. Paul Michalak, who heads the Environmental Impact Statement branch of the Waste Confidence Directorate, first said temporary sites would be considered in the second and third options. Then, after a short pause, Mr. Michalak added (PDF p40), “First one, too. All right. Right. That’s right. So we’re considering an interim consolidated storage facility [in] all three scenarios.”

The lack of certainty on any site or sites is, however, not the only fuzzy part of the picture. As mentioned earlier, the amount of high-level radioactive waste currently on hand in the US and in need of a final resting place is upwards of 70,000 tons–already at the amount that was set as the initial limit for the Yucca Mountain repository. Given that there are still over 100 domestic commercial nuclear reactors more or less in operation, producing something like an additional 2,000 tons of spent fuel every year, what happens to the Waste Confidence Directorate’s scenarios as the years and waste pile up? How much waste were regulators projecting they would have to deal with–how much spent fuel would a waste confidence decision assume the system could confidently handle?

There was initial confusion on what amount of waste–and at what point in time–was informing the process. Pressed for clarification on the last day of hearings, NRC officials finally posited that it was assumed there would be 150,000 metric tons of spent fuel–all deriving from the commercial reactor fleet–by 2050. By the end of the century, the NRC expects to face a mountain of waste weighing 270,000 metric tons (PDF pp38-41) (though this figure was perplexingly termed both a “conservative number” and an “overestimate”).

How did the panel arrive at these numbers? Were they assuming all 104 (soon to be 103–Wisconsin’s Kewaunee Power Station will shut down by mid-2013 for reasons its owner, Dominion Resources, says are based “purely on economics”) commercial reactors nominally in operation would continue to function for that entire time frame–even though many are nearing the end of their design life and none are licensed to continue operation beyond the 2030s? Were they counting reactors like those at San Onofre, which have been offline for almost a year, and are not expected to restart anytime soon? Or the troubled reactors at Ft. Calhoun in Nebraska and Florida’s Crystal River? Neither facility has been functional in recent years, and both have many hurdles to overcome if they are ever to produce power again. Were they factoring in the projected AP1000 reactors in the early stages of construction in Georgia, or the ones slated for South Carolina? Did the NRC expect more or fewer reactors generating waste over the course of the next 88 years?

The response: waste estimates include all existing facilities, plus “likely reactors”–but the NRC cannot say exactly how many reactors that is (PDF p41).
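
Since the directorate would not show its work, here is a minimal sketch of what the cited figures imply, assuming the roughly 2,000 tons of new spent fuel per year mentioned earlier and a 2012 starting point; the baseline and start year are my own approximations, not NRC inputs.

    # Back-of-envelope projection using figures cited in this article;
    # the baseline, annual rate and start year are approximations.
    baseline_tons = 70_000   # high-level waste already on hand
    tons_per_year = 2_000    # added annually by the current fleet
    start_year = 2012
    for target_year in (2050, 2100):
        projected = baseline_tons + tons_per_year * (target_year - start_year)
        print(target_year, f"~{projected:,} tons")

That simple projection lands near the NRC’s 150,000-ton figure for mid-century (about 146,000 tons) but well short of the 270,000-ton end-of-century estimate (about 246,000 tons), which suggests the “likely reactors” assumption bakes in waste production well beyond the current fleet’s pace.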

Jamming it through

Answers like those from the Waste Confidence Directorate do not inspire (pardon the expression) confidence for a country looking at a mountain of eternally toxic waste. Just what would the waste confidence decision (and the environmental impact survey that should result from it) actually cover? What would it mandate, and what would change as a result?

How long is it? Does this NRC chart provide a justification for the narrow scope of the waste confidence process? (US Nuclear Regulatory PDF, p12)

In past relicensing hearings–where the public could comment on proposed license extensions on plants already reaching the end of their 40-year design life–objections based on the mounting waste problem and already packed spent fuel pools were waved off by the NRC, which referenced the waste confidence decision as the basis of its rationale. Yet, when discussing the parameters of the process for the latest, court-ordered revision to the NWCD, Dr. Keith McConnell, Director of the Waste Confidence Directorate, asserted that waste confidence was not connected to the site-specific licensed life of operations (PDF p42), but only to a period defined as “Post-Licensed Life Storage” (which appears, if a chart in the directorate’s presentation (PDF p12) is to be taken literally, to extend from 60 years after the initial creation of waste, to 120 years–at which point a phase labeled “Disposal” begins). Issues of spent fuel pool and dry cask safety are the concerns of a specific plant’s relicensing process, said regulators in the latest hearings.

“It’s like dealing with the Mad Hatter,” commented Kevin Kamps, a radioactive waste specialist for industry watchdog Beyond Nuclear. “Jam yesterday, jam tomorrow, but never jam today.”

The edict originated with the White Queen in Lewis Carroll’s Through the Looking Glass, but it is all too appropriate–and no less maddening–when trying to motivate meaningful change at the Nuclear Regulatory Commission. The NRC has used the nuclear waste confidence decision in licensing inquiries, but in these latest scoping hearings, we are told the NWCD does not apply to on-site waste storage. The Appeals Court criticized the lack of site-specificity in the waste storage rules, but the directorate says they are now only working on a generic guideline. The court disapproved of the NRC’s continued relicensing of nuclear facilities based on the assumption of a long-term geologic repository that in reality did not exist–and the NRC said it was suspending licensing pending a new rule–but now regulators say they don’t anticipate the denial or even the delay of any reactor license application while they await the new waste confidence decision (PDF pp49-50).

In fact, the NRC has continued the review process on pending applications, even though there is now no working NWCD–something deemed essential by the courts–against which to evaluate new licenses.

The period for public comment on the scope of the waste confidence decision ended January 2, and no more scoping hearings are planned. There will be other periods for civic involvement–during the environmental impact survey and rulemaking phases–but, with each step, the areas open to input diminish. And the current schedule has the entire process greatly accelerated over previous revisions.

On January 3, a coalition of 24 grassroots environmental groups filed documents with the Nuclear Regulatory Commission (PDF) protesting “the ‘hurry up’ two-year timeframe” for this assessment, noting the time allotted for environmental review falls far short of the 2019 estimate set by the NRC’s own technical staff. The coalition observed that two years was also not enough time to integrate post-Fukushima recommendations, and that the NRC was narrowing the scope of the decision–ignoring specific instructions from the Appeals Court–in order to accelerate the drafting of a new waste storage rule.

Speed might seem a valuable asset if the NRC were shepherding a Manhattan Project-style push for a solution to the ever-growing waste problem–the one that began with the original Manhattan Project–but that is not what is at work here. Instead, the NRC, under court order, is trying to set the rules for determining the risk of all that high-level radioactive waste if there is no new, feasible solution. The NRC is looking for a way to permit the continued operation of the US nuclear fleet–and so the continued manufacture of nuclear waste–without an answer to the bigger, pressing question.

A plan called HOSS

While there is much to debate about what a true permanent solution to the nuclear waste problem might look like, there is little question that the status quo is unacceptable. Spent fuel pools were never intended to be used as they are now used–re-racked and densely packed with over a generation of fuel assemblies. Both the short- and long-term safety and security of the pools have now been questioned by the courts and laid bare by reality. Pools at numerous US facilities have leaked radioactive waste (PDF) into rivers, groundwater and soil. Sudden “drain downs” have come perilously close to triggering major accidents in plants shockingly close to major population centers. Recent hurricanes have knocked out power to cooling systems and flooded backup generators, and last fall’s superstorm came within inches of overwhelming the coolant intake structure at Oyster Creek in New Jersey.

The crisis at Japan’s Fukushima Daiichi facility was so dangerous and remains dangerous to this day in part because of the large amounts of spent fuel stored in pools next to the reactors but outside of containment–a design identical to that of 35 US nuclear reactors. A number of these GE Mark 1 Boiling Water Reactors–such as Oyster Creek and Vermont Yankee–have more spent fuel packed into their individual pools than all the waste in Fukushima Daiichi Units 1, 2, 3, and 4 combined.

Dry casks, the obvious next “less-bad” option for high-level radioactive waste, were also not supposed to be a permanent panacea. The design requirements and manufacturing regulations of casks–especially the earliest generations–do not guarantee their reliability anywhere near the 100 to 300 years now being casually tossed around by NRC officials. Some of the nation’s older dry casks (which in this case means 15 to 25 years) have already shown seal failures and structural wear (PDF). Yet, the government does not require direct monitoring of casks for excessive heat or radioactive leaks–only periodic “walkthroughs.”

Add in the reluctance of plant operators to spend money on dry cask transfer and the lack of any workable plan to quickly remove radioactive fuel from failed casks, and dry cask storage also appears to fail to attain any court-ordered level of confidence.

Interim plans, such as regional consolidated above-ground storage, remain just that–plans. There are no sites selected and no designs for such a facility up for public scrutiny. What is readily apparent, though, is that the frequent transport of nuclear waste increases the risk of nuclear accidents. There does not, as of now, exist a transfer container that is wholly leak proof, accident proof, and impervious to terrorist attack. Moving high-level radioactive waste across the nation’s highways, rail lines and waterways has raised fears of “Mobile Chernobyls” and “Floating Fukushimas.”

More troubling still, if past (and present) is prologue, is the tendency of options designed as “interim” to morph into a default “permanent.” Can the nation afford to kick the can once more, spending tens (if not hundreds) of millions of dollars on a “solution” that will only add a collection of new challenges to the existing roster of problems? What will the interim facilities become beyond the next problem, the next site for costly mountains of poorly stored, dangerous waste?

Hardened: The more robust HOSS option as proposed in 2003. (From “Robust Storage of Spent Nuclear Fuel: A Neglected Issue of Homeland Security” courtesy of the Nuclear Information and Resource Service)

If there is an interim option favored by many nuclear experts, engineers and environmentalists (PDF), it is something called HOSS–Hardened On-Site Storage (PDF). HOSS is a version of dry cask storage that is designed and manufactured to last longer, is better protected against leaks and better shielded from potential attacks. Proposals (PDF) involve steel, concrete and earthen barriers incorporating proper ventilation and direct monitoring for heat and radiation.

But not all reactor sites are good candidates for HOSS. Some are too close to rivers that regularly flood, some are vulnerable to the rising seas and increasingly severe storms brought on by climate change, and others are close to active geologic fault zones. For facilities where hardened on-site storage would be an option, nuclear operators will no doubt fight the requirements because of the increased costs above and beyond the price of standard dry cask storage, which most plant owners already try to avoid or delay.

The first rule of holes

Mixed messages: A simple stone marker in Red Gate Woods, just outside Chicago, tries to both warn and reassure visitors to this public park. (Photo: Kevin Kamps, Beyond Nuclear. Used by permission.)

In a wooded park just outside Chicago sits a dirt mound, near a bike path, that contains parts of the still-highly radioactive remains of CP-1, the world’s first atomic pile. Seven decades after that nuclear fuel was first buried, many health experts would not recommend that spot (PDF) for a long, languorous picnic, nor would they recommend drinking from nearby water fountains. To look at it in terms Arthur Compton might favor, when it comes to the products of nuclear chain reactions, the natives are restless. . . and will remain so for millennia to come.

One can perhaps forgive those working in the pressure cooker of the Manhattan Project and in the middle of a world war for ignoring the forest for the trees–for not considering waste disposal while pursuing a self-sustaining nuclear chain reaction. Perhaps. But, as the burial mound in Red Gate Woods reminds us, ignoring a problem does not make it go away.

And if that small pile, or the mountains of spent fuel precariously stored around the nation are not enough of a prompt, the roughly $960 million that the federal government has had to pay private nuclear operators should be. For every year that the Department of Energy does not provide a permanent waste repository–or at least some option that takes the burden of storing spent nuclear fuel off the hands (and off the books) of power companies–the government is obligated to reimburse the industry for the costs of onsite waste storage. By 2020, it is estimated that $11 billion in public money will have been transferred into the pockets of private nuclear companies. By law, these payments cannot be drawn from the ratepayer-fed fund that is earmarked for a permanent geologic repository, and so, these liabilities must be paid out of the federal budget. Legal fees for defending the DoE against these claims will add another 20 to 30 percent to settlement costs.
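
A rough sketch of what those liability figures add up to; the $11 billion projection and the 20-to-30-percent legal-fee range are the numbers cited above, and combining them this way is my own illustration, not a government estimate.

    # Illustrative arithmetic on the liability figures cited above.
    settlements_by_2020 = 11e9          # projected payouts to operators, in dollars
    for legal_fee_rate in (0.20, 0.30): # added cost of defending the DoE
        total = settlements_by_2020 * (1 + legal_fee_rate)
        print(f"with legal fees at {legal_fee_rate:.0%}: ~${total / 1e9:.1f} billion")

Call it somewhere between $13 billion and $14 billion in public money, all spent storing waste the Department of Energy was supposed to have taken off the industry’s hands.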

The Federal Appeals Court, too, has sent a clear message that the buck needs to stop somewhere at some point–and that such a time and place should be both explicit and realistic. The nuclear waste confidence scoping process, however, is already giving the impression that the NRC’s next move will be generic and improbable.

The late, great Texas journalist Molly Ivins once remarked, “The first rule of holes” is “when you’re in one, stop digging.” For high-level radioactive waste, that hole is now a mountain, over 70 years in the making and over 70,000 tons high. If the history of the atomic age is not evidence enough, the implications of the waste confidence decision process put the current crisis in stark relief. There is, right now, no good option for dealing with the nuclear detritus currently on hand, and there is not even a plan to develop a good option in the near future. Without a way to safely store the mountain of waste already created, under what rationale can a responsible government permit the manufacture of so much more?

The federal government spends billions to perpetuate and protect the nuclear industry–and plans to spend billions more to expand the number of commercial reactors. Dozens of facilities already are past, or are fast approaching, the end of their design lives, but the Nuclear Regulatory Commission has yet to reject any request for an operating license extension–and it is poised to approve many more, nuclear waste confidence decision notwithstanding. Plant operators continue to balk at any additional regulations that would require better waste management.

The lesson of the first 70 years of fission is that we cannot endure more of the same. The government–from the DoE to the NRC–should reorient its priorities from creating more nuclear waste to safely and securely containing what is now here. Money slated for subsidizing current reactors and building new ones would be better spent on shuttering aging plants, designing better storage options for their waste, modernizing the electrical grid, and developing sustainable energy alternatives. (And reducing demand through conservation programs should always be part of the conversation.)

Enrico Fermi might not have foreseen (or cared about) the mountain of waste that began with his first atomic pile, but current scientists, regulators and elected officials have the benefit of hindsight. If the first rule of holes says stop digging, then the dictum here should be that when you’re trying to summit a mountain, you don’t keep shoveling more garbage on top.

A version of this story previously appeared on Truthout; no version may be reprinted without permission.

Blogiversary VII: The FISA-ing

The main ingredient, un-7-up’d. (photo: Craig Duncan via Wikipedia)

December 30, 2005–a day that will live in infamy.

Well, for me, anyway. (And maybe a few of you.) For it was on that day–seven years ago, today–that all of this began. . . all of this blogging thing.

No, not for everybody–this is about me!

A callow newbie to some, a grizzled vet to others, as of today, I have been in the blogging game for seven years, and so, in keeping with the tradition established by my original blog–guy2K: a journal of politics, popular culture, and mixed drinks–I give you a themed cocktail:

The Seven & Seven

Pour 2 oz. Seagram’s Seven Crown Whiskey into a highball glass; fill with ice.

Top with about 7 oz. 7-Up; stir lightly.

Garnish with a lemon slice.

I know, that seems pretty humdrum for this age of the artisanal cocktail. Whiskey and soda pop, how high school! But not only is it so seventh anniversary appropriate–so seven nice they seven’d it twice–it is also special for another reason. It is perhaps the most branded cocktail recipe I know. Sure, you could mix Jeremiah Weed and Bubble Up, and it might taste pretty darn similar, but what the hell are you going to call it? The Weed & Bubble?

That does not sound good.

And the Seven & Seven is not just a good drink for my seventh blogiversary (yes, I used to spell it “blogaversary,” but this seems to now be an actual thing, and the spelling with the “i” now seems to be the preferred one)–it being all seven-ie and all–the Seven & Seven’s specificity makes it a very appropriate cocktail for this last weekend of the year for a more, shall we say, “all inclusive” reason.

Friday morning, while some were distracted by Washington’s self-inflicted fiscal clusterfuck, and most were distracted by things that had nothing at all to do with Washington, the US Senate passed a five-year extension to the FISA Amendments Act (FAA)–the oversight-deficient warrantless surveillance program started by the George W. Bush administration. The vote was 73 to 23.

During my first few blogging years, I wrote a lot about domestic surveillance, FISA (+here), and the Bush administration’s wholesale disregard for the Fourth Amendment. In 2007 and 2008, I hit these topics often, especially as Congress moved forward with the Protect America Act (PAA) and the original FAA, which were supposed to be ways for freshly installed Democratic majorities to expose and rein in the Bush-Cheney surveillance state.

What actually happened–and you can watch this unfold across my old posts–was, of course, something else. Democrats, either out of expedience, cowardice or naked self-interest, wound up passing a law that went a long way toward legalizing what Bush’s bunch had only hoped to get away with in secret. And not to be missed in that pre-election-year and election-year dog and pony show were the positions of certain Senate members who aspired to replace George W. Bush. Most notably, those of the guy who grabbed that brass ring: Barack Obama.

Senator Barack Obama (D-IL) made much of his public opposition to much of what the Bush administration had been doing surveillance-wise in the name of fighting terror (more on that in a moment); he opposed retroactive immunity for telecoms (a key feature of the act) and said he would support a filibuster of the bill threatened by one of his early rivals for the presidential nomination, Chris Dodd (D-CT). But when proverbial push came to proverbial shove a few months later, the distinguished gentleman from Illinois not only failed to push any meaningful changes to the FAA that might have restored some of the rule of law, he actually helped break the filibuster of the bill. Then Obama voted in favor of the nasty new act.

Such an obvious stiff-arm was this to a group of Democrats Obama hoped to have in his camp come election time, that, if my memory serves, he pretty immediately came out with a video (was it on YouTube? I think so. Anyway, here’s the text of his statement) where he said he of course had many problems with the blah blah blah, but because the tools were essential in the fight against terror blah blah blah, he would vote for it now, then work to fix it should he become president. . . blah. . . blah. . . blah.

Fast forward five years–that’s the equivalent of FIVE blog years–and you find a President Barack Obama who has not worked to fix it, but has arguably worked very hard to expand the abilities of the national security apparatus to spy on United States citizens. And on Friday, with the help of the Democratic leadership of the Democrat-controlled Senate, the president worked to beat back each and every amendment to the FAA extension–many of which were proposed by Democrats–that would have tried to, if not fix the FISA, at least provide some access to some of the broad outlines of what has been done to Americans by the American government, in the hopes that this bit of sunshine could lead to better oversight.

Why the need for a “clean” bill RIGHT NOW!?!?!

Well, the GOP-controlled House passed this version way back in September, and what with the law sunsetting on December 31, there just wouldn’t be time now to send amendments to any kind of conference, and you can’t let the law expire cause then the terrorists. . . blah blah blah.

Of course, the terrorists nothing. If the Senate did let January 1 come without acting on this gun-to-their-head extension, absolutely nothing going down on behalf of the GWOT would change. All the permissions on all the ongoing investigations would remain good and open for many months to come. But that leads to an even more important point, one again almost completely–no, let’s, this time, just say completely–absent from the coverage of the FISA Amendments Act.

I do not right now have time to go back and refresh everyone’s memory on the history of FISA (I’ve got anniversary cocktails to drink, after all), but let’s just say that even the original 1978 law–though drafted in reaction to illegal Nixon-era domestic surveillance–still had plenty of room for national security intel interests to get legal cover for some types of domestic spying. But the law did impose some limits and some oversight.

But all that changed on. . . .

I know what you’re thinking. You’re thinking I’m going to say “All that changed on 9/11,” aren’t you? I don’t blame you. Back during the PAA/FAA battles five years ago, most reports spoke of the expansion of domestic surveillance in response to the terror attacks on September 11, 2001. Again today–and I’m not going to dig up any links, but throw a rock at any major newspaper, and if you throw it hard enough, hard enough, say, to get back to page A23, because that’s about as close to the front page as this story will get–today you will read that all of this last-minute congressional kabuki all stems from Bush’s original violations of the old FISA law post-9/11/01.

And that would be a big, fat lie. Now, maybe some of today’s reporters weren’t on this story in 2007 or 2008, and they just took for granted the sort of “war on terror” shorthand that comes affixed to this topic, so I guess that would just make it a big, fat, lazy lie–but this idea is just as untrue (and just as easy to uncover as untrue) today as it was way back when I wrote about it the first time.

So, let’s all get this straight for the record: The Bush administration’s expansion of domestic spying was not a response to the terror attacks of 9/11. The Bush admin’s expansion of domestic spying pre-dates those attacks. Bush’s expansion of domestic spying starts as early as February 2001–just weeks after W was inaugurated. Here’s what I wrote on October 19, 2007:

Really, enough with this fairy tale already. If the events of last week involving the statements of former Qwest CEO Joe Nacchio have taught you nothing, perhaps you should go back and read some of the press from early 2006, or, perhaps, James Risen’s book. But no matter which of these sources you read, you should come away with the same understanding: The Bush Administration began collecting phone and e-mail data without a warrant and/or began eavesdropping on US citizens inside the country without a warrant before the attacks of September 11, 2001. Surveillance might—might—have increased after 9/11, but it is now increasingly clear there was plenty going on from the earliest days of Bush-Cheney rule.

You can follow the links in that block quote for more detail.

What do those seven (there’s that number again) months tell you? They tell you that the ramp-up of illegal domestic surveillance was not about uncovering the next al-Qaeda cell (remember how hard it was for Clinton administration holdovers to get any of Bush’s team to care about this pre-9/11); it was about something else. What was that exactly? There are many theories–repression of dissent, intimidation of unfriendly media, opposition research, maybe all of the above–but the point to make is that when you heard folks back then insisting we needed the FAA to protect us from the terra-ists, you needed to call bullshit.

And the same applies today. Sure, there are still some wide-eyed Washington watchers among us who will say, “that was then, but this is now, and now is post-9/11, and now we have a guy who is not Bush in the White House, so now it is about the terra-ists,” but you need to call bullshit on that, too. First and foremost because no president should be above the Fourth Amendment, but also, and just as importantly, because if the warrantless domestic surveillance was meant to keep us safe from terrorism, but the surveillance was expanded long before 9/11/01, and the attacks of 9/11 happened anyway, then this breach of our Constitution did not do what its advocates say it is supposed to do.

Here is where you say, “I’ll drink to that!”

But why drink a Seven & Seven, the world’s most specific cocktail? Because specificity is what it’s all about–or, more accurately, what this FAA extension is not all about.

The whole point of the Fourth Amendment is that Americans should not be subjected to un-checked government power. That if the government wants to search or surveil a US citizen in the US, it has to pick a specific person and a specific crime. The kinds of blanket permissions and basket warrants permitted under the FAA are the very kinds of things the Fourth Amendment is supposed to prevent.

Got that? No archiving of domestic data, no Total Information Awareness, no trawling for keywords in private emails, no “dossiers” on hundreds of millions of Americans who have done nothing except trip one of the NSA’s algorithmic flags–because that sort of non-specific surveillance doesn’t pass Constitutional muster, and it doesn’t protect America from enemies foreign or domestic.

So, I started this tome with a joking toast to the infamous birth of my blog, but, in all seriousness, with the president expected to sign the new FISA Amendment Act today, the day that should live in infamy is December 30, 2012. And that’s not just about me; that’s about everybody.

And I do mean everybody.

Yule Fuel

Yes, it’s time for that metaphor again. If you grew up near a TV during the 1960s or ’70s, you probably remember the ever-burning Yule Log that took the place of programming for a large portion of Christmas Day. The fire burned, it seemed, perpetually, never appearing to consume the log, never dimming, and never, as best the kid who stared at the television could tell, ever repeating.

Now, if you have been watching this space about as intently as I once stared at that video hearth, perhaps you are thinking that this eternal flame is about to reveal itself as a stand-in for nuclear power. You know, the theoretically bottomless, seemingly self-sustaining, present yet distant, ethereal energy source that’s clean, safe and too cheap to meter. Behold: a source of warmth and light that lasts forever!

Yeah. . . you wish! Or, at least you’d wish if you were a part of the nuclear industry or one of its purchased proxies.

But wishing does not make it so. A quick look at the US commercial reactor fleet proves there is nothing perpetual or predictable about this supposedly dependable power source.

Both reactors at San Onofre have been offline for almost a year, after a radioactive leak revealed dangerously worn heat transfer tubes. Nebraska’s Fort Calhoun plant has been shut down since April of 2011, initially because of flooding from the Missouri River, but now because of a long list of safety issues. And it has been 39 months since Florida’s Crystal River reactor has generated even a single kilowatt-hour, thanks to a disastrously botched repair to its containment that has still not been put right.

October’s Hurricane Sandy triggered scrams at two eastern nuclear plants, and induced an alert at New Jersey’s Oyster Creek reactor because flooding threatened spent fuel storage. Other damage discovered at Oyster Creek after the storm kept the facility offline for five weeks more.

Another plant that scrammed during Sandy, New York’s Nine Mile Point, is offline again (for the third, or is it the fourth time since the superstorm?), this time because of a containment leak. (Yes, a containment leak!)

Other plants that have seen substantial, unplanned interruptions in power generation this year include Indian Point, Davis-Besse, Diablo Canyon, Hope Creek, Calvert Cliffs, Byron, St. Lucie, Pilgrim, Millstone, Susquehanna, Prairie Island, Palisades. . . honestly, the list can–and does–go on and on. . . and on. Atom-heads love to excuse the mammoth capital investments and decades-long lead times needed to get nuclear power plants online by saying, “yeah, but once up, they are like, 24/7/365. . . dude!”

Except, of course, as 2012–or any other year–proves, they are very, very far from anything like that. . . dude.

So, no, that forever-flame on the YuleTube is not a good metaphor for nuclear power. It is, however, a pretty good reminder of the still going, still growing problem of nuclear waste.

December saw the 70th anniversary of the first self-sustaining nuclear chain reaction, and the 30th anniversary of the first Nuclear Waste Policy Act. If the 40-year difference in those anniversaries strikes you as a bit long, well, you don’t know the half of it. (In the coming weeks, I hope to say more about this.) At present, the United States nuclear power establishment is straining to cope with a mountain of high-level radioactive waste now exceeding 70,000 tons. And with each year, the country will add approximately 2,000 more tons to the pile.

And all of this waste, sitting in spent fuel pools and above-ground dry casks–supposedly temporary storage–at nuclear facilities across the US, will remain extremely toxic for generations. . . for thousands and thousands of generations.

There is still no viable plan to dispose of any of this waste, but the nation’s creaky reactor fleet continues to make it. And with each refueling, another load is shoehorned into overcrowded onsite storage, increasing the problem, and increasing the danger of spent fuel accidents, including, believe it or not, a type of fire that cannot be extinguished with water.

So, if you want to stare at a burning log and think about something, think about how that log is not so unlike a nuclear fuel assembly exposed to air for a day or two. . . or think of how, even if it is not actually burning, the high levels of radiation tossed out from those uranium “logs” will create heat and headaches for hundreds of thousands of yuletides to come.

Oh, and, if you are still staring at the Yule log on a cathode ray tube television, don’t sit too close. . . because, you know, radiation.

Merry Christmas.

Writing on Shooting: Over Five Years Later, What Has Changed?

(photo: An Nguyen Photography via Flickr)

It has been over five-and-a-half years since a mass shooting on the campus of Virginia Tech in Blacksburg, Virginia, caused me to write:

. . . the terrible truth is that we only pay attention when our domestic murders come in multiples.

Gun violence is more than an everyday occurrence in this country, it is an hourly one. Correction: it is a quarter-hourly one. There are, roughly, 12,000 gun murders a year in the United States (if you are looking for contrasts, contrast that with the average 350 gun murders that occur annually in Canada, Great Britain, and Australia combined). If you watch the local TV news in the US, then you likely bear some sort of witness to numerous individual gun murders every week.

But it is only when six or twelve or twenty-two or thirty-three are shot that most of us even look up, take pause, or stop to think at all about what guns do.

And what guns do is kill people.

I’m sure there is somebody out there right now that is raising a finger in protest. Wait, there’s sport. . . competition shooting. . . hunting! And to that person I say: Knock it off! AK-47’s and their clones are not prized by biathletes, 9 mm semi-automatics are not hunting weapons, and you don’t need an extended clip to bring down a sixteen-point buck. You can make your arguments about self-defense and Second Amendment rights (though most of them would be wrong), but you cannot argue that it is either a right or a necessity to own the kinds of weapons that felled those at Columbine, or West Nickel Mines, or the unfortunate students and faculty at Virginia Tech.

And now we can add Sandy Hook Elementary in Newtown, Connecticut to that list–a list that had already grown much, much longer since 2007.

I wrote several posts around the time of the VaTech shootings–and several others about sadly similar events over the years–and went back to read them while thinking I would scrawl something today about the massacre in Connecticut. But you know what? I’m not sure I see the point of a new story–not when almost every single word I wrote back then is just as applicable now.

Sure, some of the names have changed. We have a different president; one who arguably struck the right emotional tone as he joined the country in mourning the senseless deaths of 20 young children. But a little while before Barack Obama spoke to the nation, his press secretary, Jay Carney, took to the White House briefing room to say that today was not the day to address the role that gun laws could play in preventing more mass shootings.

So, if you have the time, take a look at part of what was said some 17 domestic gun massacres ago:

Then, maybe ask, who do we have in elected government, or in a visible place in our country or our communities, who will rise up and say to Mr. Carney, or to the press corps, or to the president, “How about now? Can we talk about it now?”

I’ll leave you with the questions I asked back in 2007–and have asked so many times since–in an attempt to actually move this discussion beyond pearl-clutching and platitudes:

To those that love their guns. . .

Please don’t resort to screaming about how I want to take away your guns. . . I don’t. Just tell me why you oppose:

Gun registration,
Better background checks,
Additional licensing procedures for concealed weapons,
Mandatory waiting periods,
Restrictions on assault-style weapons, Saturday night specials, and extended clips,
Mandatory safety training and periodic recertification,
Closing so-called gun-show loopholes,
Legal liability for gun manufacturers commensurate with other consumer product liability,
And limits on the number of guns and rounds of ammo you can purchase at any given time and over the course of a year.

If you can address those points, we can have a discussion. . . or you can just scream that I want to take away your gun again if that makes you feel better.

And one more thought–something I tweeted earlier. Today, before the news of the Connecticut shooting broke, I heard a story about a man who went on a violent rampage at a school in China. He was armed with a knife. The result: 22 wounded; 0 dead.

Take Five for Dave Brubeck

Innovative Jazz pianist Dave Brubeck has died, one day shy of his 92nd birthday.

Though the melody of Take Five, arguably his most famous recording (featured above), is credited to his quartet’s saxophonist, Paul Desmond, it is Brubeck’s love of uncommon time signatures that lays the foundation for one of the most iconic musical works of the 20th Century.

But Brubeck wasn’t just a crusader for rhythm. During his service in World War II, Brubeck was spotted playing a Red Cross show and ordered to form a band. Brubeck chose a racially integrated lineup, a rarity for military acts. During the 1950s and ’60s, Brubeck is reported to have canceled appearances at venues that balked at the mixed racial makeup of his quartet.

Brubeck was also said to have been upset when he was featured on the cover of Time magazine in 1954 (only the second Jazz musician so honored), believing that the selection was influenced by race.

Though he disbanded the quartet in 1967, Brubeck continued to compose and perform into his 90s. He was the recipient of numerous accolades and awards, including a Kennedy Center honor and a Grammy for lifetime achievement.