Book Salon – Joseph Mangano, Author of Mad Science: The Nuclear Power Experiment

[Note: On Saturday afternoon, I hosted FDL Book Salon, featuring a live Q&A with Joseph Mangano, author of Mad Science: The Nuclear Power Experiment. This is a repost of that discussion.]

In December of 1962, Consolidated Edison, New York City’s main purveyor of electricity, announced that it had submitted an official proposal to the US Atomic Energy Commission (the AEC, the precursor to today’s Nuclear Regulatory Commission) for the construction of a nuclear power plant on a site called Ravenswood. . . in Queens. . . on the East River. . . directly across from the United Nations. . . within five miles of roughly five million people.

Ravenswood became the site of America’s first demonstrations against nuclear power. It inspired petitions to President John F. Kennedy and NYC Mayor Robert Wagner, and the possibility of a nuclear reactor in such a densely populated area even invited public skepticism from David Lilienthal, the pro-nuclear former head of the AEC. Finally, after a year of pressure led by the borough’s community leaders, Con Edison withdrew its application.

But within three years, reports suggested Con Ed had plans to build a nuclear plant under Central Park. After that idea was roundly criticized, the utility publicly proposed a reactor complex under Welfare Island (now known as Roosevelt Island), instead.

Despite the strong support of Laurance Rockefeller, the brother of New York State’s governor, the Welfare Island project disappeared from Con Ed’s plans by 1970. . . soon to be replaced by the idea of a nuclear “jetport”–artificial islands to be built in the ocean just south of New York City that would host a pair of commercial reactors.

Does that sound like madness? Well, from today’s perspective–with Three Mile Island, Chernobyl, and now Fukushima universally understood as synonyms for disaster–it probably does. But there was a time before those meltdowns when nuclear power still had a bit of a glow, when, despite (or because of) the devastation from the atomic bombs dropped on Japan, many believed that the atom’s awesome power could be harnessed for good; a time when dangerous and deadly mishaps at a number of the nation’s earlier reactors were easily excused or kept completely secret.

In Mad Science: The Nuclear Power Experiment, Joseph Mangano returns to that time, and then methodically pulls back the curtain on the real history of nuclear folly and failure, and the energy source that continues to masquerade as clean, safe, and “too cheap to meter.”

From Chalk River in Canada, site of the world’s first reactor meltdown, through Idaho’s EBR-1, Waltz Mill, PA, Santa Susana’s failed Sodium Reactor Experiment, the Idaho National Lab explosion that killed three, and Fermi-1, which almost irradiated Detroit, to, of course, Three Mile Island, Mad Science provides a chilling catalog of nuclear accidents, all of them disasters in their own right, and all of them illustrating a troubling pattern of safety breaches followed by secrecy and lies.

Nuclear power’s precarious existence is not, of course, just a story for the history books, and Mangano also details the state of America’s 104 remaining reactors. Many of today’s plants have problems of their own, but perhaps the maddest thing about the mad science of civilian atomic power is that science often plays only a minor role in decisions about the technology’s future.

From its earliest days, this supposedly super-cheap energy was financially unsustainable. By the mid-1950s, private insurers had turned their backs on nuclear facilities, fearing the massive payouts that would follow any accident. The nuclear industry turned to the US government, and in 1957, the Price-Anderson Act limited a plant’s liability to an artificially low but apparently insurable figure–any damage beyond that would be covered by US taxpayers. Shippingport, America’s first large-scale commercial nuclear reactor, was built entirely with government money, and that is hardly an isolated story. Even before the Three Mile Island meltdown, Wall Street had walked away from nuclear energy, meaning that no new reactors could be built without massive federal loan guarantees.

Indeed, the cost of construction, piled on top of the cost of fuel, skilled labor, operation, and upkeep, made the prospect of opening a new nuclear plant financially unpalatable. So, as Mangano explains, nuclear utilities turned to another strategy for making their vertical profitable, one familiar to any student of late Western capitalism. Rather than build, energy companies would buy. Since the 1990s, the nuclear sector has seen massive consolidation. Mergers and acquisitions have created nuclear mega-corporations, like Exelon, Duke, and Entergy, which run multiple reactors across many facilities in many states. And the industry’s supposed regulator, the NRC, has encouraged this behavior by rubberstamping dozens upon dozens of 20-year license extensions, turning reactors that were supposed to be nearing the end of their functional lives into valuable assets.

But the pain of nuclear power isn’t only measured in meltdowns and money. Whether firing on all cylinders (as it were) or falling apart, nuclear plants have proven to be dangerous to the populations they are supposed to serve. Joseph Mangano, an epidemiologist by trade, and director of the Radiation and Public Health Project (RPHP), has made a career out of trying to understand the immediate and long-term effects of nuclear madness, be it from fallout, leaks, or the “permissible levels” of radioactive isotopes that are regularly released from reactors as part of normal operation.

As I mentioned earlier this week, Mangano and the RPHP are the inheritors of the Baby Tooth Survey, the groundbreaking examination of strontium levels in children born before, during and after the age of atmospheric nuclear bomb tests. The discovery of high levels of Sr-90, a radioactive byproduct of uranium fission, in the baby teeth of children born in the 1950s and ’60s led directly to the Partial Test Ban Treaty in 1963.

Mangano’s work has built on the original survey, linking elevated Sr-90 levels to cancer and examining the increases in strontium in the bodies of children who lived close to nuclear power plants. And all of this is explained in great detail in Mad Science.

The author has also applied his expertise to the fallout from the ongoing Fukushima disaster. Last December, Mangano and Janette Sherman published a peer-reviewed article in the International Journal of Health Services (PDF) estimating that in the 14 weeks following the start of the Japanese nuclear crisis, some 14,000 excess deaths in the United States could be linked to radioactive fallout from Fukushima Daiichi. (RPHP has since revised that estimate–upward–to almost 22,000 deaths (PDF).)

That last study is not specifically detailed in Mad Science, but I hope we can touch on it today–along with some of the many equally maddening “experiments” in nuclear energy production that Mangano carefully unwraps in his book.

[Click here to read my two-hour chat with Joe Mangano.]

Occupy Innovation

Actress Anne Hathaway marches with demonstrators on the two-month anniversary of Occupy Wall Street. (Photo: Elana Levin)

Two days after thousands of police broke up the around-the-clock occupation of New York’s Zuccotti Park, tens of thousands of demonstrators converged downtown to celebrate the two-month anniversary of Occupy Wall Street and stress that with or without Zuccotti, the protest and its message remained strong and relevant.

One of those in the march, the actress Anne Hathaway, carried a sign that read “Blackboards not Bullets,” and though much attention was predictably paid to the 29-year-old star’s presence, the message she carried that day shouldn’t be ignored.

A month earlier, shortly after a company called Boston Dynamics unveiled AlphaDog, a prototype of its “Legged Squad Support System”–a walking robot financed by DARPA, the Defense Advanced Research Projects Agency–Rachel Maddow featured the technological marvel in a segment contrasting advances in military hardware with what is currently on offer for consumers.

Her featured guest in that segment was US Rep. Rush Holt (D-NJ). Holt had, a month earlier still, gone on record in the midst of Washington’s deficit hysteria arguing that the government should actually spend more on scientific research, writing that the framing of the budget debate set up a false choice between basic science and elementary education.

Nothing demonstrates the effectiveness of the young Occupy movement more than the rapid shift in frames. The “cut, cut, cut” of the manufactured deficit crisis that Holt had to fight against has been largely drowned out by the chant of “banks got bailed out; we got sold out” and the reevaluation of spending priorities that came with nationwide demands for an accountable government acting in the service of the 99 percent.

But, to state the obvious, four months of Occupy has not been enough to really transform the way the federal government prioritizes spending, nor has the movement yet transformed the way the country evaluates real progress.

For instance, with December’s formal end of US military operations in Iraq, and a promised drawdown coming in Afghanistan as well, has anyone in official Washington (or in the commentariat, for that matter) started talking about what America will do with its “peace dividend”?

In fact, the beneficiaries of profligate wartime spending are marshaling their surrogates to warn that cuts in Pentagon spending will actually subtract valuable research dollars from the economy. Citing large contractors like SAIC, Computer Sciences Corporation, and CACI International, a recent New York Times story tried to make the case that massive military spending in the last decade has been an important catalyst for the economy and for innovation in the broader marketplace. If these companies–all three of which have been involved in major scandals over the last several years–stand for anything, however, it is that the unchecked expansion of the defense budget is a catalyst for shameless corruption. (An observation glaringly absent from the Times piece.)

Indeed, for all the sincere excitement that might spring up around robot dogs, or a stream of other “war dividends” that might find some purpose in the consumer marketplace–from high-resolution cameras to improved prosthetic limbs–the wow factor not only obscures the full cost of the innovation, but distorts the measure of innovation itself.

A marketplace of ideas?

Many conservatives (and neo-liberals) love to argue that the marketplace is the best judge of winners and losers. Competition is the key to innovation, they argue. But in the consumer market, innovation isn’t always about providing a better product. Just as often, “innovation” means exploiting a leverageable point of difference or streamlining the manufacturing process in pursuit of better profit margins.

Was Coke Zero an innovation over Diet Coke? Was Nexium an innovation over Prilosec? Certainly, marketing said “yes,” but the research, at least in the case of the pharmaceuticals, stated otherwise:

It’s expensive to produce an innovative drug. On average, the bill runs to more than $400 million. So drug companies often take a less costly route to create a new product. They chemically rejigger an oldie but goodie, craft a new name, mount a massive advertising campaign and sell the retread as the latest innovative breakthrough.

This strategy has shown great success for turning profits. Nexium, a “me-too” drug for stomach acid, has earned $3.9 billion for its maker, AstraZeneca, since it went on the market in 2001. The U.S. Food and Drug Administration classified three-fourths of the 119 drugs it approved last year as similar to existing ones in chemical makeup or therapeutic value.
….

Nexium illustrates the drug makers’ strategy. Many chemicals come in two versions, each a mirror image of the other: an L-isomer and an R-isomer. (The “L” is for left, the “R” is for right.) Nexium’s predecessor Prilosec is a mixture of both isomers. When Prilosec’s patent expired in 2001, the drug maker was ready with Nexium, which contains only the L-isomer.

Is Nexium better? So far, there’s no convincing evidence that it is. . . .

The study goes on to point out that the money spent on developing and marketing a “me too” drug is money not spent researching truly new treatments.

The energy sector practically turns the notion of marketplace innovation on its head. From underpriced leases for drilling on federal land to underwritten loans for construction of new nuclear reactors, “innovation” has been about finding ways to preserve the status quo.

Take, for example, a so-called “next generation” reactor design, known as the AP1000, certified by the Nuclear Regulatory Commission just before Christmas. Though touted as a “radical” new design, the new model in reality represents little more than a riff on 50-year-old pressurized water reactors. Manufacturers contend the AP1000 offers improved options for backup reactor cooling in an emergency, but the real “innovations” for plant owners are the number of systems and components that do not differ from what has already been licensed and manufactured – which shortens the approval process – and the promise that the new model requires fewer components, far less concrete and rebar, and smaller staffs to operate, thus saving on construction and labor costs.

Nothing about the new reactor design has actually impressed “the marketplace” one iota. No bank will indulge the risk – without billions in federal loan guarantees, none of the reactors now approved and fast-tracked for construction and operating licenses would ever come close to getting built. In addition, the federal government is still paying to clean up after mid-20th Century uranium mining, decades of nuclear fuel processing, and numerous radiation leaks of varying size, and will remain saddled with the burden of providing a long-term storage solution for nuclear waste. The Price-Anderson Act also shields the nuclear industry from the full liability of any accident. None of these costly government investments supports innovation – indeed, it could be argued they prevent it.

The military innovation complex

But as costly as it is to backstop the nuclear industry, it pales in comparison to the military model for innovation. Far from being tested in the “marketplace,” military innovations seem to go through a “spaghetti test” – throw multiple options at the wall, and see if any stick. This is a phenomenally expensive model, and can also cost dearly in terms of lives, but it is a system born of the “cost is no object” approach encouraged by the shock of war, and sustained by the Military Industrial Complex and its enablers.

As a result, as Rep. Holt observed, “The military spends a lot more on development than research. More on the ‘D’ than the ‘R.'” Tax dollars spent in this sector contribute little to primary research, the kind that has the potential to radically shift paradigms. Instead, military innovation focuses on incremental improvements, usable in theater as soon as possible.

And the incremental approach is notably desirable for military contractors. There is far more profit in producing an assortment of tweaked prototypes and modest upgrades than there is in funding a few scientists or engineers to pursue basic research. A war–and the war frame for thinking about innovation–might get you many incremental improvements, but it rarely produces radical inventions.

In addition, Holt points out that military R&D is often less fruitful for the greater economy because much of the research is kept secret.

Military spending is also much more capital intensive than investments in other vital sectors – and a new study from the Political Economy Research Institute (PERI) quantifies just how much more.

The study, titled “The US Employment Effects of Military and Domestic Spending Priorities” (PDF), calculates the number of jobs created by a set amount of military spending, and then contrasts that with what the exact same amount would do when invested in four non-military alternatives: tax cuts for personal consumption, clean energy and efficiency, health care, and public education. The differences are staggering.

PERI found that $1 billion in military spending produced 11,200 jobs, which sounds impressive until it is placed next to the alternatives. Tax cuts, no reputable economist’s favorite way to create jobs, still outpaced military spending – $1 billion in tax cuts would create an estimated 15,100 jobs. But even that looks weak when compared with more direct government investment in crucial alternative sectors. A billion dollars spent on clean energy and improved efficiency would result in 16,800 new jobs. The same amount in the health care sector would mean 17,200 jobs. And $1 billion of government investment in education would create 26,700 jobs – well over twice the jobs created by the same amount spent on the military.
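For readers who want the comparison at a glance, here is a minimal sketch, purely illustrative and written in Python, that tabulates the PERI figures quoted above and expresses each alternative as a multiple of the military baseline. The numbers are the study’s as cited; the script and its labels are mine.

```python
# Illustrative only: PERI's jobs-created-per-$1-billion figures,
# exactly as quoted above, with each alternative expressed as a
# multiple of the military baseline.
jobs_per_billion = {
    "military": 11_200,
    "tax cuts for consumption": 15_100,
    "clean energy / efficiency": 16_800,
    "health care": 17_200,
    "public education": 26_700,
}

baseline = jobs_per_billion["military"]
for sector, jobs in sorted(jobs_per_billion.items(), key=lambda kv: kv[1]):
    print(f"{sector:<26} {jobs:>6,} jobs per $1B  ({jobs / baseline:.2f}x military)")
```

Run it and education comes out at 2.38 times the military figure – the “well over twice” cited above.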

It is often argued that military jobs are better compensated than others, mostly because of their superior health benefits. However, the money spent in the private sector is so much more efficient at job creation that more jobs are created across the pay spectrum, even in the upper tiers. (Indeed, for jobs in the upper brackets, clean energy investment leaps ahead of health care – though, in every category, education is the leader in dollars-for-jobs effectiveness.)

Such numbers have major and obvious implications at a time when, in Washington, anyway, leaders talk of the need to cut government spending, but paradoxically also warn of the consequences – especially with regard to employment – of cutting the military.

Innovation and its discontents

So, while it is probably human nature – and an admirable part of it – to get excited about an agile and intuitive robot, the rapid improvements in mechanical dog technology raise important questions: What if the US had the same level of commitment to innovation outside the military? What if the government took just a fraction of the money it poured into Iraq and Afghanistan – wars arguably fought to help preserve America’s access to cheap oil – and instead invested it in renewable energy innovations and improvements in energy efficiency? What if some of the $2,200 spent by every resident in the United States on the military in 2010 (to use the most recently available figures) had been repurposed for education? How much lower would the jobless rate be? How many innovations and improvements equal to or better than AlphaDog would now be receiving the oohs and aahs of an amazed public?
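To put rough numbers behind those rhetorical questions, here is a back-of-envelope sketch, again in Python and strictly hypothetical: the $2,200-per-resident figure and the PERI jobs-per-billion multipliers come from the sources cited above, while the 310 million population estimate and the 10 percent repurposed share are assumptions of mine, chosen only for illustration.

```python
# Hypothetical back-of-envelope, not a forecast. The per-resident
# military figure ($2,200, 2010) and the PERI jobs-per-$1B numbers
# are quoted above; the population and the repurposed share are
# assumptions chosen purely for illustration.
per_resident_military = 2_200        # dollars (2010, as cited)
us_population = 310_000_000          # rough 2010 estimate (assumption)
repurposed_share = 0.10              # assumed fraction shifted to education

total_military = per_resident_military * us_population        # ~$682 billion
shifted_billions = total_military * repurposed_share / 1e9    # ~$68 billion

jobs_military, jobs_education = 11_200, 26_700                # PERI, per $1B
net_new_jobs = shifted_billions * (jobs_education - jobs_military)

print(f"Shifting ${shifted_billions:.0f}B to education: "
      f"~{net_new_jobs:,.0f} net additional jobs")
```

Even under those cautious assumptions, shifting a tenth of the budget nets roughly a million additional jobs – the scale the questions above are gesturing at.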

Or is it, as Ezra Klein observed in a response to the Times story, that military R&D is “economically inefficient but politically efficient”–that there might be better ways to spend a research dollar, but the one that Beltway thinking can sustain goes through the Pentagon? Not only is that construct what Klein calls “depressing,” it makes several assumptions about the value of extravagant wartime budgets while ignoring the known and numerous downsides. (The Times piece itself includes two professors who posit without any apparent irony that the biggest economic benefit of inflated defense spending is “what it prevents”–presumably, war and other threats to domestic security.) But, worst of all, Klein’s take on realpolitik is shortsighted.

If the US fought for the post-carbon economy the way it fights for nebulous state-building goals in foreign wars, the future would be brighter, cleaner, safer and cheaper, with more jobs and perhaps – because it would need to secure less of that foreign oil – fewer wars. If the country built new classrooms with the same urgency it built armored vehicles, more American teens could be choosing between colleges instead of choosing between minimum and sub-minimum wage jobs – and fewer would eventually need public assistance. If the government spent more on blackboards and less on bullets, it would create more jobs today and more innovation in the future.

Neither the military nor the marketplace has proven to be the great incubator of innovation that proponents of defense spending and free markets wish to believe. Instead, both facilitate what author Malcolm Gladwell calls “tinkering” – refinements that might improve upon a big invention, rather than the big invention itself. This is not to slight tinkering: Edison’s light bulb and Apple’s iPod represent the kind of tinkering that can markedly affect everyday life. But those were improvements on earlier discoveries, not the world-changing discoveries themselves (and, as has been demonstrated, not every tinkering innovation is for the benefit of the end-user).

No, if the United States truly wants to “think different” (as Apple’s advertising once implored), it needs to once again embrace the innovations upon which the country was founded: Real representative democracy, transparency, accountability, checks and balances of three co-equal branches of government, no taxation without representation, trial by jury, a wariness of foreign military entanglements, and, as was added soon after Independence, access to free public education as a right. They are ideals remarkably similar to those embraced by the Occupy movement – if not assumed by most Americans to be part of their national identity.

But they are precepts that have been tarnished by the masters of the marketplace and the adherents of the military industrial complex. Perhaps they don’t need re-invention, but it appears some serious re-dedication is in order. If the innovation in public discourse known as Occupy Wall Street can continue to re-frame the crisis, rethink the role of government and reinvigorate the democracy, then maybe America can re-occupy itself.

* * *

(A version of this story previously appeared on Truthout under the headline Occupy Innovation: Neither the Military Nor the Market Does)