Occupy Innovation

Actress Anne Hathaway marches with demonstrators on the two-month anniversary of Occupy Wall Street. (Photo: Elana Levin)

Two days after thousands of police broke up the around-the-clock occupation of New York’s Zuccotti Park, tens of thousands of demonstrators converged downtown to celebrate the two-month anniversary of Occupy Wall Street and stress that with or without Zuccotti, the protest and its message remained strong and relevant.

One of those in the march, the actress Anne Hathaway, carried a sign that read “Blackboards not Bullets,” and though much attention was predictably paid to the 29-year-old star’s presence, the message she carried that day shouldn’t be ignored.

A month earlier, shortly after a company called Boston Dynamics unveiled a prototype of its “Legged Squad Support System” AlphaDog, a walking robot financed by DARPA, the Defense Advanced Research Projects Agency, Rachel Maddow featured the technological marvel in a segment contrasting current advances in military hardware with what is currently on offer for consumers.

Her featured guest in that segment was US Rep. Rush Holt (D-NJ). Holt had, a month earlier still, gone on record in the midst of Washington’s deficit hysteria arguing that the government should actually spend more on scientific research, writing that the framing of the budget debate set up a false choice between basic science and elementary education.

Nothing demonstrates the effectiveness of the young Occupy movement more than the rapid shift in frames. The “cut, cut, cut” of the manufactured deficit crisis that Holt had to fight against has been largely drowned out by the chant of “banks got bailed out; we got sold out” and the reevaluation of spending priorities that came with nationwide demands for an accountable government acting in the service of the 99 percent.

But, to state the obvious, four months of Occupy has not been enough to really transform the way the federal government prioritizes spending, nor has the movement yet transformed the way the country evaluates real progress.

For instance, with December’s formal end of US military operations in Iraq, and a promised drawdown coming in Afghanistan, as well, has anyone in official Washington (or in the commentariat, for that matter) started talking about what America will do with its “peace dividend”?

In fact, the beneficiaries of profligate wartime spending are marshaling their surrogates to warn that cuts in Pentagon spending will actually subtract valuable research dollars from the economy. Citing large contractors like SAIC, Computer Sciences Corporation, and CACI International, a recent New York Times story tried to make the case that massive military spending in the last decade has been an important catalyst for the economy and for innovation in the broader marketplace. If these companies–all three of which have been involved in major scandals over the last several years–stand for anything, however, it is that the unchecked expansion of the defense budget is a catalyst for shameless corruption. (An observation glaringly absent from the Times piece.)

Indeed, for all the sincere excitement that might spring up around robot dogs, or a stream of other “war dividends” that might find some purpose in the consumer marketplace–from high-resolution cameras to improved prosthetic limbs–the wow factor not only obscures the full cost of the innovation, but distorts the measure of innovation itself.

A marketplace of ideas?

Many conservatives (and neo-liberals) love to argue that the marketplace is the best judge of winners and losers. Competition is the key to innovation, they argue. But in the consumer market, innovation isn’t always about providing a better product. Just as often, “innovation” means exploiting a leverageable point of difference or streamlining the manufacturing process in pursuit of better profit margins.

Was Coke Zero an innovation over Diet Coke? Was Nexium an innovation over Prilosec? Certainly, marketing said “yes,” but the research, at least in the case of the pharmaceuticals, stated otherwise:

It’s expensive to produce an innovative drug. On average, the bill runs to more than $400 million. So drug companies often take a less costly route to create a new product. They chemically rejigger an oldie but goodie, craft a new name, mount a massive advertising campaign and sell the retread as the latest innovative breakthrough.

This strategy has shown great success for turning profits. Nexium, a “me-too” drug for stomach acid, has earned $3.9 billion for its maker, AstraZeneca, since it went on the market in 2001. The U.S. Food and Drug Administration classified three-fourths of the 119 drugs it approved last year as similar to existing ones in chemical makeup or therapeutic value.
….

Nexium illustrates the drug makers’ strategy. Many chemicals come in two versions, each a mirror image of the other: an L-isomer and an R-isomer. (The “L” is for left, the “R” is for right.) Nexium’s predecessor Prilosec is a mixture of both isomers. When Prilosec’s patent expired in 2001, the drug maker was ready with Nexium, which contains only the L-isomer.

Is Nexium better? So far, there’s no convincing evidence that it is. . . .

The study goes on to point out that the money spent on developing and marketing a “me too” drug is money not spent researching truly new treatments.

The energy sector practically turns the notion of marketplace innovation on its head. From underpriced leases for drilling on federal land to underwritten loans for construction of new nuclear reactors, “innovation” has been about finding ways to preserve the status quo.

Take, for example, a so-called “next generation” reactor design, known as the AP1000, certified by the Nuclear Regulatory Commission just before Christmas. Though touted as a “radical” new design, in reality, the new model represents little more than a riff on 50-year-old pressurized water reactors. While manufacturers contend the AP1000 offers improved options for backup reactor cooling in an emergency, the real “innovations” for plant owners are the number of systems and components that do not differ from what has already been licensed and manufactured – shortening the approval process – and the promise that the new model requires fewer components, far less concrete and rebar, and smaller staffs to operate, thus saving on construction and labor costs.

Nothing about the new reactor design has actually impressed “the marketplace” one iota. No bank will indulge the risk – without billions in federal loan guarantees, none of the reactors now approved and fast-tracked for construction and operating licenses would ever come close to getting built. In addition, the federal government is still paying to clean up after mid-20th Century uranium mining, decades of nuclear fuel processing, and numerous radiation leaks of varying size, and will remain saddled with the burden of providing a long-term storage solution for nuclear waste. The Price-Anderson Act also shields the nuclear industry from the full liability of any accident. None of these costly government investments supports innovation – indeed, it could be argued they prevent it.

The military innovation complex

But as costly as it is to backstop the nuclear industry, it pales in comparison to the military model for innovation. Far from being tested in the “marketplace,” military innovations seem to go through a “spaghetti test” – throw multiple options at the wall, and see if any stick. This is a phenomenally expensive model, and can also cost dearly in terms of lives, but it is a system born of the “cost is no object” approach encouraged by the shock of war, and sustained by the Military Industrial Complex and its enablers.

As a result, as Rep. Holt observed, “The military spends a lot more on development than research. More on the ‘D’ than the ‘R.’” Tax dollars spent in this sector contribute little to primary research, the kind that has the potential to radically shift paradigms. Instead, military innovation focuses on incremental improvements, usable in theater as soon as possible.

And the incremental approach is notably desirable for military contractors. There is far more profit in producing an assortment of tweaked prototypes and modest upgrades than in paying a few scientists or engineers to pursue basic research. A war–and the war frame for thinking about innovation–might get you many relative improvements, but it rarely produces radical inventions.

In addition, Holt points out that military R&D is often less fruitful for the greater economy because much of the research is kept secret.

Military spending is also much more capital intensive than investments in other vital sectors – and a new study from the Political Economy Research Institute (PERI) quantifies just how much more.

The study, titled “The US Employment Effects of Military and Domestic Spending Priorities” (PDF), calculates the number of jobs created by a set amount of military spending, and then contrasts that with what the exact same amount would do when invested in four non-military alternatives: tax cuts for personal consumption, clean energy and efficiency, health care, and public education. The differences are staggering.

PERI found that $1 billion in military spending produced 11,200 jobs, which sounds impressive until it is placed next to the alternatives. Tax cuts, no reputable economist’s favorite way to create jobs, still outpaced military spending – $1 billion in tax cuts would create an estimated 15,100 jobs. But even that looks weak when compared with more direct government investment in crucial alternative sectors. A billion dollars spent on clean energy and improved efficiency would result in 16,800 new jobs. The same amount in the health care sector would mean 17,200 jobs. And $1 billion of government investment in education would create 26,700 jobs – well over twice the jobs created by the same amount spent on the military.

It is often argued that military jobs are better compensated than others – mostly because of the superior health benefits – however, the money spent in the private sector is so much more efficient at job creation that more jobs are created across the spectrum, even in the upper tiers. (Indeed, for jobs in the upper brackets, clean energy investment leaps ahead of health care – though, in every category, education is the leader in dollars-for-jobs effectiveness.)
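The PERI comparison above boils down to simple arithmetic. As a minimal sketch, the jobs-per-billion figures below are the ones quoted in this piece; the multipliers relative to military spending are derived here for illustration (the rounding and labels are mine, not PERI's):

```python
# Jobs created per $1 billion of spending, per the PERI study
# "The US Employment Effects of Military and Domestic Spending Priorities"
# as quoted in the text above.
JOBS_PER_BILLION = {
    "military": 11_200,
    "tax cuts": 15_100,
    "clean energy": 16_800,
    "health care": 17_200,
    "education": 26_700,
}

# Express each alternative as a multiple of the military baseline.
for sector, jobs in sorted(JOBS_PER_BILLION.items(), key=lambda kv: kv[1]):
    ratio = jobs / JOBS_PER_BILLION["military"]
    print(f"{sector:<12} {jobs:>6,} jobs per $1B  ({ratio:.2f}x military)")
```

Run it and the education multiplier comes out to roughly 2.4 times the military figure, consistent with the "well over twice" characterization above.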

Such numbers have major and obvious implications at a time when, in Washington, anyway, leaders talk of the need to cut government spending, but paradoxically also warn of the consequences – especially with regard to employment – of cutting the military.

Innovation and its discontents

So, while it is probably human nature – and an admirable part of it – to get excited about an agile and intuitive robot, the rapid improvements in mechanical dog technology raise important questions: What if the US had the same level of commitment to innovation outside the military? What if the government took just a fraction of the money it poured into Iraq and Afghanistan – wars arguably fought to help preserve America’s access to cheap oil – and instead invested it in renewable energy innovations and improvements in energy efficiency? What if some of the $2,200 spent by every resident in the United States on the military in 2010 (to use the most recently available figures) had been repurposed for education? How much lower would the jobless rate be? How many innovations and improvements equal to or better than AlphaDog would now be receiving the oohs and aahs of an amazed public?

Or is it, as Ezra Klein observed in a response to the Times story, that military R&D is “economically inefficient but politically efficient”–that there might be better ways to spend a research dollar, but the one that Beltway thinking can sustain goes through the Pentagon? Not only is that construct what Klein calls “depressing,” it makes several assumptions about the value of extravagant wartime budgets while ignoring the known and numerous downsides. (The Times piece itself includes two professors who posit without any apparent irony that the biggest economic benefit of inflated defense spending is “what it prevents”–presumably, war and other threats to domestic security.) But, worst of all, Klein’s take on realpolitik is simply myopic.

If the US fought for the post-carbon economy the way it fights for nebulous state-building goals in foreign wars, the future would be brighter, cleaner, safer and cheaper, with more jobs and perhaps – because it would need to secure less of that foreign oil – fewer wars. If the country built new classrooms with the same urgency it built armored vehicles, more American teens could be choosing between colleges instead of choosing between minimum and sub-minimum wage jobs – and fewer would eventually need public assistance. If the government spent more on blackboards and less on bullets, it would create more jobs today and more innovation in the future.

Neither the military nor the marketplace has proven to be the great incubator of innovation that proponents of defense spending and free markets wish to believe. Instead, both facilitate what author Malcolm Gladwell calls “tinkering” – refinements that might improve upon a big invention, rather than the big invention itself. This is not to slight tinkering: Edison’s light bulb and Apple’s iPod represent the kind of tinkering that can markedly affect everyday life. But those were improvements on earlier discoveries, not the world-changing discoveries, themselves (and, as has been demonstrated, not every tinkering innovation is for the benefit of the end-user).

No, if the United States truly wants to “think different” (as Apple’s advertising once implored), it needs to once again embrace the innovations upon which the country was founded: Real representative democracy, transparency, accountability, checks and balances of three co-equal branches of government, no taxation without representation, trial by jury, a wariness of foreign military entanglements, and, as was added soon after Independence, access to free public education as a right. They are ideals remarkably similar to those embraced by the Occupy movement – if not assumed by most Americans to be part of their national identity.

But they are precepts that have been tarnished by the masters of the marketplace and the adherents of the military industrial complex. Perhaps they don’t need re-invention, but it appears some serious re-dedication is in order. If the innovation in public discourse known as Occupy Wall Street can continue to re-frame the crisis, rethink the role of government and reinvigorate the democracy, then maybe America can re-occupy itself.

* * *

(A version of this story previously appeared on Truthout under the headline Occupy Innovation: Neither the Military Nor the Market Does)


The Party Line – April 29, 2011: And the Band Played On

After pausing for a day to placate another bleating billionaire, President Obama stepped to the first microphone Thursday to announce that Leon Panetta would soon sit where Bob Gates now sits, and that David Petraeus would sit in Panetta’s old chair, and that John Allen would grab King David’s throne, and so on and so forth until someone pulled the needle off the record. At which point we were told that the president had re-tooled his national security team for the challenges that lie ahead.

But if that sounds less like re-tooling and more like rearranging the deck chairs on the Titanic, well, that’s because it should.

At a place in history where the administration’s much ballyhooed Afghanistan strategy has proven another stutter-step in a long, bloody line of failed tactics, at a time when the entire US intelligence establishment seems to have been caught flat-footed by the uprisings of this Arab Spring, bringing us to a moment where being militarily overextended and signally under-informed has quickly left the US knee-deep in a Libyan quagmire, one might think that Obama would use the force of history as the perfect excuse to really change course. One might think that, but Obama did not do that.

Instead, the architect of our misfortune in Afghanistan is given control of the Central Intelligence Agency, and the guy who forsook the CIA’s intelligence gathering responsibilities to further strengthen covert ops will now run the whole shebang (emphasis on “bang”) at the Pentagon.

While “failing upward” seems to be the 21st Century way America tries to win the future, perhaps the even more disturbing theme is the further blurring of the distinction between the US military and national intelligence. Marcy, David, and Jim have all touched on aspects of this, but, in short, what were once the independent and sometimes competitive interests of the intelligence community, the diplomatic corps and the military have, in the interest of post-9/11 “coordination” or post-imperial expedience, been mixed into what now looks like the world’s largest paramilitary.

Which is actually a pretty dangerous place to steer the ship of state. While America’s giant military industrial complex, its ability to reach across the globe and “hit ‘em there” (and often do so with only the push of a button) may give us the sense that we are insulated from the conflicts abroad, we are, in fact, staying a course rife with icebergs.

To use a more recent (if you consider 30 years ago “recent”) analogy, the US is not unlike the space ship in a game of Asteroids. It has enough torpedoes to whip around and fire at will at the interplanetary rocks heading its way, but each hit breaks an asteroid into dozens of smaller ones, and eventually there are just too many to dodge.

OK, where was I? Oh, yes. Darting back in time again, I often talk about a theory I call “The Sick Man of the Americas.” It is a play on “The Sick Man of Europe,” a term used to describe a declining and dangerous Ottoman Empire at the turn of the last century. At that point, the Ottomans had been on the downward slope of history for a long time, but what they lacked in political influence, they tried to make up for with military might.

The American Empire stands at a similar precipice. Feeling its diplomatic might on the wane, its industrial prowess now being outstripped by several regional powers, its economy stagnant, its technological edge blunted by a decade of anti-science leadership, and even its cultural significance questioned, the US still has one thing it knows it can do better than any other place in the world, and that is blow things up.

The problem is, lots of other countries find that tiresome. It might suffice for now, given expectations, trade deals, and pre-existing commitments, but eventually all this bounderism gets in the way of things like commerce, and when you screw with other people’s money, they get touchy.

There may not be some great army ready to advance on our shores, not yet, but there will come a point where doing things the American way becomes more trouble than it’s worth. And in an interconnected world, that will make it very hard to even play in the future, forget about, uh, winning.

The sad part is it doesn’t have to be this way. Though the establishment that just played musical chairs is entrenched, it is not immortal. There are actually people well on their way to being part of the establishment who also worry about an overly militarized American century. Note, for example, Mr. Y.

Mr. Y, in reality two senior members of the Joint Chiefs of Staff (the pseudonym is a play on a 64-year-old essay by George Kennan), released a paper titled “A National Strategic Narrative” (PDF), and in it they spell out a part-primer, part-warning on the choices America is now making.

The paper is long, and I am still digesting it, but the takeaway relevant to this week’s events is the insistence that America needs to transition, as they say, from a policy of “containment” to one of “sustainment,” and that the US needs to see that its security lies in its prosperity, as opposed to the other way around. The idea (and I am seriously shorthanding here) is that rather than using military might to keep perceived threats at arm’s length (pun intended), a focus on strong domestic institutions will serve American security much better.

It is not a surprising position from a generation of military leaders who have been put through the meat grinders of Iraq and Afghanistan. And it is a position that might seem consistent with what was promised by candidate Barack Obama back in 2008.

Yes, it is true that Obama did signal an escalation in Afghanistan during the campaign, but otherwise, the junior Senator from Illinois spoke of reclaiming America’s role in the world by investing in domestic industry and innovation, and leading by example rather than by ordnance.

Contrary to the Obama of April 2011, that future still seems winnable. The Mr. Ys of this world, bred of a professional military, tired of playing Pinky to the intel black-baggers’ Brain, provide a ready and powerful force on the inside. The Democratic base—the young new voters and the liberals of all ages that surged to the polls to give Obama his first term as president—would provide all the support Obama would need on the outside. But those dual constituencies, seemingly so perfectly primed to help the ’08 vintage Obama bring forth the change he once promised, find themselves alternately ignored or punched by the present president.

It is the macro-theme that played out in microcosm on Thursday. Obama, the captain on the bridge, promoted an intelligence director who turned a deaf ear to a global chorus of discontent, while a leader of military escalations–almost by definition a guy who shoots first and asks questions later–was given the responsibility of doing the required listening that lies ahead.

The band will play on, but will anyone on the promenade deck be able to recognize the tune?