Sunday, May 30, 2010


In the Week in Review section of today's New York Times we have this, from Elisabeth Rosenthal:

Our Fix-It Faith and the Oil Spill

...Americans have long had an unswerving belief that technology will save us -- it is the cavalry coming over the hill, just as we are about to lose the battle. And yet, as Americans watched scientists struggle to plug the undersea well over the past month, it became apparent that our great belief in technology was perhaps misplaced.

"Americans have a lot of faith that over the long run technology will solve everything, a sense that somehow we're going to find a way to fix it," said Andrew Kohut, president of the Pew Research Center for the People and the Press.

So, you see, we're to blame -- we have a naive faith in the ability of technocrats to solve every problem. (Rosenthal's evidence for this? Polls from Mr. Kohut's Pew Research Center, which showed that we had unreasonable expectations for scientific progress on cancer and space exploration -- in 1999.)

But it doesn't matter whether there's any hard evidence that we in the general public really believe that scientists can fix everything that ails us. The idea is going to stay out there because it's useful. The idea is, by the way, related to what David Brooks was saying on Friday -- that we have a naive expectation that systems are comprehensible, when we just need to grow up and realize that they aren't:

Over the past decades, we've come to depend on an ever-expanding array of intricate high-tech systems. These hardware and software systems are the guts of financial markets, energy exploration, space exploration, air travel, defense programs and modern production plants.

These systems, which allow us to live as well as we do, are too complex for any single person to understand. Yet every day, individuals are asked to monitor the health of these networks, weigh the risks of a system failure and take appropriate measures to reduce those risks.

If there is one thing we've learned, it is that humans are not great at measuring and responding to risk when placed in situations too complicated to understand.

Except that that's not really true -- certainly not in this case. As we learn from the lead story in today's print Times, plenty of people understood BP's drilling technology -- or at least understood it well enough to know that BP was flirting with disaster:

Internal documents from BP ... show that in March, after several weeks of problems on the rig, BP was struggling with a loss of "well control." And as far back as 11 months ago, it was concerned about the well casing and the blowout preventer.

On June 22, for example, BP engineers expressed concerns that the metal casing the company wanted to use might collapse under high pressure.

"This would certainly be a worst-case scenario," Mark E. Hafle, a senior drilling engineer at BP, warned in an internal report. "However, I have seen it happen so know it can occur."

The company went ahead with the casing, but only after getting special permission from BP colleagues because it violated the company's safety policies and design standards....

In April of this year, BP engineers concluded that the casing was "unlikely to be a successful cement job," according to a document, referring to how the casing would be sealed to prevent gases from escaping up the well.

The document also says that the plan for casing the well is "unable to fulfill M.M.S. regulations," referring to the Minerals Management Service....

So, David Brooks, these systems were not too complex to be comprehensible. Plenty of people understood the problems. But that wasn't enough to prevent the drilling from going forward.

The problem wasn't a generalized national mood of optimism about the ability of technology to overcome any hurdle -- it was businessmen and their subordinates (from a U.K.-based company, please note) overruling scientists who had concerns based on knowledge and experience (and flouting regulations that could have prevented the disaster but weren't enforced).

But it's very useful to argue that Americans have unreasonable faith in technocrats, or that it's human nature to underestimate risk, because that lets the real culprits off the hook. We're just silly Americans or silly humans; we can't help being this way; as a result, shit happens.

And that leads to the argument that we want the damn oil, so we just have to grow up and accept the reality that shit happens.

Which is, in a way, the flip side of an argument Tom Friedman sees as a roadblock to progress on transitioning away from a petroleum-based energy policy (yeah, I know, it's Tom Friedman, but he's right about this):

... the "petro-determinists" ... never tire of telling us that we'll be dependent on oil for a "long, long time." That is true. The problem is, these same people have been telling us that ever since the first oil crisis in 1973, and their real objective in doing so is not to help us understand that breaking our oil addiction is difficult, but to make us think that it is impossible -- so don't bother.

I'd say it serves the interests of the "petro-determinists" to argue that we just need to learn to tolerate more risk -- more emissions, more oil spills -- as the Age of Petroleum drags on and on indefinitely. We can't break our addiction to oil without destroying our economy! We can't get green technology to work, or work in an economically efficient way! And, yes, there are problems with drilling, but you're just naive if you think we can sidestep calamity! So shut up and eat your tarballs!
