John Hussman, via Zero Hedge:
While there are about 3,800 oil platforms in the Gulf of Mexico, only about 130 deep water projects have been completed, compared with just 17 a decade ago. So in 10 years, applying a new technology, we’ve had one major oil spill thus far. Unless there is some a priori reason to assume that the technology is pristine, despite the fact that it has failed spectacularly, the first back-of-the-envelope estimate a statistician would make would be to model deep water oil spills as a “Poisson process.” Poisson processes are often used to model things that arrive randomly, like customers in a checkout line, or insurance claims across unrelated policy holders. Given one major oil spill in 10 years, you probably wouldn’t be way off the mark using an average “arrival frequency” of 0.10 annually.
From that perspective, a simple Poisson estimate would suggest a 90.5% probability that we will see no additional major oil spills from deep water rigs over the coming year, dropping to a 36.8% chance that we’ll see no additional major oil spills from deep water rigs over the coming decade. Moreover, you’d put a 36.8% chance on having exactly one more major spill in the coming decade, an 18.4% chance on having two major spills, a 6.1% chance of having three major spills, and a 1.9% chance of having four or more major spills in the coming decade. This is quite a bit of inference from a small amount of data, but catastrophes contain a great deal of information when the “prior” is that catastrophes are simply not possible.
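Hussman's figures are easy to reproduce. A minimal sketch using only the standard library (the 0.10-per-year arrival rate is his back-of-the-envelope estimate, not a measured quantity):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of exactly k events when the expected count is lam."""
    return exp(-lam) * lam ** k / factorial(k)

rate = 0.10                       # one major spill per decade = 0.10 per year
print(poisson_pmf(0, rate))       # no spill in the coming year: ~0.905

lam_decade = rate * 10            # expected spills over a decade: 1.0
for k in range(4):
    print(k, poisson_pmf(k, lam_decade))  # ~0.368, 0.368, 0.184, 0.061
print(1 - sum(poisson_pmf(k, lam_decade) for k in range(4)))  # 4+ spills: ~0.019
```

The probabilities for zero and one spill over a decade coincide because the Poisson mean is exactly 1, so the k = 0 and k = 1 terms of e^(-1)·1^k/k! are equal.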
Even from the context, it’s not clear what effect significant growth in the number of deep-water projects would have on those odds, but common sense suggests that risk increases with the number of projects.
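One way to make that common-sense point concrete: if each project independently contributes the same small spill hazard, the aggregate Poisson rate scales linearly with the project count, so the chance of a spill-free decade shrinks exponentially. A hedged illustration (the multipliers are arbitrary, chosen only to show the shape of the effect):

```python
from math import exp

lam_decade = 1.0                 # expected spills per decade at today's project count
for multiple in (1, 2, 4):       # project count grows 1x, 2x, 4x
    p_clean = exp(-lam_decade * multiple)  # P(no spills over the decade)
    print(multiple, round(p_clean, 3))     # 0.368, 0.135, 0.018
```

Doubling the number of projects squares the probability of a clean decade, which is why "risk increases with the number of projects" understates how quickly the odds of escaping unscathed deteriorate.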
Last week, Steve Pearlstein at The Washington Post talked about risk and government’s role in its management:
The big flaw in the business critique of regulation is not so much that it overstates the costs, but that it understates its benefits — in particular, the benefits of avoiding low-probability events with disastrous consequences. Think of oil spills, mine explosions, financial meltdowns or even global warming. There is a natural tendency of human beings to underestimate the odds of such seemingly unlikely events — of forgetting that the 100-year flood is as likely to happen in Year 5 as it is in Year 95. And if there are insufficient data to calculate the probability of a very bad outcome, as is often the case, that doesn’t mean we should assume the probability is zero.
Another challenge in thinking about regulation is that any meaningful analysis has to go beyond merely toting up the costs and benefits to a consideration of how those costs and benefits are distributed. Regulations limiting derivatives trading, for example, may add costs or reduce profit for a bank or its corporate customers every year, but the benefits of that regulation would mostly accrue to taxpayers and the economy as a whole if it saves them from the occasional financial crisis that requires a bailout or triggers a recession. From the banks’ standpoint, such a regulation may well seem like a bad idea, but for society as a whole it would be a winner.
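Pearlstein's 100-year-flood point is just the memorylessness of independent annual draws: a 1%-per-year event is exactly as likely in year 5 as in year 95, and over a century it is more likely than not to happen at least once. A quick sketch:

```python
p = 0.01  # "100-year flood": 1% chance in any single year, independent of history

# Unconditional probability the FIRST flood arrives in year k (geometric):
first_in_year = lambda k: (1 - p) ** (k - 1) * p

# Conditional on no flood yet, the chance in any given year is always just p --
# year 5 and year 95 look identical to someone standing at the start of the year.
print(1 - (1 - p) ** 100)  # at least one flood within 100 years: ~0.634
```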
UPDATE: David Leonhardt at the NYT also has a good piece on this issue:
For all the criticism BP executives may deserve, they are far from the only people to struggle with such low-probability, high-cost events. Nearly everyone does. “These are precisely the kinds of events that are hard for us as humans to get our hands around and react to rationally,” Robert N. Stavins, an environmental economist at Harvard, says. We make two basic — and opposite — types of mistakes. When an event is difficult to imagine, we tend to underestimate its likelihood. This is the proverbial black swan. Most of the people running Deepwater Horizon probably never had a rig explode on them. So they assumed it would not happen, at least not to them.
Similarly, Ben Bernanke and Alan Greenspan liked to argue, not so long ago, that the national real estate market was not in a bubble because it had never been in one before. Wall Street traders took the same view and built mathematical models that did not allow for the possibility that house prices would decline. And many home buyers signed up for unaffordable mortgages, believing they could refinance or sell the house once its price rose. That’s what house prices did, it seemed.
On the other hand, when an unlikely event is all too easy to imagine, we often go in the opposite direction and overestimate the odds. After the 9/11 attacks, Americans canceled plane trips and took to the road. There were no terrorist attacks in this country in 2002, yet the additional driving apparently led to an increase in traffic fatalities.
When the stakes are high enough, it falls to government to help its citizens avoid these entirely human errors. The market, left to its own devices, often cannot do so. Yet in the case of Deepwater Horizon, government policy actually went the other way. It encouraged BP to underestimate the odds of a catastrophe.