Belmont Club

Fire and Ice

Der Spiegel examines the chain of events that led to the cancellation of 17,000 flights over Europe, including the diversion of medevacs from Afghanistan and the rerouting of the German Chancellor’s flight home. Ash clouds from an Icelandic volcano disrupted flights all over Europe. The question is whether policy makers over-reacted to the threat. As volcanoes go, Eyjafjallajökull was regarded by Icelandic volcanologists as “a weary old man”. Its recent eruption was unremarkable.

Ash from the volcano’s plume has reached an altitude of only about 10 kilometers (six miles), not high enough to reach the stratosphere … images taken by the Eumetsat satellite … concluded that Iceland’s Eyjafjallajökull has spewed 2,000 tons of sulphur dioxide into the air. Pinatubo spouted 10,000 times that amount.

These facts are clear in hindsight. But when the eruption was first reported it triggered a series of remarkable precautionary events driven by predictions from the British Met Office’s supercomputer. The Telegraph explains how that prediction cascaded through the European bureaucracy.

The decision was based on a computer model operated by the Meteorological Office’s Volcanic Ash Advisory Centre, which suggested there was a cloud of ash covering northern Europe. This prompted a warning from the Met Office, which triggered the wider European ban, via Eurocontrol, the Brussels-based air traffic control centre.

Once the estimate was taken as the best available knowledge, the shutdown of the air transportation system mirrored the spread of the alarm through the system. A cloud of information — whether right or wrong we will examine in a moment — drifted like a virtual ashfall across the continent’s airports. Spiegel takes up the story.

It all began midday on Wednesday, when a telephone rang in Exeter, southern England. Icelandic meteorologists were calling to inform their British colleagues at the Met Office, the United Kingdom’s national weather service, that Eyjafjallajökull was spouting ash and a cloud of volcanic dust was blowing eastward from Iceland. …

The meteorologists immediately put their supercomputer on the job, feeding it measurement data, weather forecasts and satellite images. Fifteen minutes later, they had their first forecast of how the dust cloud would probably spread. A warning was sent out to airlines at 2 p.m., long before the cloud reached the European continent…. On Thursday morning, air traffic authorities closed Scottish airspace. Shortly afterward, the skies above London also experienced a state of quiet such as the city hadn’t known in decades.

By the time the dust settled that single “weary old man” of a volcano had accomplished what all the terrorists in the world had failed to do. The director of Germany’s Cologne-Bonn airport said he had never seen anything like it, even on September 11.

Heathrow, Paris, Frankfurt, Schiphol and all of Europe’s other major hubs came to a standstill on Friday afternoon. Airlines canceled 17,000 flights, while Frankfurt and Amsterdam airports set up thousands of camp beds. Losses for airlines are estimated at up to a billion dollars.

German Chancellor Angela Merkel was forced to interrupt her flight home from a visit to the US, landing in Lisbon instead. A Medevac Airbus air ambulance carrying injured German soldiers home from Afghanistan only made it as far as Istanbul.

The military historian Max Hastings wrote that “the great volcanic shutdown was the price we pay for a society that overreacts to any risk”. Hastings argued that societies had forgotten the concept of accepting risk. An accident, no matter how statistically insignificant, could be magnified by press coverage into a Greek tragedy. The result was that many systems, including those which were already unprecedentedly safe, incurred huge marginal costs to attain the last word in perfection.

But perhaps risk is the wrong word. Tradeoffs only apply when choosing between outcomes governed by well-known probabilities. We accept a certain number of fatal reactions to vaccines because more lives are saved by vaccination than are lost to it. But what if we don’t know the probabilities? What if we can’t know the probabilities? This describes the worst of the over-reactions Hastings cites, where public policy is made on the basis of estimates, projections or models so inaccurate as to be meaningless. The fiascos he lists are less failures to assess risk than failures to recognize uncertainty:

In 1988, health minister Edwina Currie almost destroyed Britain’s egg industry when she said that salmonella in eggs might cause a human catastrophe – only for it to be later discovered that salmonella could not get into eggs.

In 1996, Britain spent £7 billion killing millions of the nation’s cows in response to the alleged threat of CJD killing humans eating burgers made from cattle infected by BSE. We now know that the likelihood of this was almost infinitesimally slight.

In 2009, the government spent £1 billion on unneeded vaccines against swine flu, which we were told might kill half a million people. The SARS virus, said some ‘experts’, could prove more devastating to humanity than Aids. It was once suggested that bird flu might kill 150 million people worldwide.
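The vaccine tradeoff mentioned above is tractable precisely because its probabilities are measurable. A toy sketch makes the contrast concrete — all numbers here are hypothetical, chosen only for illustration, not drawn from any real vaccine data:

```python
# Toy illustration of a risk tradeoff with KNOWN probabilities.
# Every figure below is hypothetical, for illustration only.
population = 1_000_000
p_fatal_reaction = 1e-6         # assumed: 1-in-a-million fatal vaccine reaction
p_death_if_unvaccinated = 1e-3  # assumed: disease mortality without the vaccine

expected_deaths_vaccinated = population * p_fatal_reaction
expected_deaths_unvaccinated = population * p_death_if_unvaccinated

# With well-known probabilities the tradeoff is simple arithmetic:
# 1 expected death versus 1,000, so vaccination wins decisively.
assert expected_deaths_vaccinated < expected_deaths_unvaccinated
print(expected_deaths_vaccinated, expected_deaths_unvaccinated)
```

Under Knightian uncertainty no such calculation is possible: the probabilities themselves are unknown, so the expected values simply cannot be computed, and the arithmetic above has nothing to work on.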

University of Chicago Professor Frank Knight described the distinction between risk and uncertainty in these words. “Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated…. The essential fact is that ‘risk’ means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character … It will appear that a measurable uncertainty, or ‘risk’ proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all.” In other words, Knight wanted to differentiate between risks we could measure and those which we could not estimate. Donald Rumsfeld conveyed the same idea much more eloquently and comprehensively.

As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don’t know
We don’t know.

But Rumsfeld was one of the few officials willing to admit there were limits to his knowledge. He knew there were categories of risk and uncertainty and was prepared to deal with each. Lesser bureaucrats and many institutions are less secure. They simply cannot operate without a Number, even if that Number is hogwash. It was important to pretend to know, to “be in charge”, rather than endure the humiliation of confessing ignorance. But all the power of supercomputers cannot save policy makers from the fact that they will sometimes have to face Knightian uncertainty. In those circumstances leaders can sometimes get it wrong and should be forgiven their errors, because they had no choice but to act in the presence of a threat but in the absence of intelligence. They would be more easily forgiven, though, if they admitted upfront that they were spinning the wheel, turning the cards and flying by the seat of their pants because they had no other choice. Perhaps the most interesting example of a decision under uncertainty is Pascal’s Wager.

Pascal then asks the reader to analyze his position. If reason is truly corrupt and cannot be relied upon to decide the matter of God’s existence, only a coin toss remains. In Pascal’s assessment, placing a wager is unavoidable, and anyone who is incapable of trusting any evidence either for or against God’s existence, must at least face the prospect that infinite happiness is at risk. The “infinite” expected value of believing is always greater than the expected value of not believing.
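In modern decision-theoretic notation the wager is often written out like this — a standard formalization, not Pascal’s own, with symbols introduced here for illustration:

```latex
\begin{aligned}
E[\text{believe}]     &= p \cdot \infty + (1 - p)\,(-c) = \infty
  \quad \text{for any } p > 0, \\
E[\text{not believe}] &= p \cdot 0 + (1 - p)\,f = (1 - p)\,f
  \quad \text{(finite)},
\end{aligned}
```

where \(p\) is the probability that God exists, \(c\) the finite cost of belief, and \(f\) the finite gain of disbelief. Because \(E[\text{believe}]\) is infinite for any \(p > 0\) while \(E[\text{not believe}]\) is always finite, believing dominates no matter how small \(p\) is — which is why the wager goes through even when \(p\) itself is unknowable.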

Frank ignorance is often preferable to feigned knowledge. Mark Twain knew this long ago. “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” Vanna, spin the wheel.
