
The Precautionary Principle and Global Warming

How much risk from climate change should there be before we spend trillions of dollars to address the problem?

by
Rand Simberg


December 16, 2009 - 12:06 am

Those advocating that we upend the global (and particularly the U.S.) economy to stave off climate change resort to a concept called the “precautionary principle.” Simply stated, it is that if there is some risk of an irreversible disaster in taking an action, then that action should be foregone.

In this formulation, the risk is climate change that will be disastrous for humanity, and the action to be foregone is continuing to add the carbon dioxide that is ostensibly causing it to the planetary atmosphere. The beautiful thing about the principle (at least for them) is that, because it doesn’t assign any particular probability to the risk (i.e., it is uncertain), then it doesn’t matter whether the science backing it up is known to be valid, because even if the science has only a small probability of being correct, the principle applies.

The original advocate of the precautionary principle was the mathematician Blaise Pascal, who came up with a famous “wager.” To wit: we can’t calculate the probability of the existence of God, but if he exists, the cost of believing in him is small, and the wages of not doing so are eternal damnation. Therefore, it makes sense to believe.

Many in the centuries since have pointed out the flaws in the argument. For instance, there is a non-zero probability that God will consign you to perdition if and only if you believe in him. Thus, to avoid this fate, the only safe course is to be an atheist.

Which points out the flaw in the principle in general. While it doesn’t require a precise accounting of the odds, it also doesn’t necessarily provide guidance as to what to do if there’s any chance that the proposed cure (or “insurance policy”) is worse than the feared disease. And a good case can be made (as has been by people such as Bjorn Lomborg) that in fact there is not just an excellent chance, but almost a certainty that this is the case with most of the proposed solutions to anthropogenic global warming.

Despite those logical flaws (not that he has ever been a stranger to illogic), the latest AGW hystericist to employ the concept is Tom Friedman:

When I see a problem that has even a 1 percent probability of occurring and is “irreversible” and potentially “catastrophic,” I buy insurance. That is what taking climate change seriously is all about.

Well, I do that, too. But I buy insurance that has a price commensurate with the expected value (i.e., the cost of the disaster times the probability that it will occur). For instance, I’ll pay a few hundred bucks for a million-dollar policy against the small chance that I’ll kick off tomorrow. Presumably, Friedman assumes that the proposed palliatives of cap’n’tax or carbon taxes meet that criterion, but he doesn’t do the calculations for us, because he can’t. Warm mongers like him propose to spend trillions of dollars now to prevent an unknown amount of cost later, in defiance of the basic economic principle of discounting the value of future expenditures.
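The expected-value test described above can be sketched in a few lines. The numbers are purely illustrative (a made-up disaster cost and probability, not actuarial data):

```python
# A fair insurance premium is roughly the expected value of the loss:
# (cost of the disaster) times (probability that it occurs).
def fair_premium(disaster_cost: float, probability: float) -> float:
    """Expected value of the loss -- the most a rational buyer should pay."""
    return disaster_cost * probability

# A million-dollar policy against a small chance of dying tomorrow
# is worth a few hundred bucks, as the essay says:
print(fair_premium(1_000_000, 0.0003))  # -> 300.0
```

The point is that the premium must be commensurate with the expected value; a policy that costs trillions against an unquantified risk fails the test by construction, because neither factor on the right-hand side is known.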

There is a variation on this fallacy, in fact. It goes: There is a crisis; something must be done! What we propose to do is something. Therefore, it must be done!

This invalid argument is otherwise known as a false choice, of course, because the alternative to the particular something being proposed is not nothing (even if one accepts the initial premise that there is a crisis about which something must be done). It is a variety of other somethings, some of which may be the something that actually solves the problem, even if the proposal on offer does not.

We saw this last January when many of the same people promoting AGW hysteria also used it to ram through the failed “stimulus” bill without reading it. It is now being used to justify taking over the sixth of the U.S. economy represented by the health care industry. All the while, these people have been lambasting their political opponents who offer more sensible alternatives as proposing that we do “nothing.”

The big problem here is that what we’re dealing with is not risk, in which the probabilities can be reliably quantified, allowing an expected value to be computed, but uncertainty, in which they cannot.

As an example, a thirty percent chance of rain represents risk. “It might rain, or it might not, but we have no idea what the probability is” constitutes uncertainty. It’s much easier to decide whether or not to take an umbrella in the first circumstance than the second.

For this reason, economists have come up with a more sophisticated technique for decision making in the absence of probabilities of outcomes. Rather than simply looking for the lowest maximum cost, they instead try to minimize how bad you’ll feel if you make the wrong decision — they minimize “regret.”

It’s based on the notion that when you make a decision, you shouldn’t compare it to some unattainable ideal of zero cost; you should compare it to the best decision you could have possibly made under the circumstances, whatever they turn out to be. This eliminates the one-sided oversimplification of Pascal’s Wager.

Take a simple case — do you take an umbrella when it might rain, or not?

Consider a classical game-theory cost matrix. The center columns are two potential states of the world, and the rows are the actions one can take. I’ve put in notional cost numbers simply to demonstrate the concept mathematically. For instance, there might be a higher cost of carrying an umbrella on a non-rainy day because of the increased risk of leaving it somewhere, since you don’t need it. Note that we are not restricted to two states or two actions; there could be many more of each. I simply chose the simplest case for the purpose of illustration.

                          State 1 (It Will Rain)   State 2 (It Won’t)   Maximum Cost
Action 1 (Take Umbrella)            3                       4                 4
Action 2 (Don’t)                    5                       0                 5
It looks like we can minimize our maximum cost by choosing Action 1, since four is less than five. That is the so-called “minimax” solution. But is that really the right decision?
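The minimax rule from the table above can be sketched directly, using the same notional costs:

```python
# Cost matrix from the umbrella example: action -> {state -> cost}.
costs = {
    "take umbrella":  {"rain": 3, "no rain": 4},
    "leave umbrella": {"rain": 5, "no rain": 0},
}

# Minimax: choose the action whose worst-case (maximum) cost is smallest.
minimax_action = min(costs, key=lambda action: max(costs[action].values()))
print(minimax_action)  # -> take umbrella (worst case 4 beats worst case 5)
```

As the essay notes, minimax picks Action 1 here; the question is whether guarding against the worst case is really the right criterion.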

Let’s derive a “regret” matrix from it. This is done by finding the minimum cost for each state and subtracting it from each cell in that state’s column. The minimum cost for State 1 is 3, so that column becomes three minus three (0) for the first row and five minus three (2) for the second row. That makes intuitive sense, since if you made the right decision for that state, you’ll have no regrets. The regret matrix for the example cost matrix is shown below:

           State 1   State 2   Maximum Regret
Action 1      0         4            4
Action 2      2         0            2
Note now that if we want to minimize regret, we should actually choose Action 2. Note also that this is independent of the relative probabilities of the two states. The regret analysis clarifies the choices. It also, at least in this case, shows why we don’t carry umbrellas everywhere at all times, unless we live in Seattle.

Assuming, of course, that we have the right numbers to put into the matrix. The problem is that, with the heretofore monomaniacal devotion to the flawed precautionary principle by the warm mongers, we haven’t even defined the rows and columns, let alone attempted to come up with the numbers.

Doing so would obviously be far beyond the scope of this brief essay, but I would suggest that there are at least four columns: the world is warming, the world is warming as a result of anthropogenic activities, the world is cooling, and the climate is varying up and down.

There are also at least four potential actions: slamming the brakes on carbon emissions, letting the market determine our energy choices, making plans for damage mitigation and remediation, and developing geoengineering means of global temperature control. Of course, there is a fifth choice: doing nothing (or rather, continuing on with our terrestrial affairs without regard to the future of the planetary climate).

That may be the best option, but there’s no way to know until we actually compare it to the others in an economically sensible way, which is to attempt to minimize how much regret we’ll feel if we end up, panicked, doing the wrong (some)thing. That should be the biggest precaution we want to take.

Rand Simberg is a recovering aerospace engineer and a consultant in space commercialization, space tourism and Internet security. He offers occasionally biting commentary about infinity and beyond at his weblog, Transterrestrial Musings.
