A Calculator Just Whupped UN Supercomputers at Accurately Modeling Climate

Climate is complex.

This is true both in the conventional “wow, this is hard to figure out!” sense, and in the technical sense that people mean when they talk about “complex systems theory.” Climate is so sensitive to its initial conditions that it’s never feasible to compute exactly how the system will behave. Sometimes this is called “sensitive dependence on initial conditions,” or SDIC.


This is basically why we can’t predict if it will rain on Monday, yet we can confidently predict that it’ll be colder in Boulder in December than it was in July.

The difference here is between what is exactly true, and what is statistically true.

We can have — and have had — a 60 degree high in Boulder in either December or July, but it’s probably going to be close to the warmest day of the month in December and the coolest day of the month in July. So even with a system that is SDIC, it doesn’t mean you can’t deal with it scientifically at all; it just means you need to develop useful approximations — mathematical models of the system.

In an excellent blog post, Joseph Chipperfield lays out some ideas of what makes a good model. For the technical points, I’d refer you to the whole post, but he basically has four criteria:

1. Fit. The values the models compute have to be close to past observations.

2. Predictivity. The models then have to closely approximate observations in the future without further tuning.

3. Parsimony. The simpler the model, the better.

The trick here is what “simpler” means. This would make a good article in itself, but one key point is that the more parameters a mathematical model has, the easier it is to get a close fit by tuning the parameters. Sometimes you learn new things from this; other times, not so much. Tomas Milanovic’s post on simplicity at Climate Etc. goes into this. Another surprisingly good heuristic here is simply to ask how long it takes to compute an answer: the longer it takes, the more complex the model.


4. Sanity. You need to step back from the model and ask: “Is this crazy or what?” For example, world population was doubling roughly every 30 years in the 20th century. That’s a mathematical model:

population = starting population × 2^(interval in years / 30)

But if you start from a population of about 3 billion and set that interval to 1000 years, you get 3.246791823 × 10^19. That’s about 32.5 quintillion people.

This model predicts that in less than 1400 years, the mass of people on Earth would exceed the mass of Earth itself. The point here is that saying “population is doubling every 30 years” might fit the data well, but that model is too simple – carry it on very long, and it delivers obviously crazy results.
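To see just how crazy, here is a quick check of that claim. It’s a minimal sketch assuming a starting population of 3 billion and an average body mass of about 62 kg per person (round-number assumptions of mine, not figures from the article):

```python
# Sanity check on the "doubling every 30 years" model: how long until the
# total mass of people exceeds the mass of the Earth itself?
# Assumptions (mine, not from the article): starting population of 3 billion,
# average body mass of roughly 62 kg per person.
import math

STARTING_POPULATION = 3e9         # people (assumed)
AVG_BODY_MASS_KG = 62.0           # kg per person (assumed)
EARTH_MASS_KG = 5.97e24           # kg
DOUBLING_PERIOD_YEARS = 30.0      # from the model above

def population(years):
    """Population after `years`, doubling every 30 years."""
    return STARTING_POPULATION * 2 ** (years / DOUBLING_PERIOD_YEARS)

# After 1000 years:
print(f"Population after 1000 years: {population(1000):.3e}")  # ~3.2e19 people

# Years until the mass of people exceeds the mass of the Earth:
doublings_needed = math.log2(EARTH_MASS_KG / (AVG_BODY_MASS_KG * STARTING_POPULATION))
years_needed = doublings_needed * DOUBLING_PERIOD_YEARS
print(f"Mass of people exceeds Earth's mass after ~{years_needed:.0f} years")
# ~1350 years -- "less than 1400 years," as the text says
```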

[Figure: climate model projections compared with observed temperatures]

The current climate models fueling belief in manmade global warming do have fairly good “fit” to the data on which they were tested. However, the predictivity isn’t that great — see the recent warming “pause” or have a look at the figure above. They’re also hella complex, requiring thousands of hours of supercomputer computations.

Early this year, Christopher Monckton of Brenchley, Willie Soon of the Harvard-Smithsonian Center for Astrophysics, David Legates of the University of Delaware, and Matt Briggs, “Statistician to the Stars” and sometimes PJM contributor, published a paper in Science Bulletin (the Chinese equivalent of Science) entitled “Why models run hot: results from an irreducibly simple climate model”.


They took a different approach. Observing the issues with the current climate models, they constructed a very simple model working from first principles. “Irreducibly” here means it can’t get any simpler and still reflect basic physics. (If you want a detailed discussion of their model, read Rud Istvan’s post at Climate Etc.)

This model is about one step up from a “back of the envelope” calculation, since it requires taking a natural logarithm as well as some multiplication, but it’s easily done with a scientific calculator — or even a slide rule.

But it models actual temperature observations better than the complex models.

This is a pretty significant challenge to mainstream climate models: the Monckton, et al. model fits observations better and is wildly simpler. The mainstream models take thousands of hours of computation, while the “irreducibly simple” model takes a logarithm and a few multiplications. It’s arguably more predictive — it hasn’t been around long, but it wasn’t tuned to past data, and it “hindcasts” the “pause” far better than the traditional models do.

This, as you can imagine, is causing great consternation in the world of mainstream climate science.

In another paper, “Keeping it simple: the value of an irreducibly simple climate model,” published on August 6, Monckton, et al. answer the criticisms. You can read the press release Monckton wrote at Matt Briggs’ blog. (It’s a lovely bit of prose; Chris is a master of polemic.)


Brutally simplified, the underlying question is about one term in both the Monckton et al. model and the mainstream models: climate sensitivity to changes in carbon dioxide concentration.

This is expressed as the equilibrium climate sensitivity (ECS): the increase in global average surface temperature eventually expected for a doubling of atmospheric CO2 concentration. The IPCC models use a value of between 2°C and 6°C for ECS. The “irreducibly simple” model uses a value of about 1.2°C, derived, as I said, from basic physical principles.
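For the curious, that ballpark figure is easy to reproduce from textbook numbers. The sketch below uses the standard logarithmic CO2 forcing approximation and the approximate no-feedback (Planck) response; it is not the actual equation from the Monckton et al. paper, just an illustration of the calculator-level arithmetic involved:

```python
# A back-of-the-envelope, no-feedback climate sensitivity estimate.
# This is NOT the Monckton et al. equation -- just the textbook ingredients
# that give a number in the ~1.2 degree C neighborhood for a CO2 doubling.
import math

FORCING_COEFF = 5.35        # W/m^2, standard logarithmic CO2 forcing coefficient
PLANCK_SENSITIVITY = 0.31   # K per (W/m^2), approximate no-feedback (Planck) response

def warming_for_co2_ratio(c_over_c0):
    """Warming (K) for a given CO2 concentration ratio, with no feedbacks."""
    forcing = FORCING_COEFF * math.log(c_over_c0)   # radiative forcing, W/m^2
    return PLANCK_SENSITIVITY * forcing

# A doubling of CO2:
print(f"{warming_for_co2_ratio(2.0):.2f} K per doubling")  # ~1.15 K, about 1.2 degrees C
```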

The implication of this is that the impact of CO2 concentration is much less than the IPCC and the anthropogenic-climate-change-crisis crowd in general assume it to be. Which means that increasing CO2 concentration is not a crisis, and that doing things like massively increasing energy costs in the U.S. while trying to force third-world countries to stay third-world countries is unwarranted.

Now, be careful with this, because it isn’t proof that anthropogenic climate change is a hoax: the “irreducibly simple” model still fits with the basic science. That is, we still know that climate has changed over the last several hundred years, that increases in greenhouse gases like CO2 contribute to that change, and that we’ve got good reason to think humans are contributing to that increase.

What Monckton et al. have done is construct a simple physical model and show that it fits the observations better than the complex models do. In general, more parsimonious models are preferred, and models that better fit all observations are preferred.


Neither statement means this model is more “right” in some sense, but when the existing models are proving unpredictive, and are being preserved by means of occult heat in the oceans and pretty arbitrary adjustments to actual ocean observations, it’s time to think maybe, just maybe, a new model is worth looking at.
