Swapping Climate Models For A Roll Of The Dice

  • Date: 21/06/14
  • Doug L Hoffman, The Resilient Earth

One of the greatest failures of climate science has been the dismal performance of general circulation models (GCMs) at accurately predicting Earth’s future climate. For more than three decades huge predictive models, run on the biggest supercomputers available, have labored mightily and turned out garbage.

Their most obvious failure was missing the now almost eighteen-year “hiatus,” the pause in temperature rise that has confounded climate alarmists and serious scientists alike. So poor has been the models’ performance that some climate scientists are calling for them to be torn down and built anew, this time on different principles. They want to adopt stochastic methods—so-called Monte Carlo simulations based on probabilities and randomness—in place of today’s physics-based models.

It is an open secret that computer climate models just aren’t very good. Recently, scientists on the Intergovernmental Panel on Climate Change (IPCC) compared the predictions of 20 major climate models against the past six decades of climate data. According to Ben Kirtman, a climate scientist at the University of Miami in Florida and an IPCC AR5 coordinating author, the results were disappointing. As a report in Science put it, “the models performed well in predicting the global mean surface temperature and had some predictive value in the Atlantic Ocean, but they were virtually useless at forecasting conditions over the vast Pacific Ocean.”

Just how bad the models are can be seen in a graph that has circulated widely around the Internet. Produced by John Christy, Richard McNider, and Roy Spencer, the graph has generated more heat than global warming, with climate modeling apologists firing off rebuttal after rebuttal. Problem is, the models still suck, as you can see from the figure below.

Regardless of the warmists’ quibbles, the truth is plain to see: climate models miss the mark. But then, this comes as no surprise to those who work with climate models. In the Science article, “A touch of the random,” science writer Colin Macilwain lays out the problem: “researchers have usually aimed for a deterministic solution: a single scenario for how climate will respond to inputs such as greenhouse gases, obtained through increasingly detailed and sophisticated numerical simulations. The results have been scientifically informative—but critics charge that the models have become unwieldy, hobbled by their own complexity. And no matter how complex they become, they struggle to forecast the future.”

Macilwain describes the current crop of models this way:

One key reason climate simulations are bad at forecasting is that it’s not what they were designed to do. Researchers devised them, in the main, for another purpose: exploring how different components of the system interact on a global scale. The models start by dividing the atmosphere into a huge 3D grid of boxlike elements, with horizontal edges typically 100 kilometers long and up to 1 kilometer high. Equations based on physical laws describe how variables in each box—mainly pressure, temperature, humidity, and wind speed—influence matching variables in adjacent ones. For processes that operate at scales much smaller than the grid, such as cloud formation, scientists represent typical behavior across the grid element with deterministic formulas that they have refined over many years. The equations are then solved by crunching the whole grid in a supercomputer.
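
To make that description concrete, here is a minimal sketch of the grid-box bookkeeping, with a single invented field on a toy two-dimensional grid. It illustrates the idea only and is not drawn from any real GCM; the “parameterization” formula is made up for the example.

```python
# Toy illustration of the grid-box scheme described above -- not code from any
# real GCM. A single invented field ("temperature") lives on a coarse 2D grid;
# each box is nudged by its four neighbours (a crude diffusion step), and a
# deterministic formula stands in for a sub-grid parameterization.
import numpy as np

NX, NY = 36, 18      # roughly 10-degree boxes: far coarser than real models
DT = 0.1             # nondimensional time step

def parameterized_heating(T):
    """Stand-in for sub-grid physics: one fixed, deterministic formula."""
    return 0.01 * np.tanh(1.0 - T)   # gently relaxes each box toward T = 1

def step(T, kappa=0.2):
    """Advance the whole grid one time step using neighbour coupling only."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T)
    return T + DT * (kappa * lap + parameterized_heating(T))

T = np.random.default_rng(0).normal(1.0, 0.3, (NX, NY))  # arbitrary start state
for _ in range(1000):
    T = step(T)
print("domain-mean value after 1000 steps:", round(float(T.mean()), 3))
```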

It’s not that the modelers haven’t tried to improve their toys. Over the years all sorts of new factors have been added, each one piling more complexity onto the calculations and hence slowing the computation. But that is not where the real problem lies. The irreducible source of error in current models is the grid size.

Indeed, I have complained many times in this blog that the fineness of the grid is insufficient for the problem at hand. Many phenomena are much smaller than the grid boxes; tropical storms, for instance, represent huge energy transfers from the ocean surface to the upper atmosphere yet can be missed entirely. Other factors—things like rainfall and cloud formation—also happen at sub-grid scales.

“The truth is that the level of detail in the models isn’t really determined by scientific constraints,” says Tim Palmer, a physicist at the University of Oxford in the United Kingdom who advocates stochastic approaches to climate modeling. “It is determined entirely by the size of the computers.”

The problem is that to halve the size of the grid divisions requires an order-of-magnitude increase in computer power. Making the grid fine enough is just not possible with today’s technology.
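
The arithmetic behind that claim is easy to sketch, assuming the boxes are halved in both horizontal directions, the vertical levels are doubled to match, and the time step is halved to keep the numerics stable (the CFL condition):

```python
# Back-of-the-envelope cost of halving the grid spacing (illustrative only).
# Assumptions: twice as many boxes in each horizontal direction, twice as many
# vertical levels, and a time step halved for numerical stability, with cost
# proportional to the total number of box-updates.
horizontal = 2 * 2   # x and y directions
vertical   = 2       # more levels
time_steps = 2       # smaller time step -> more steps
print("cost multiplier:", horizontal * vertical * time_steps)   # 16x
```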

In light of this insurmountable problem, some researchers go so far as to demand a major overhaul, scrapping the current crop of models altogether. Taking cues from meteorology and other sciences, the model reformers say the old physics-based models should be abandoned and new models, based on stochastic methods, need to be written from the ground up. Pursuing this goal, a special issue of the Philosophical Transactions of the Royal Society A will publish 14 papers setting out a framework for stochastic climate modeling. Here is a description of the topic:

This Special Issue is based on a workshop at Oriel College Oxford in 2013 that brought together, for the first time, weather and climate modellers on the one hand and computer scientists on the other, to discuss the role of inexact and stochastic computation in weather and climate prediction. The scientific basis for inexact and stochastic computing is that the closure (or parametrisation) problem for weather and climate models is inherently stochastic. Small-scale variables in the model necessarily inherit this stochasticity. As such it is wasteful to represent these small scales with excessive precision and determinism. Inexact and stochastic computing could be used to reduce the computational costs of weather and climate simulations due to savings in power consumption and an increase in computational performance without loss of accuracy. This could in turn open the door to higher resolution simulations and hence more accurate forecasts.

In one of the papers in the special edition, “Stochastic modelling and energy-efficient computing for weather and climate prediction,” Tim Palmer, Peter Düben, and Hugh McNamara state the stochastic modeler’s case:

[A] new paradigm for solving the equations of motion of weather and climate is beginning to emerge. The basis for this paradigm is the power-law structure observed in many climate variables. This power-law structure indicates that there is no natural way to delineate variables as ‘large’ or ‘small’—in other words, there is no absolute basis for the separation in numerical models between resolved and unresolved variables.

In other words, we are going to estimate what we don’t understand and hope those pesky problems of scale just go away. “A first step towards making this division less artificial in numerical models has been the generalization of the parametrization process to include inherently stochastic representations of unresolved processes,” they state. “A knowledge of scale-dependent information content will help determine the optimal numerical precision with which the variables of a weather or climate model should be represented as a function of scale.” It should also be noted that these guys are pushing “inexact” or fuzzy computer hardware to better accommodate their ideas, but that does not change the importance of their criticism of current modeling techniques.
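
As a rough, purely illustrative mimic of the inexact-computing argument (not the authors’ actual scheme), one can carry a small-scale, noisy variable in single precision and check how little a large-scale budget cares:

```python
# Software mimic of the "inexact computing" argument (not the authors' scheme):
# keep a smooth, large-scale signal in double precision but carry a noisy,
# small-scale signal in single precision, then compare the totals.
import numpy as np

rng = np.random.default_rng(1)
large = np.sin(np.linspace(0.0, 2.0 * np.pi, 10_000))   # resolved, smooth part
small = 1e-3 * rng.standard_normal(10_000)              # unresolved, noisy part

total_full    = np.sum(large + small)                     # everything in float64
total_inexact = np.sum(large + small.astype(np.float32))  # small part rounded

print("change from storing the small scales in float32:",
      abs(total_full - total_inexact))
```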

So what is this “stochastic computing” that is supposed to cure all of climate modeling’s ills? It is actually something quite old, often referred to as Monte Carlo simulation. In probability theory, a purely stochastic system is one whose state is non-deterministic—in other words, random. The subsequent state of the system is determined probabilistically using randomly generated numbers, the computer equivalent of throwing dice. Any system or process that must be analyzed using probability theory is stochastic at least in part. Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Nowadays, the technique is used by professionals in such widely disparate fields as finance, project management, energy, manufacturing, engineering, research and development, insurance, oil & gas, transportation, and the environment.
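
The textbook example of the technique, included here only to show what throwing dice on a computer looks like, is estimating pi from random points in a square:

```python
# The classic Monte Carlo example: estimate pi by throwing random "darts" at
# the unit square and counting how many land inside the quarter circle.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
x, y = rng.random(n), rng.random(n)
inside = (x**2 + y**2) <= 1.0
print("pi estimate:", 4.0 * inside.mean())   # error shrinks like 1/sqrt(n)
```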

Monte Carlo simulation generates a range of possible outcomes and the probabilities with which they will occur. Monte Carlo techniques are quite useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and weather systems. They are also used to model phenomena with significant uncertainty in their inputs, which certainly applies to climate modeling. Unlike current GCMs, this approach does not seek to simulate natural, physical processes directly, but rather to capture the random nature of various factors and then run many simulations, called an ensemble.

Since the 1990s, ensemble forecasts have been used routinely to account for the inherent uncertainty of weather processes. This involves running multiple forecasts with a single forecast model, using different physical parameters and/or varying the initial conditions. Such ensemble forecasts have been used to help define forecast uncertainty and to extend forecasting further into the future than would otherwise be possible. Still, as we all know, even the best weather forecasts are only good for five or six days before they diverge from reality.
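
A toy version of such an ensemble, using the chaotic Lorenz-63 equations as a stand-in for a weather model (an illustration, not an actual forecast system), shows how runs started from nearly identical initial conditions drift apart:

```python
# Toy ensemble "forecast": integrate the chaotic Lorenz-63 system from many
# slightly perturbed initial conditions and watch the spread between members
# grow. Lorenz-63 is a stand-in for weather-like chaos, not a forecast model.
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations for a whole ensemble."""
    x, y, z = state[:, 0], state[:, 1], state[:, 2]
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.stack([dx, dy, dz], axis=1)

rng = np.random.default_rng(0)
members = 50
ensemble = np.array([1.0, 1.0, 1.0]) + 1e-4 * rng.standard_normal((members, 3))

for i in range(1, 4001):                      # 20 nondimensional time units
    ensemble = lorenz_step(ensemble)
    if i % 1000 == 0:
        spread = ensemble.std(axis=0).max()
        print(f"t = {i * 0.005:4.1f}   ensemble spread = {spread:6.2f}")
```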

An example can be seen in the tracking of Atlantic hurricanes. It is now common for the nightly weather forecast during hurricane season to include a probable track for a hurricane approaching the US mainland. The probable track is derived from many individual model runs.

Can stochastic models be successfully applied to climate change? Such models start from the current state of the system and generate many possible future trajectories. The outputs are driven by randomness filtered through observed (or guessed-at) probabilities. This, in theory, can account for random events such as tropical cyclones and volcanic eruptions more accurately than today’s method of applying an average guess across all simulation cells. But the probabilities are based on previous observations, which means the simulations are only valid if the system does not change in any significant way in the future.
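
A sketch of that idea, with entirely made-up numbers, might draw large volcanic eruptions from an assumed historical frequency separately for each ensemble member rather than smearing an average forcing across every run:

```python
# Sketch of stochastic forcing drawn from historical frequencies. All numbers
# here are invented for illustration: each ensemble member gets its own random
# draw of large volcanic eruptions over a century, instead of every run
# carrying the same smoothed-out average forcing.
import numpy as np

rng = np.random.default_rng(7)
members, years = 1000, 100
eruption_rate = 0.03           # assumed: ~3 large eruptions per century
cooling_per_eruption = 0.3     # assumed transient cooling, degrees C

eruptions = rng.poisson(eruption_rate, size=(members, years))
century_cooling = cooling_per_eruption * eruptions.sum(axis=1)

print("mean cooling contribution:", round(float(century_cooling.mean()), 3))
print("5th-95th percentile range:", np.percentile(century_cooling, [5, 95]))
```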

And herein lies the problem with shifting to stochastic simulations of climate change. It is well known that Earth’s climate system is constantly changing, creating what statisticians term nonstationary time series data. You can fit a model to previous conditions by tweaking the probabilities and inputs, but you cannot make it forecast the future, because the future requires a model of something that has not yet taken form. Add to that the nature of climate according to the IPCC: “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.”
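
A toy example of the nonstationarity problem, with invented numbers: calibrate a simple statistical model on one climate regime and it is silently wrong once the regime shifts.

```python
# Toy nonstationarity: calibrate a simple statistical "model" (here, just the
# mean warming rate) on one climate regime, then let the regime shift.
# All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
warming_regime = rng.normal(0.2, 0.05, 50)   # steady warming
hiatus_regime  = rng.normal(0.0, 0.05, 50)   # then the trend stalls
record = np.concatenate([warming_regime, hiatus_regime])

calibrated_rate = record[:50].mean()         # fitted on the first regime only
print("calibrated warming rate :", round(float(calibrated_rate), 2))
print("error in hindcast period:", round(float(abs(record[:50].mean() - calibrated_rate)), 2))
print("error after regime shift:", round(float(abs(record[50:].mean() - calibrated_rate)), 2))
```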

If such models had been constructed before the current hiatus—the 17+ year pause in rising global temperatures that nobody saw coming—they would have been at as much of a loss as the current crop of GCMs. You cannot accurately predict that which you have not previously experienced, measured, and parametrized, and our detailed climate data are laughably limited. With perhaps a half century of detailed measurements, there is no possibility of constructing models that would encompass the warm and cold periods of the Holocene interglacial, let alone the events that marked the last deglaciation (or those that will mark the start of the next glacial period).
