By Rob Mitchum // October 21, 2013
The Earth’s climate is changing more dramatically than at any other point in recorded history. With no historical precedents to draw from, policy-makers have increasingly turned to computer models to help them strategize for an uncertain climate future. While these climate models have shown early success, scientists are constantly working to improve their accuracy, extend their predictions farther into the future and connect them across sectors to models of the economy, energy and agriculture.
Achieving these goals means getting under the hood of these models to better understand how they work – and how to make them better. At the third annual all-hands meeting of the Center for Robust Decision-Making on Climate and Energy Policy (RDCEP), held in early October in Chicago, computer scientists, economists, statisticians and geophysicists dug into the programming code and mathematical formulas that are the moving parts inside these complex models.
RDCEP’s mission is to create tools to help policy-makers make informed climate-related decisions in light of the deep uncertainty inherent to modeling the future. Now in its fourth year, the center is actively rolling out those tools for academics, policy-makers and the general public to use.
Some tools provide the glue that connects models from different disciplines. The Climate Emulator, presented at the meeting by University of Chicago geophysicist Liz Moyer and statistician Michael Stein, simplifies the complex outputs of state-of-the-art climate models for easier import into economic models. That same process also makes the emulator a useful online tool for public use, allowing people to work with “home versions” of the climate models used by experts.
Other models presented at the meeting already weave together simulations of climate, economics, agriculture and policy. In CIM-EARTH, presented by CI fellow Todd Munson, users can look at how adjusting border and carbon taxes or environmental regulations affects international trade, biofuel production or the generation of electricity from traditional or alternative sources. FABLE, developed by a team led by Purdue University economist Thomas Hertel, models how different climate scenarios affect the balance of global land use for crops, livestock, energy and forestry.
Other RDCEP researchers (including Nobel laureate Lars Peter Hansen, a week before he received the honor) also proposed new strategies and methods to improve the next generation of models, such as analyzing temperature spectra to simulate future changes in seasonal variability, or incorporating economic principles such as shock price elasticity and robust decision theory to address structural uncertainty. Those innovations highlight the challenge of making a climate model built on physical laws interact in a meaningful way with models of economics, which run on less universal principles.
“The climate system doesn’t give a damn who the Chairman of the Fed is,” said William “Buz” Brock from the University of Wisconsin. “Within three or four decades we have no idea what the future economy will look like.”
Some of these inconsistencies show up in multi-sector models that predict six degrees of warming – but, paradoxically, project the average American family income to reach $1.5 million by the year 2300. University of Chicago Law School Professor David Weisbach's preview of his "uncertainty in the social cost of carbon" paper with Moyer and RDCEP's Michael Glotter seeks to address this contradiction by looking at indirect effects of climate on economic sectors such as productivity and research.
Another interesting discussion focused on the trade-offs and relative value of making models ever more complex. Alan Sanstad from Lawrence Berkeley National Laboratory argued that the climate modeling community needs empirical standards for determining when one model is better than another, and in which cases adding more detail to a model actually results in improved predictions.
For an extreme example of simplicity trumping complexity, Leonard Smith of the London School of Economics previewed an upcoming paper in which he and colleagues determined that a pen-and-paper statistical model could perform as well as, if not better than, computationally complex climate models at predicting temperature changes on decadal time scales. The take-home message of this demonstration, he said, was that the big models of today are useful for general insight, but not so much for the more specific operational decisions that economists are interested in exploring.
“I think [the models] have only shown us that we haven’t found a reason not to be worried,” Smith said.
With the Intergovernmental Panel on Climate Change releasing the first installment of its fifth assessment report last month, climate models are poised at the start of another chapter. The RDCEP meeting demonstrated how much work will go into building the models that will drive the findings of the next report: improving their accuracy, compatibility and even their philosophical foundations. To give decision-makers the best tools possible to prepare for the changing climate, the tinkering never stops.
Photo by Mart Turnauckas, from Flickr.