Two scientists, Gerard Roe and Marcia Baker, have a paper in Science this week that is a must-read for everyone in climate policy. Here’s the abstract:
Uncertainties in projections of future climate change have not lessened substantially in past decades. Both models and observations yield broad probability distributions for long-term increases in global mean temperature expected from the doubling of atmospheric carbon dioxide, with small but finite probabilities of very large increases. We show that the shape of these probability distributions is an inevitable and general consequence of the nature of the climate system, and we derive a simple analytic form for the shape that fits recent published distributions very well. We show that the breadth of the distribution and, in particular, the probability of large temperature increases are relatively insensitive to decreases in uncertainties associated with the underlying climate processes.
What that means, essentially, is that the climate system is so inherently complicated – because of internal feedbacks involving snow cover, clouds and water vapour in the atmosphere – that it’s just not possible to pin precise numbers on ‘x amount of CO2 = y degrees C of warming’. Here’s Scientific American this week, with an interview with the paper’s authors:
Some of these feedback processes are poorly understood—like how climate change affects clouds—and many are difficult to model, therefore the climate’s propensity to amplify any small change makes predicting how much and how fast the climate will change inherently difficult. “Uncertainty and sensitivity are inextricably linked,” Roe says. “Some warming is a virtual certainty, but the amount of that warming is much less certain.” Roe and his U.W. co-author, atmospheric physicist Marcia Baker, argue in Science that, because of this inherent climate effect, certainty is a near impossibility, no matter what kind of improvements are made in understanding physical processes or the timescale of observations.
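(For the technically minded: the engine of Roe and Baker’s argument is the standard feedback relation ΔT = ΔT0/(1 - f), where ΔT0 is the warming you’d get from doubled CO2 with no feedbacks and f is the total feedback factor. Even a modest, roughly Gaussian uncertainty in f gets stretched into a heavily skewed distribution of warming with a long high-end tail. Here’s a rough sketch of that effect – the parameter values below are illustrative assumptions, not numbers taken from the paper:

```python
import numpy as np

# Illustrative sketch (not the authors' code): warming dT = dT0 / (1 - f),
# where f is the total feedback factor. A Gaussian spread in f produces a
# skewed distribution of dT with a long tail towards large warming.

rng = np.random.default_rng(42)

dT0 = 1.2        # assumed no-feedback warming for doubled CO2, degrees C
f_mean = 0.65    # assumed mean feedback factor

for f_sigma in (0.13, 0.07):   # current vs hypothetically halved uncertainty in f
    f = rng.normal(f_mean, f_sigma, 1_000_000)
    f = f[f < 1]               # drop unphysical runaway cases (f >= 1)
    dT = dT0 / (1 - f)
    p_big = np.mean(dT > 4.5)  # chance of warming above 4.5 degrees C
    print(f"sigma_f={f_sigma}: median={np.median(dT):.1f} C, "
          f"95th pct={np.percentile(dT, 95):.1f} C, P(dT>4.5C)={p_big:.2f}")
```

Halving the uncertainty in f narrows things somewhat, but the distribution stays strongly skewed and the high end remains far above the median – which is the shape Roe and Baker are describing.)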
Why does this matter for policymakers? Because it puts a question mark over the current emphasis in policy debates on limiting warming to 2 degrees C (cf. the discussion of temperature limits at the recent UN climate summit). How can you limit warming to 2 degrees, or any other number, if you’re not sure what that equates to in terms of a parts per million ceiling on carbon dioxide in the air? Here’s Scientific American again:
…targets such as stabilizing atmospheric concentrations of CO2 at 450 parts per million (nearly double preindustrial levels) to avoid more than a 3.6 degree F (2 degree C) temperature rise are nearly impossible as well. There is no guarantee that such a target would achieve its stated goal. “Policymakers are always going to be faced with uncertainty and so the only sensible way forward to minimize risk is to adopt an adaptive policy,” argues climatologist Gavin Schmidt of the NASA Goddard Institute of Space Studies, “which adjusts emissions targets and incentives based on how well, or badly, things are going.”
So, while it still makes sense to use the IPCC estimate that limiting warming to between 2.0 and 2.4 degrees C means limiting CO2e levels to between 445 and 490 ppm (and hence reducing global emissions by between 60 and 85 per cent by 2050), what Roe and Baker’s research really underlines is this: focus on the CO2, not the temperature. For all that 2 degrees makes a nice advocacy position for NGOs, the problem with temperature limits has always been that they just don’t translate directly into emissions targets in the way that concentration limits do.
Which is why David and I have always called for stabilisation limits in ppm terms, not degrees C – and have also stressed that the key thing is to make the CO2 ceiling revisable in the light of emerging science.