Tamino

My 2012 temperature prediction: +0.65 C (for GISS)

It's mid-February 2012, and the various groups reporting global surface temperature data have all posted numbers that are on the low side compared with the 21st century so far, though still several tenths of a degree above the 20th-century average. NASA's Goddard Institute for Space Studies (GISS) in particular, whose index I've been following and posting guesses/predictions about for several years now, has just reported a January 2012 global land + sea temperature anomaly of 0.36 C, 0.15 degrees below the 2011 annual average of 0.51 C and 0.27 degrees below the record of 0.63 C set in 2005 and 2010.

So naturally, those who deny the scientific evidence for human-caused warming have been going on about cooling, how warming has stopped for many years, and so forth - though not as adamantly as they were back in 2008; that cooling proved very brief, so perhaps they have learned to be more cautious. If I were inclined to go with simple linear extrapolation and didn't believe the science on warming myself, the January 2012 number of 0.36 C would look like a pretty good bet for the whole of 2012 at this point - maybe even a bit high.

But I understand the science and know quite clearly that warming will continue (with some minor ups and downs) until we stop burning fossil carbon - and it may take a while even after that for the warming to actually stop. I've plugged some numbers in and come up with a science-based prediction for 2012: 0.65 C (plus or minus about 0.07). That middle value would break all previous records. Even at the low end it would make 2012 the 5th or 6th warmest year on record (just slightly cooler than the remarkable early warm year of 1998 in the GISS index). Why am I so sure 2012 will be so warm?

Predicting future temperatures

A few months ago Tamino at Open Mind posted a fascinating analysis of warming, obtained by fitting the various observational temperature series to a linear combination of El Nino, volcanic, and solar-cycle variations (using sunspot numbers as a proxy for the latter), plus an underlying trend, allowing for time offsets between each causative series and the temperature response. Year-to-year global average temperatures fluctuate significantly, by several tenths of a degree; taking these "exogenous" factors into account greatly reduces that variation. Not only does removing the "exogenous" components show the underlying trend more clearly, but it occurred to me that it also allows prediction of future temperatures with considerably more confidence than the usual guessing (though I've done well with that in the past), at least for a short period into the future.
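As a rough illustration of that kind of fit (a sketch of the general approach, not Tamino's actual code), here is how the regression might be set up in R; the data frame, column names, and lag values are all illustrative assumptions:

```r
# Sketch of a Tamino-style fit: monthly temperature anomaly regressed on
# lagged ENSO, volcanic, and solar indices plus a linear trend.
# Assumes a data frame 'd' with columns temp (GISS anomaly), enso (an ENSO
# index), aod (volcanic aerosol optical depth), ssn (sunspot number), and
# a decimal 'year' column -- all hypothetical names.
lag_by <- function(x, k) c(rep(NA, k), head(x, length(x) - k))

d$enso_lag <- lag_by(d$enso, 4)   # illustrative ~4-month lag
d$aod_lag  <- lag_by(d$aod, 7)    # illustrative ~7-month lag
d$ssn_lag  <- lag_by(d$ssn, 1)    # illustrative ~1-month lag

fit <- lm(temp ~ year + enso_lag + aod_lag + ssn_lag,
          data = d, na.action = na.exclude)
summary(fit)   # the 'year' coefficient is the underlying warming rate

# Temperature with the exogenous contributions removed: residuals plus
# the fitted trend line.
adjusted <- residuals(fit) + coef(fit)["(Intercept)"] + coef(fit)["year"] * d$year
```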

See below for a detailed discussion of what I've done with Tamino's model for the GISS global surface temperature series. In brief, I present the results of two slightly different models of the future: the first with no El Nino contribution beyond mid-2011, and the second with a pure 5-year-cycle El Nino (starting from zero in its positive phase) from mid-2011 on.

Year    Model 1    Model 2    (predicted GISS Jan-Dec global average temperature anomaly, C)
2011    0.576      0.578
2012    0.687      0.785
2013    0.718      0.840

While the two models diverge in later years, the final average temperature for 2011 should be close to 0.58 (similar to the temperatures in 2009). Temperatures in 2012 are likely to be much warmer - at least breaking the record of 0.63 set in 2005 and 2010, possibly (model 2) by as much as 0.15 degrees. With the continued waxing of the solar cycle and continued increases from CO2 effects, however warm 2012 is, 2013 should be warmer still (unless we get a big volcano or another strong La Nina then).
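For concreteness, the two scenarios differ only in the assumed ENSO series after mid-2011. Continuing the illustrative R sketch above (the variable names and the solar-cycle placeholder values are assumptions, not the numbers actually used), they could be generated like this:

```r
# Sketch of the two future scenarios: same fitted model, different assumed
# ENSO forcing after mid-2011 (names continue the illustrative sketch above).
months <- seq(2011.5, 2013.99, by = 1/12)
future <- data.frame(
  year    = months,
  aod_lag = 0,                            # assume no major volcanic eruptions
  ssn_lag = 60 + 40 * (months - 2011.5)   # placeholder: rising solar cycle
)

# Model 1: no El Nino contribution after mid-2011
f1 <- transform(future, enso_lag = 0)
# Model 2: a pure 5-year-cycle El Nino starting from zero in its positive phase
f2 <- transform(future, enso_lag = sin(2 * pi * (months - 2011.5) / 5))

pred1 <- predict(fit, newdata = f1)
pred2 <- predict(fit, newdata = f2)

# Annual (Jan-Dec) means, e.g. for 2012:
mean(pred1[floor(months) == 2012])
mean(pred2[floor(months) == 2012])
```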

One more reasonable constraint on two-box empirical models of Earth's climate

One of my more recent posts on the two-box model explored the space of possible underlying models for a given empirical fit by fixing the heat capacities of the two boxes and varying the heat transfer rate between them. Keeping the time constants positive restricts the range of allowed heat capacities considerably, and the forcing fraction (x) and temperature measurement fraction (y) provide further constraints, given the expectation that both must lie between 0 and 1 (and must have real solutions at all). Even among solutions satisfying those constraints, there is a further condition that the results look reasonable - as pointed out there and by Lucia, some of the solutions produce wildly different response levels for the two boxes, which seems unrealistic for systems that should roughly correspond to subcomponents of Earth's climate.
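For reference, a generic two-box system of the kind being constrained can be written as follows (a standard form chosen for illustration; the notation is not necessarily identical to that of the earlier posts):

$$
C_1 \frac{dT_1}{dt} = x\,F(t) - \lambda_1 T_1 + \beta\,(T_2 - T_1), \qquad
C_2 \frac{dT_2}{dt} = (1-x)\,F(t) - \lambda_2 T_2 + \beta\,(T_1 - T_2),
$$

with the observed temperature taken as $T_{\mathrm{obs}} = y\,T_1 + (1-y)\,T_2$. Here $C_1$ and $C_2$ are the box heat capacities, $\beta$ the heat transfer rate between the boxes, $x$ the fraction of the forcing applied to box 1, and $y$ the weight of box 1 in the measured temperature; the two time constants of the empirical fit are set by the eigenvalues of this linear system.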

Physics and its mathematical abstractions

I've been discussing in some detail here a mathematical model of the response of Earth's climate to radiative forcings, trying to address concerns expressed elsewhere about the need for such a model to be "physically realistic". In the case of the two-box model, a given fit of the response function to a two-time-constant decay curve could come from any of many different underlying physical models, each corresponding to a partitioning of Earth's climate system into two parts with different response rates. So the question has been whether any of these possible underlying physical models is in some way "realistic" or not. That essentially reduces to criteria on the magnitudes of the various constants and intermediate quantities in the model relative to the real components of our planet.
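Concretely, the empirical fit in question is to a step response with two time constants (again written in generic notation for illustration):

$$
T(t) = F_0 \left[ a_1 \left(1 - e^{-t/\tau_1}\right) + a_2 \left(1 - e^{-t/\tau_2}\right) \right],
$$

where $F_0$ is a step forcing, $\tau_1$ and $\tau_2$ are the fast and slow time constants, and $a_1$, $a_2$ the corresponding sensitivities. Many different combinations of heat capacities, heat transfer rate, and forcing/measurement fractions in a two-box system reproduce exactly the same four fitted numbers, which is why the fit alone cannot single out one underlying physical model.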

Heat transfer in the two-box model

The following proved a little long to be just an update to the previous post; I guess one should never say never. Nevertheless I don't anticipate a need for anything more on this model.

Playing with R trying to fit the modern temperature record to forcings

Ok, this time I'm going to start with the graph and explain what's going on afterwards. Seems to work for other folks...
[Figure: Global mean temperature and a fit to forcings]
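The general approach, as a rough sketch rather than the actual script behind the graph (the file names, column names, and the simple single-time-constant response are all illustrative assumptions):

```r
# Rough sketch of fitting the GISS temperature record to a net forcing series.
# File names, column names, and the crude lagged-response treatment are
# illustrative assumptions, not the actual script behind the graph.
temps    <- read.csv("giss_annual_anomaly.csv")   # columns: year, anom (C)
forcings <- read.csv("net_forcing.csv")           # columns: year, forcing (W/m^2)

d <- merge(temps, forcings, by = "year")

# Crude single-time-constant response: exponentially weighted average of past
# forcing, with the time constant chosen by scanning for the best fit.
smooth_forcing <- function(f, tau) {
  out <- numeric(length(f))
  for (i in seq_along(f)) {
    w <- exp(-(seq_len(i) - 1) / tau)          # weights for years i, i-1, ...
    out[i] <- sum(rev(w) * f[seq_len(i)]) / sum(w)
  }
  out
}

taus <- 1:40
rss  <- sapply(taus, function(tau) {
  fit <- lm(anom ~ smooth_forcing(forcing, tau), data = d)
  sum(residuals(fit)^2)
})
best <- taus[which.min(rss)]
fit  <- lm(anom ~ smooth_forcing(forcing, best), data = d)

plot(d$year, d$anom, type = "l", xlab = "Year", ylab = "Anomaly (C)")
lines(d$year, fitted(fit), col = "red")    # modeled response to the forcings
```

Scanning a range of time constants and keeping the least-squares best fit is simply the easiest way to let the data pick how quickly temperatures respond to the forcing.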
