Roy Spencer's six trillion degree warming

In some followup discussion with Barry Bickmore (by email) and Kevin C (at Skeptical Science) it became clear we were missing something in the analysis of Roy Spencer's climate model. Specifically, Bickmore's Figure 10:

differed significantly from Spencer's 20th century fit even though he was ostensibly using the same parameters. If you look at the year 1900 in particular, Bickmore's Figure 10 has a temperature anomaly slightly above zero, while Spencer's is pegged at -0.6 C. Bickmore did use a slightly different PDO index than Spencer for his 1000-year graph (figure 10) - but more importantly, he took Spencer's -0.6 C temperature as the starting point in the year 993 AD, rather than as a constraint on temperature in the year 1900, as it actually was in Spencer's analysis. It turns out that to actually match Spencer's 20th century temperature fit the starting temperature in 993 AD needs to be extraordinarily, far beyond impossibly, low. We'll get to the details shortly.

First, returning to the results of my previous post, Spencer's model reduces to Eq. 18:

(18) T(t) = Te + A0 e^(-t/τ) + β Q(t)/(c h)

(see the previous post for definitions). Note that since c h/α = τ, the factor multiplying Q(t) can also be written as β/(τ α). The convoluted PDO index (PDOI) function, Q(t), is defined as (see eq. 16):

(19) Q(t) = ∫_(-∞)^t PDOI(s) e^(-(t-s)/τ) ds

and note that it depends on the PDO index going back in time (to negative infinity in the integral) - though values from more than a few multiples of the time constant τ in the past contribute only a minuscule amount to the total. Spencer seems to have used a particular source for the PDO index (downloadable here) which begins in 1900, so at least for the first few decades of the 20th century Q(t) is missing important information about pre-1900 index values. Bickmore came up with an alternate source (MacDonald et al 2005 - reconstructed from tree ring proxies) which goes back to the year 993 AD. These sources aren't identical in the 20th century where they overlap:

Figure 1: The two raw PDO indexes (R code)

but at least from about 1925 on they seem to be very close. They do strongly disagree (going in opposite directions) in the 1910-1920 period and there are some other short periods of divergence, but generally the proxy index seems to match the recent PDOI quite well.

Let's see what the convolution Q(t) looks like for these:

Figure 2: Recent PDO with convolutions Q(t), for several different time constants τ (R code)

Note that PDOI is unitless, while Q(t) has units of years. If PDOI were a constant value, then Q(t) would also be constant, larger by a factor of τ years. If PDOI were a step function going from 0 to 1 in a certain year, Q(t) would be a smoothed step approaching the constant value τ exponentially over a few multiples of τ years. So in general the convolution has the effect of smoothing, amplifying, and shifting forward in time the original index values, as we see in this figure.
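The constant and step-function limits described above are easy to check numerically. Here is a minimal sketch of the Eq. 19 convolution with annual steps (the post's figures use R; this illustrative version is Python, and the function name is my own):

```python
import math

def convolve_q(pdoi, tau):
    """Annual-step approximation of the Eq. 19 convolution,
    with the index treated as zero before the series begins."""
    q, acc = [], 0.0
    decay = math.exp(-1.0 / tau)
    for p in pdoi:
        acc = acc * decay + p   # decay the accumulated history, add this year
        q.append(acc)
    return q

# A step from 0 to 1 is smoothed and amplified: Q climbs toward ~tau
# (30 here) exponentially, reaching about (1 - 1/e) of it after tau years.
step = [0.0] * 50 + [1.0] * 300
q = convolve_q(step, tau=30)
```

With τ = 30 the tail of q settles near 30 (the discrete sum actually converges to about τ + 1/2, close enough for this purpose), which is the "factor of τ years" amplification described above.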

The effect over the 1000-year record is even more dramatic:

Figure 3: Proxy PDO with convolutions Q(t), for several different time constants τ (R code).

The red curve (τ = 30 years) represents the value Spencer found in fitting 20th century temperatures, and corresponds to Bickmore's Figure 10 curve (reproduced above) except over the first few decades, where the transient from model temperature initialization has an impact. In particular you can see that the 20th century peak in Q(t) is nothing special; in fact (with this 30-year time constant) it has been almost stable since about 1700, with about half the variation of the high peak around 1600 or the dip around 1200. And as Bickmore noted, Q(t) had completely the wrong sign - it was strongly negative - during the putative Medieval Warm Period (typically attributed to the period 950 to 1250 AD).

To provide some clarity and avoid the slight differences between the recent PDO and the 1000-year proxy PDO, I spliced the two series together in the following graph, which compares Q(t) values for the recent PDOI alone, and including pre-1900 values from the MacDonald data set (this time only looking at τ = 30 years):

Figure 4: Spliced PDOI with convolution Q(t) for the spliced series, compared with the recent-data-only Q(t), for τ = 30 years (R code).

The pre-1900 data has much the same effect on Q(t) as Spencer's added initial temperature T0: the differences disappear exponentially over time, so that by mid-century the two curves are quite close. The main difference is that using the real (MacDonald) data, positive pre-1900 PDO values push temperatures in 1900 more positive, not negative as Spencer chose to do in his fit.

We can now use Spencer's parameters and equation 18 above to turn these Q(t) curves directly into fitted temperature curves. Spencer's parameter values were:
α = 3.0 W/m^2K
β = 1.17 W/m^2
τ = 30 years
A0 = -0.6 C (temperature in year 1900, if we define time as t = (year - 1900))

and using these with the spliced and recent-only Q's gives:

Figure 5: Temperature from Eq. 18 with Spencer's parameters, using two different forms for Q(t). The red curve uses the spliced Q(t) with an additional initial downward shift so A0 = -0.711245 C (R code).

Note that Spencer's model using the spliced PDO index as forcing (the green curve) shifts temperatures up slightly from the result with a PDO of zero before 1900, so the "initial" temperature with that PDO is no longer -0.6 in 1900, and the model doesn't quite match what we had for the 20th century. That can be fixed by making the A0 parameter in Spencer's model slightly more negative, resulting in the red curve here, which matches the original 20th century fit beautifully.
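The size of the required shift follows directly from Eq. 18 evaluated at t = 0 (year 1900). A back-of-envelope Python sketch - the Q(1900) value below is illustrative, inferred from the size of the shift rather than read from the actual spliced data:

```python
alpha, beta, tau = 3.0, 1.17, 30.0   # Spencer's parameter values

def a0_for_target(q_1900, target=-0.6):
    """At t = 0, Eq. 18 gives T(1900) - Te = A0 + beta * Q(1900) / (tau * alpha),
    so holding T(1900) - Te at the target means A0 must absorb whatever
    pre-1900 contribution the spliced Q(t) carries in."""
    return target - beta * q_1900 / (tau * alpha)

# A positive pre-1900 PDO leaving Q(1900) around +8.7 years would push
# A0 from -0.6 down to roughly the -0.71 used for the red curve.
print(round(a0_for_target(8.7), 2))  # -0.71
```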

Bickmore was originally looking at this, with his figure 10 shown at the start here, to see how well Spencer's model did in hindcasting past temperatures. Obviously, given the Medieval Warm Period issue, it doesn't do very well to start with. But now, with the analytic form, we can hindcast exactly rather than having to initialize in the year 993 and integrate forward. Spencer's model (remember, fitted to 20th century temperatures) does the following when projected backward into the 19th century:

Figure 6: Temperature from Eq. 18 with same parameters as Figure 5, projecting back a century (R code).

Oops. That A0 term is wreaking havoc here. While the PDO index and Q(t) on their own are meandering around in a way that keeps temperatures somewhere close to zero anomaly, the actual model Spencer used projects that global temperatures in the year 1800 were 20 to 25 degrees C colder than present! That's several times larger than the global temperature change associated with the ice ages! I think we can rule that out.
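The blow-up is driven almost entirely by the A0 e^(-t/τ) transient. A quick Python check (using A0 ≈ -0.71, roughly the Figure 5 refit; the remaining few degrees of the 20-25 C seen in the figure come from the Q(t) term):

```python
import math

tau = 30.0
a0 = -0.71   # roughly the refitted initial-temperature term from Figure 5

def transient(year):
    """The A0 * e^(-t/tau) term of Eq. 18, with t = year - 1900."""
    return a0 * math.exp(-(year - 1900) / tau)

# Forward in time the term dies away harmlessly...
print(round(transient(1950), 2))   # -0.13
# ...but backward it grows exponentially: around -20 C by 1800.
print(round(transient(1800), 1))   # -19.9
```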

I asked Dr. Bickmore to re-do his Figure 10 calculation to see how low a temperature was needed in his starting year 993 to match Spencer's 20th century temperature graph, with a value of -0.6 C in 1900. He sent me the following:

Figure 7: The black curve is observed temperatures since 1850; red curves are temperatures from Bickmore's matlab implementation of Spencer's model with initial temperature anomalies of -1 to -6 trillion degrees C in the year 993 AD.

It turns out you need to set the starting temperature to negative six trillion degrees in 993, in order to match Spencer's model for the 20th century. 6 trillion degrees. Wow. Now that's global warming!



OK, my turn to play devil's advocate.

Suppose we didn't have the MacDonald data. What sort of PDO would be required before 1900 to produce the initial transient required by Spencer's graph?

I did a quick test of that, just eyeballing the result. It looks like 30 years of a constant PDOI of -0.7 will do the trick. Alternatively, if you paste the PDOI data from 1945-1975 into 1870-1900, then you get the right sort of shape.

So I guess I've changed my mind. The transient term does not automatically invalidate Spencer's model, although he does have to reject the MacDonald data. What is left is a model which says that climate is driven by PDO through a lag term.

Almost right, but you're forgetting the β/α factor. From Eq. 18, if A0 = 0 and T(1900) - Te = -0.6 C, then Q(1900) = -0.6 * τ α / β. τ is the amplification factor from the convolution integral, so the steady value of PDOI(t < 1900) that would be needed to give this Q(t) (whatever the value of the time constant τ) is:

PDOI(t < 1900) = -0.6 * α/β = -1.54

using Spencer's parameters. That's lower than the MacDonald curve ever goes in the 20th century, and only a couple of years in the recent index go that low. Over the whole 1004-year MacDonald record there are only 46 years with a PDO index below -1.5, i.e. 4.6%. The chance of staying that low for 30+ years in a row (to get Q(1900) that low you need the index to stay there for longer than τ) is astronomically small. It might be a little easier to believe with a shorter time constant, but it's still pretty implausible.
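Since τ cancels out of the steady-state relation, the arithmetic is a one-liner; a Python check:

```python
alpha, beta = 3.0, 1.17   # Spencer's fitted values, W/m^2/K and W/m^2

# A steady pre-1900 index gives Q(1900) ~ tau * PDOI, while Eq. 18 with
# A0 = 0 needs Q(1900) = -0.6 * tau * alpha / beta, so tau drops out:
pdoi_needed = -0.6 * alpha / beta
print(round(pdoi_needed, 2))  # -1.54
```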

And in any case, an exceedingly low late 19th-century PDO directly contradicts the data we do have (MacDonald)...

Oh, in retrospect it looks like my problem is simpler. I assumed I'd got my calculation right because my curves looked right. However the details of the curves were subtly wrong, because I was using the wrong tau (30 not 23).

Having fixed that I get an almost perfect match to Barry's curves. The shorter decay period means a much bigger transient, so I now need a PDO value of around -1.6 over 30 years to replace the transient, confirming your calculation.

And it is indeed unrealistic.

Nice work, Arthur! I think it is very useful you guys are "auditing" [ ;-) ] Spencer's work like this. Unlike the extreme nuttiness of our friends G&T, Spencer's ideas for a negative cloud feedback and confusion of cause and effect in the diagnosis of cloud feedbacks from the data seemed at least not completely implausible, so I am glad to see them getting looked at in more detail. I'll definitely be referring people here when they bring up Spencer in the future!

It indeed seems like his book title "...Blunder..." may have been a very apt description, albeit not quite in the way that he intended.

Very nice analysis, thanks for making it so clear!

Arthur in the Figure 5 caption you write

A0 = -7.11245 C

I think you mean (as the code says)

A0 = -0.71245 C

Nice work.

Thanks, corrected now!

Hmmm... the next thing we could do is deconvolute the millenial temperature record to find out what values for the PDO would be required to drive temperature in Spencer's model.

I used to do that sort of thing in Fortran: we could use an iterative Van Cittert-type method (which would require ad-hoc smoothing to stabilise it), or a Fourier method (with a simple Gaussian smooth), or maximum entropy with the convolution as a set of restraint equations.

It's not trivial, but probably much easier in R these days. But I'm still hacking stuff together in python and C++, I'm afraid. I could have a go at it though if you think we might learn anything worthwhile?

Q(t) is multiplied by β/(τ α), so maybe comparative graphs of Q(t)/τ for various τ would be more informative than those of Q(t).

I actually tried that first, but at least for the longer time periods the PDO tends to oscillate enough that the magnitude of Q(t) scales greater than τ^0 but less than τ^1, so it's hard to see what Q(t) is doing at all since those graphs (divided by τ) look rather flat. Maybe I'll post a few though.

A0 isn't a physical parameter of the model, it's an initial value used to wrap up the contribution of the unknown forcing prior to the start of the process, and it isn't valid before t = 0. You can't hindcast with this model.

It's exactly the same with an ordinary reversion to equilibrium. You start off out of equilibrium and approach it exponentially - but if you hindcast that model you find that the history has to be further and further out of equilibrium the further back you go, ultimately descending to temperatures below absolute zero. That's nonsense, of course. In reality, some small (unmodelled) perturbation shortly before t = 0 pushed it a small distance out of equilibrium, during which the differential equation was not valid. A0 is only used to set up the state at the first point the equation is considered valid, so you can't use it prior to that.

I somehow don't think Roy is going to be worried by any of this.

Yes - that was what my discussion with Kevin C up a little higher in the comment thread was about. The real problem is not the 6 trillion degrees; it's that Spencer was not free to choose A0 to be anything at all. Rather, it needed to be physically based on previous forcings, and if he is serious about the PDO being the only forcing (as his model implies for the 20th century), that requires a very unrealistically low PDO during the 19th century - when the MacDonald reconstruction (available long before Spencer wrote his book) shows a positive, not negative, PDO during that time.

That's a valid criticism - in a physical sense A0 should ideally be set equal to zero, and the PDOI function extended back to minus infinity. In the absence of such data, it should go back as far as you have valid data for, and there's an argument to be had over what you initialise it to. However, while I don't know about the MacDonald series, just looking at the data gives me reasons for suspicion. When data series from separate sources move in tandem, and then just at the point where they differ one makes a huge step jump, I have to suspect an inhomogeneity - a contributing data series has come to a start or end, or the source shifted, or instrumentation/methodology or some other source of systematic error has changed. If so, the values prior to the jump are suspect. This may be why Spencer chose to ignore the series.

I can't speak for Spencer on whether he is serious about a PDO-only model, but I rather doubt it. This sort of model is usually only put forward as a sort of proof-of-concept. The claim is commonly made that the modern temperature rise cannot be modelled by any means other than the CO2 rise; these sorts of models are demonstrations that it can be done with other quantities too. I know of no serious sceptic who thinks climate is a simple one-dimensional system with only one master cause to explain everything, or for that matter who asserts that increasing CO2 will have no effect. I cannot conceive of a knowledgeable scientist who would seriously claim to be able to model the whole world's climate with a simple one-line equation (although I could well believe that he considers it a serious candidate for a major contributing factor). A PDO correlation, or a solar correlation, or a reciprocal-pirates correlation for that matter, is only a starting point for discussion. They're only meant to offer alternative hypotheses, to show that the CO2-climate correlation is not conclusive, rather than having only a single hypothesis being considered and thereby winning by default.

Personally, I regard all this sort of stuff as "curve-fitting", whether applied to CO2 or anything else, and take von Neumann's dictum about being able to fit an elephant with sufficiently many free parameters as the most appropriate point of view. I'd be more interested in his observations in Spencer and Braswell 2010 where he uses a similar sort of model (and which was what I assumed at first you were talking about), than curve-fitting the global temperature anomaly to the PDO in some book.

As for the questions about whether he had his ideas reviewed or checked before publishing, I'm pretty sure he would have. But I know that he's complained about the difficulty of getting others in the mainstream to do so, as they seem to think that engaging with sceptics is a bad idea for whatever reason. There are a number of people who will try to shoot his ideas down, but few who will take the time to actually understand them first. Criticisms have to address the primary points he's making, which few that I've seen do.

It's good to see someone engaging with his ideas seriously. Sceptics need critical scrutiny to be kept on a straight track too. Please, keep going.

Between your and Bickmore's sanity-checking of Spencer's model, I think this makes an excellent case for adhering to the peer-review process, even informally. I have a hard time believing that Dr. Spencer actually spent time going over this with other knowledgeable people before either submitting it for publication or writing it into his book. If he did - as you say, he was working with an earlier model a few years back - then it doesn't seem as though he's taken much of the criticism to heart, and he might just have dismissed it for some reason or other.

Unfortunately it seems the rejection of his model has soured him on peer review altogether. He's stated quite publicly that he will be avoiding that kind of stringent policy, apparently in favor of less rigorously-vetted Letters-format publications like GRL (see: ). Something's off when a scientist thinks he can circumvent one of the most necessary steps in modern research while slyly implicating his peers as either totally dishonest or totally incompetent. As someone said in the previous thread, the idea that he might just be wrong doesn't seem to have gained a foothold for him. Maybe I should find a copy of his book to borrow and see if it yields any enlightenment in that regard.

What worries me is that this whole "take my message to the people" approach resembles the strategy used by anti-evolution proponents, along with the accusations of widespread fraud/mental deficiency in the mainstream scientists who obviously "don't get it." Intelligent Design advocates seem to have a similar allergy against publishing in peer-reviewed venues while promulgating a massive public campaign, with books and blogs being the primary outlet. It didn't surprise me to learn that Dr. Spencer is also an ID convert, but the similarities between his anti-climate change rhetoric and his views on mainstream biology are notable (see here: calling evolution "faith-based," and here: in which he called AGW a "faith-based" institution). It's one thing for a scientist to have wacky ideas about other fields outside of their specialty, but in this case I think it warrants closer examination because of the overlap between his opinions on evolution and his opinion of climate science, his own field. To me that's an interesting current to follow, but more for the sake of understanding the person than for revealing short-comings in his modeling prowess. Sorry if that seems unduly off-topic or ad hom.

In any case, I'm glad there have been several understandably-written critiques of Spencer's methodology. Ultimately these are far more useful in clearing up the "controversy."

It's hardly surprising that when t is allowed to get large and negative, the A0 term in T = Te + A0 e^(-t/τ) + β Q/(c h) blows up. Holding τ fixed at 30 years, 993 CE is at t/τ ~= -30, so the exponential term is around e^30, i.e. 11 trillion. Given that we know A0 has to be around -0.6 to fit Spencer's chosen reference point T(1900 CE) - Te ~= -0.6, T(993 CE) has to be around -6 or -7 trillion.
What this demonstrates is obvious, really: equation 18 cannot be made to fit temperatures well over a long timescale, many times τ. The magnitude of the A0 term will blow up for part of the period. This is unsurprising: temperatures just aren't an exponential return to equilibrium from some other value. To make equation 18 work at all when |t| ≫ τ, we have to set A0 identically to zero.
What is surprising is not that this model doesn't work, or even that Spencer thought it could work. We all make mistakes. The surprising thing is that Spencer persisted in his error for long enough to write a whole book about it. The analytical approach is so easy, and demonstrates the problem so quickly, and his results about a,b,h strongly suggest an analytical solution: why didn't he consider it?
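The back-of-envelope arithmetic above is easy to verify; a Python check (the transient term alone overshoots Bickmore's -6 trillion slightly, since his numerical runs also include the Q(t) contribution):

```python
import math

tau = 30.0
t = 993 - 1900                # time from 1900, so 993 AD is t = -907 years
growth = math.exp(-t / tau)   # the e^(-t/tau) factor evaluated in 993 AD
print(f"{growth:.2e}")        # ~1.35e+13

# The transient term alone, with A0 = -0.6 C:
print(f"{-0.6 * growth:.1e}")  # around -8e+12: minus trillions of degrees
```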

(Reposted from SKS)

Now that Arthur has shown us that the relationship between Spencer's forcing term (for which he uses the PDO) and his temperature series is a convolution, we can go the other way and predict what forcing would be required to produce a given temperature series, using a method called deconvolution.

There is a slight difficulty in that the deconvolution can be a bit unstable; however, if we stick to reconstructing a smoothed temperature series rather than trying to fit every annual variation, it is fairly easy.

So here is the HADCRUT2v data back to 1856, supplemented by the CRUTEMP2v data back to 1781. I used the smoothed data from here. I've predicted Spencer's forcing required to fit the data, and then re-run it through Arthur's version of Spencer's model to produce the temperature series. Here's the result:

What forcing is required to produce this temperature series? Here is the forcing, compared to the actual PDO (11 year smooth) over the last century:

There are a few interesting features: The match in the mid 20thC is reasonable as you would expect. A much higher forcing is required to reproduce all of the steep rise after 1970.

More interesting is the beginning of the 20thC. We see the strong and prolonged negative PDO up to 1900 required to produce the cooler temperatures of the late 19th and early 20th century. Not only is this an exceptionally deep PDO (unlike anything in the measured period), it also bears no relation to the measured data from 1900 to 1920.

Between 1800 and 1850 the values are very different from the MacDonald data.

Going back further to pre-1850 we see wild fluctuations in the smoothed PDO, to produce the sharp changes in the instrumental temperature record. However the record back then is based on only a few stations, so these ripples may not be genuine.

Details: I used 250 cycles of Van Cittert deconvolution with a 5 year moving average and a relaxation factor of 0.25. Using the smoothed temperature record means that stability isn't an issue as long as you stop before the RMS error starts increasing (due to the introduction of ripples). I padded the beginning of the series with 80 years of linear temperature starting from zero to minimize the transient, but nevertheless the first 5 years or so of the reconstruction should be ignored.
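For anyone who wants to reproduce the idea, here is a stripped-down Python sketch of the Van Cittert iteration applied to this model (all names here are my own; the per-cycle 5-year smoothing and the padding described above are omitted for brevity, and a smooth synthetic series stands in for the temperature record):

```python
import math

ALPHA, BETA, TAU = 3.0, 1.17, 30.0   # Spencer's parameters

def convolve(f, tau=TAU):
    """Annual-step version of the Eq. 19 convolution."""
    out, acc, d = [], 0.0, math.exp(-1.0 / tau)
    for x in f:
        acc = acc * d + x
        out.append(acc)
    return out

def model_temp(pdoi):
    """Eq. 18 with A0 = 0: T - Te = beta * Q / (tau * alpha)."""
    return [BETA * q / (TAU * ALPHA) for q in convolve(pdoi)]

def van_cittert(temps, relax=0.25, cycles=250):
    """Recover the driving index from a temperature series by repeatedly
    adding back a fraction of the misfit: f <- f + relax * (obs - model(f))."""
    f = [0.0] * len(temps)
    for _ in range(cycles):
        m = model_temp(f)
        f = [fi + relax * (ti - mi) for fi, ti, mi in zip(f, temps, m)]
    return f

# Round trip: convolve a smooth synthetic "index", then deconvolve it back.
truth = [math.sin(i / 20.0) for i in range(150)]
temps = model_temp(truth)
recovered = van_cittert(temps)
```

Slowly varying components come back accurately after a few hundred cycles, while sharp year-to-year wiggles converge much more slowly - which is why the smoothing and the stop-before-ripples rule described above matter in practice.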