Multi-proxy reconstruction of historic temperature data

Ref: Moberg et al, Nature 433, 613 (2005), and commentary by Anderson and Woodhouse, p. 587, same volume

The theory that humans are altering heat retention in the atmosphere through the excessive release of gases such as carbon dioxide (global warming) does not live or die by whether the temperatures seen at the beginning of the 21st century exceed any seen since the last ice age ended. But its more extreme interpretations are supported by the idea that nothing like the rapid variation and temperature extremes of the last century has been seen in the previous 2000 years. This is the so-called "hockey stick" graph used to such graphic effect in "An Inconvenient Truth": the temperature over the past 1000 years barely varies until the industrial revolution, when it starts to shoot up alarmingly.

The "hockey stick" has its scientific basis in the work of Mann and his co-workers, whose reconstruction of the last 1000 years' worth of temperatures from multiple temperature proxies (principally tree rings, often from bristlecone pines because of their long lives) was heavily used in the influential IPCC (Intergovernmental Panel on Climate Change, 2001: Climate Change: The Scientific Basis) report. A look around the web shows that Mann's work has huge numbers of detractors, centring on claims that his proxy data can only be reconstructed into the hockey-stick shape by a method with a considerable bias towards showing a large 20th-century anomaly. Unfortunately the language used by Mann's, and by extension the IPCC's, detractors is rather intemperate, which allows the global warming extremists to suggest that opposition to Mann's work is itself extremism.

Of course Mann's work runs counter to the historical perception that there was a climatic optimum in the Middle Ages, when monks grew grapes in Britain, and a little ice age in and after the 17th century, when the Thames regularly froze. But it is quite possible to point out that the examples I have given refer to Britain, or at least Northern Europe, and that the rest of the world may not have been experiencing any such temperature variability; the western world's historical cultural bias towards Europe would then explain the apparent contradiction between history and Mann.

So is Mann correct? Well, Mann and Jones had another go at a 2000-year multi-proxy analysis in 2003, and it shows more variability (about 0.5 °C), so the hockey stick is less pronounced. Pollack and Smerdon have tried a different proxy: they have sensitively measured borehole temperatures, from 300 sites originally and now over 800, and calculated equivalent surface temperatures back in time on the premise that

"Departures from the expected increase in temperature with depth (the geothermal gradient) can be interpreted in terms of changes in temperature at the surface in the past, which have slowly diffused downward, warming or cooling layers meters below the surface." (ref)
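The premise can be illustrated with the textbook half-space solution of the heat equation: a step change ΔT at the surface, a time t in the past, is felt at depth z as ΔT·erfc(z / 2√(κt)), where κ is the thermal diffusivity of rock. This is only my sketch of the physics behind the method, not Pollack and Smerdon's actual inversion, and the numbers (κ = 1e-6 m²/s, a 1 K warming 250 years ago) are illustrative:

```python
import math

KAPPA = 1.0e-6   # thermal diffusivity of rock, m^2/s (a typical value)
YEAR = 3.156e7   # seconds per year

def anomaly_at_depth(z_m, dT, years_ago):
    """Temperature anomaly at depth z_m from a surface step of dT applied
    years_ago: the half-space solution dT * erfc(z / (2*sqrt(kappa*t)))."""
    t = years_ago * YEAR
    return dT * math.erfc(z_m / (2.0 * math.sqrt(KAPPA * t)))

# A 1 K surface warming that began 250 years ago (roughly the industrial era):
for z in (0, 50, 100, 200, 500):
    print(f"{z:4d} m: {anomaly_at_depth(z, 1.0, 250):.3f} K")
```

At the surface the full 1 K is seen, but the signal has all but vanished by a few hundred metres, which is why boreholes of that depth record roughly the last few centuries of surface temperature history.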

Their data suggest a 1 °C difference between the 1961–90 average and the little ice age. It has been said that Pollack sees this as a confirmation of global warming, because 80% of that 1 °C change took place after 1750, the onset of the industrial revolution.

An interesting approach to reconstructing temperature data from proxies is that of Moberg et al (see the paper referenced above). They take the view that proxy information falls into two groups: that with good high-frequency resolution (tree rings), and that with good low-frequency trends (lake, ice or seafloor layering). In effect they have said slightly more than that – that tree-ring data do not offer reliable information over the longer time frame, and that layering data are unreliable in the short term – because they have abandoned the low-frequency (>80-year) portion of the tree-ring data and the high-frequency (<80-year) portion of the layering data. That ice or sediment layering is unreliable on shorter time scales is perfectly reasonable because of layer mixing, but I don't know whether abandoning the longer-term trends of the tree data is valid. I suppose that over longer time scales the physical environment the trees find themselves in can change. However, this does mean, as one blog put it:

"that means the low-res proxies are doing all the work, and the tree rings are just adding a pretty-looking fringe of noise that your eye reads as sort-of error bars"

I do not understand why the authors did not allow some overlap in blending the two sets of data, but they did not. So although the high-frequency data are interesting, what they have effectively produced is a new method for blending various layering or sediment proxies, one that retains more detail than simple averaging can.

How did they do it? Well, the layering proxies will be stronger on some time scales than on others, and averaging, as can be seen from their figure 2a, simply smooths away detail that any one proxy might validly contain. The method Moberg et al chose was wavelet analysis, a transformation technique that resolves both frequency and time information from a signal, so that the differing frequencies of temperature variation that emerge from the different proxies are not smoothed away but retained in the post-transform averaging.
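A toy illustration of what the transform buys you (my own sketch, not the authors' code): a continuous wavelet transform with the Mexican hat wavelet localises an oscillation at its characteristic scale, rather than smearing it across the record. Here a synthetic yearly "proxy" containing a 64-year cycle shows its power concentrated near the corresponding scale (for the Mexican hat, scale ≈ period·√2.5/2π):

```python
import numpy as np

def cwt_mexh(x, scales):
    """Continuous wavelet transform with the Mexican hat wavelet,
    computed by circular convolution via the FFT (L2-normalised)."""
    n = len(x)
    t = np.arange(n) - n // 2
    X = np.fft.fft(x)
    coeffs = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        u = t / s
        psi = (1.0 - u**2) * np.exp(-u**2 / 2.0) / np.sqrt(s)  # Mexican hat at scale s
        coeffs[i] = np.fft.ifft(X * np.conj(np.fft.fft(np.fft.ifftshift(psi)))).real
    return coeffs

# A "proxy" sampled yearly for 512 years containing a 64-year oscillation:
years = np.arange(512)
signal = np.sin(2 * np.pi * years / 64.0)
scales = np.arange(2, 60)
power = (cwt_mexh(signal, scales) ** 2).mean(axis=1)
print("power peaks at scale", scales[np.argmax(power)])  # near 64*sqrt(2.5)/(2*pi) ~ 16
```

The time axis of the coefficients is preserved too, which is the point: a burst of 64-year variability in only part of the record would light up only that part of the scale-16 row.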

The authors' mother wavelet was the Mexican hat, and their scale (1/frequency) was resolved into 22 "voices" ranging from 4 to 8000 years. Voices 1–9 (<80 yr) were taken from the tree-ring data and voices 10–22 (>80 yr) from the layering data, each set averaged across its proxies and then put together to provide a full 1–22 series that could be inverse-transformed to create a dimensionless signal. This was scaled against Moberg's revision of the Northern Hemisphere 1856–1979 instrument record to provide a temperature trace for the past 2000 years. The data were split between hemispheres (actually hemidemispheres! all their data were from the Northern Hemisphere), but they state that this made little difference, and I suspect it was only done because "gridding" was key to Mann's analysis. Each end was padded with an extra 300 years using the proxy average of the adjacent 50 years. This was done to avoid boundary effects, but it probably explains the flatness of the AD 0 end and the authors' exclusion of the 1925–2000 end of the low-frequency graph and the 1979–2000 end of the high-frequency graph from their results: the padding almost certainly meant that their reconstruction did not match the instrument record well at the ends.
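The blending step can be sketched in miniature (again my own toy code, not the authors': the signals are synthetic, my "80-year" cut lands at Mexican hat scale 20, and I use a crude delta-function inverse rather than a properly normalised inverse transform). A "tree-ring" series carries a reliable 32-year oscillation but a spurious long-term drift; a "layering" series carries a reliable 512-year swing but noisy year-to-year values. Taking short-scale coefficients from the first and long-scale coefficients from the second, then inverting the combined set, recovers both reliable components:

```python
import numpy as np

def cwt_mexh(x, scales):
    """Mexican hat CWT via circular FFT convolution (L2-normalised)."""
    n = len(x)
    t = np.arange(n) - n // 2
    X = np.fft.fft(x)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        u = t / s
        psi = (1.0 - u**2) * np.exp(-u**2 / 2.0) / np.sqrt(s)
        out[i] = np.fft.ifft(X * np.conj(np.fft.fft(np.fft.ifftshift(psi)))).real
    return out

n = 2048
t = np.arange(n)
fast = np.sin(2 * np.pi * t / 32.0)    # reliable short-period signal ("tree rings")
slow = np.sin(2 * np.pi * t / 512.0)   # reliable long-period signal ("layering")
rng = np.random.default_rng(0)
tree = fast + np.cumsum(rng.normal(0.0, 0.05, n))   # trees: spurious long-term drift
layer = slow + rng.normal(0.0, 0.5, n)              # layers: noisy short-term values

scales = np.arange(2.0, 400.0, 2.0)
short = scales < 20.0    # scale 20 ~ an 80-year period for the Mexican hat
blend = np.where(short[:, None], cwt_mexh(tree, scales), cwt_mexh(layer, scales))

# Crude delta-function inverse: sum coefficients over scales, weighted by s^(-3/2)
recon = (blend / scales[:, None] ** 1.5).sum(axis=0)
corr = np.corrcoef(recon, fast + slow)[0, 1]
print(f"correlation of blended reconstruction with true signal: {corr:.2f}")
```

The drift and the noise each live at the scales that were discarded from their respective series, so neither survives into the reconstruction, which is the essence of the voice-blending trick.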

The Moberg et al data is plotted here.

In their paper the resulting temperature curve including the tree rings was not shown in enough detail to see whether the high-frequency oscillations from the tree rings matched the instrumental record or computer-model data, so I have redrawn it here. I have compared it with the Hadley Centre's reconstruction of global temperatures from the instrument record, which is not the data Moberg et al used to scale their curve. The match-up is not at all bad, except that Moberg et al show temperatures around 1920 as much warmer, and in the 1860s Moberg shows a strong warm peak where Hadley shows a cold trough.

The low-frequency curve is striking in that it shows a pronounced 0.8 °C drop from the 1961–90 average to a double minimum at 1600 and 1700, which the authors show in the paper to lie well within the errors of the Pollack and Smerdon borehole data. A double maximum falls at 1000 and 1100, with temperatures around those of the 1961–90 average.

In other words, the historical perception is reproduced, although the medieval climatic optimum temperatures do not exceed those of the 20th century.

Going back from AD 1000, the temperature declines by 0.5 °C to AD 600, with the exception of a peak at AD 700, and is then flat or rising slightly back towards AD 100.

Moberg et al conclude that:

“We find no evidence for any earlier periods in the last two millennia with warmer conditions than the post-1990 period... The main implication of our study, however, is that natural multicentenial climate variability may be larger than commonly thought, and that much of this variability could result from a response to natural changes in radiative forcings. ..... our findings underscore a need to improve scenarios for future climate change by also including forced natural variability”

So are Moberg et al right and Mann and his co-workers wrong? Well, their proxies average out to give a much larger range than Mann found, without even the need for wavelet analysis, but it is probable that only the interpretation of new proxies will tell. However, the congruence of Moberg's data with the borehole data and with historical perception [it would be interesting to see whether the other peaks and troughs in their data can be traced historically, in events like the movement of Germanic tribes westward or Roman retrenchment] makes it the more attractive dataset. A dataset that absolutely cannot be described as a "hockey stick".

 

Phew ......I didn’t expect to write that much.....hardly a fragment! SC 2008

 
