the Air Vent

Because the world needs another opinion

Who’s in Denial

Posted by Jeff Id on November 5, 2009

William Connolley has been gracious and not snipped any of my comments in the Tiljander debate at his blog. I’m about ready to add him to my blogroll, so this isn’t an attempt to bash his blog. However, he wrote a beauty of an explanation of why it’s OK to read thermometers upside down again. It never ceases to amaze me how far people can go to reason themselves into almost any position. It’s like people who complain about the idiocy of government and continue to vote for more of it, as if that will fix it. Anyway, his rationale is entertaining.

Imagine a climate proxy, accurate over the last 2kyr, that shows (for example, let us suppose) a warm period around 1000 AD and which, undisturbed, would show the recent warming. Further suppose for definiteness that this proxy is of such a nature that increases in the proxy value represent increases in temperature. Imagine this proxy is contaminated with non-climatic signal over the last 200 years, enough that the climatic signal is overwhelmed. Suppose that this contamination is of such a nature that it leads to a strong decrease in the values of the proxy over the last 200 years. Such a proxy (call it A), fed into the Mea algorithm, will be flipped over (due to its negative correlation with recent instrumental temperature) and will contribute a net cold influence around 1000 AD. How much it contributes will depend on how well it correlates to recent times.

Now there isn’t anything wrong with this paragraph that I can see. He’s got a good handle on the multivariate nature of some of the regressions. Consider that last sentence (which is correct): how much it contributes depends on how well it correlates. Dead on for one of my biggest criticisms: these regressions are a form of data sorting, and are just as significant a no-no as the data-elimination sorting where data is physically thrown away. I think of the weightings of an MV regression as modulating the data (information) partially away rather than fully. The fact that the modulation occurs on a correlation basis makes the complaint about the method exactly the same as the complaint about the correlation-based elimination methods.
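The correlation-based orient-and-weight step being criticized here can be sketched in a few lines. This is a toy illustration of the general CPS-style idea only, not the actual Mann08 code; the synthetic "instrumental" series and the contamination are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "instrumental" record over a 100-step calibration window.
instrumental = np.linspace(0.0, 1.0, 100) + 0.1 * rng.standard_normal(100)

def cps_weight_and_orient(proxy_cal, target):
    """Toy CPS-style step: orient a proxy by the sign of its calibration-period
    correlation and weight it by the correlation magnitude.  An illustration of
    correlation-based sorting, not the actual reconstruction code."""
    r = np.corrcoef(proxy_cal, target)[0, 1]
    sign = 1.0 if r >= 0 else -1.0   # negative correlation => series gets flipped
    return sign, abs(r)

# A "contaminated" proxy: the true relation is positive, but a spurious negative
# trend overwhelms it in the calibration window (the Tiljander situation).
contamination = -3.0 * np.linspace(0.0, 1.0, 100)
proxy = instrumental + contamination

sign, weight = cps_weight_and_orient(proxy, instrumental)
print(sign)   # -1.0: the proxy enters the average upside down, weighted by |r|
```

The point of the sketch: the algorithm never asks whether the flip makes physical sense; the sign and weight both come from the modern correlation alone.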

Not good, in my opinion, but what can we do?

The next paragraph:

Now imagine a similar proxy, except the nature of the non-climatic contamination is such as to add a strong increase over the last 200 years. We’ll call it B. This time, the proxy won’t be flipped over, because its correlation to recent times will be positive. But the variance into the past will be strongly de-weighted (because we’ve just added an artificially large positive trend). So it will imply not much change around AD 1000. But now we see this, we can see that the same problem applies to series A: unless, by bizarre co-incidence, the negative non-climate signal just happens to match the true positive instrumental signal, the variance in the past will be wrong. And since we’ve had to assume that the non-climate signal overwhelms the climate one, it’s likely that the recent variance will be too large, so the past will be de-weighted.

So he makes the assumption that the positive trend in the contaminated signal is larger than the result. I’m not sure he’s looked at HadCRUT lately, and I’m not certain the assumption is valid. Let’s assume it is, though. So he’s saying that Tiljander doesn’t matter upside down, because the real climate signal (of which there isn’t one) is deweighted by some fraction (say 50 percent) and averaged. So the inverted signal is deweighted to the point that William has defined it as a red herring.

Well, everything was going well until the conclusion. There are four Tiljander proxies in Mann08; three of them were inverted, and they all have huge blades, even after taking the log of the varve thickness, which is one of the biggest leaps of faith I’ve witnessed, to convert it to temperature.

It’s rather amusing considering that the assumption is for deweighting when there is absolutely no evidence that deweighting occurred. In fact the Tiljander proxies (specifically the known bad-data portion) correlated well (strongly negatively) to temperature and therefore were flipped over and probably received reasonably sized weights. Even that’s not the point, though. We already know that each Tiljander series is 1 of 484 that passed correlation screening. Even with full weighting equal to every other proxy, each is 1/484th of the reconstruction. Nobody ever expected fixing Tiljander to fix Mann08. Mann08 is a disastrous attempt to kluge together a hockey stick, nothing more. It’s worth adding that there is a good chance the series was more heavily weighted than William asserts, but it still wouldn’t move the result very much.

The point is that the thing was used upside down. It is known that thermometers read right-side up (even in kindergarten), and therefore flipping them is an error. Even in kindergarten the teacher would tell you to flip it the other way round, Mikey. The physical meaning of the Tiljander proxy is that warm weather melts more glacier and carries more sediment. Flipping it upside down changes the physical interpretation to: warm weather freezes more water and carries less sediment.

In case you were wondering, this does not make sense. The most bizarre thing about it is the continued defense of this pile of poo for a paper.

Yesterday, in a fit of idiocy, I took a stroll around Real Climate. I found Michael Mann (who claims to be somewhat conservative) pounding on every conservative institution he could find about climate disinformation. How many analogies are there: pot and kettle, glass house, pants on fire? Why he didn’t just say oops, flip it, claim no change, and move on is beyond me. It could have been a huge PR win that would make the important, equally flawed, yet difficult-for-laypeople-to-figure-out part of his case sound reasonable to the public. Instead we get denial, absolutely goofy answers from Mann, and now an odd hand-waving half-endorsement, by a well-educated AGW believer, of an obvious mistake.

19 Responses to “Who’s in Denial”

  1. MikeN said

    Jeff, he has inadvertently endorsed your hockeystickization posts.

    WC does do an amount of snipping, but it seems to be more out of ignorance than censorship.

  2. curious said

    …. this stretches my understanding of the word “proxy” to the point where my ears hurt! 🙂

    In case I’m being dense – does anybody have examples of proxy use from disciplines other than climate where this type of thing is considered valid?

  3. Sean Houlihane said

    My interpretation of the 2nd para. is different from my interpretation of your view. His series B has positive modern noise, so the past variance will be de-emphasised by the scaling, not the series weighting (as previously detailed by your posts).
    Now, for series A (and modern noise Z), the assumption is that |Znow| > |Anow|, so the series gets flipped. A further assumption is made that the resultant (Znow+Anow) outweighs -Anow, so old values for series A are not only inverted, they are scaled down because the modern corrupted record is not just the inverse of the primary series, but actually a magnification thereof. This seems to just say that old values mean nothing when processed like this. I’m not sure there is any series weighting going on here, just scaling.

    Does it make any sense to calibrate the series in the pre-industrial era, turning the process on its head? Sure, that would give each series an equal weight, but you could maybe weight them by correlation with the resultant mean – then attempt to deduce any recent irregularities.

  4. curious said

    Sean – what could proxy A be? In reality? Something available with an “accurate” 2000yr record which has been “contaminated” in the past 200yr? Something with a positive correlation to temperature? What would be an example of the “recent contamination” and how would this be distinguishable from contamination for any other period in the record?

  5. michel said

    The more you read about this, the simpler it seems to get. You have a reasonable physical theory to explain why higher temps should always have driven more thickness. So it’s reasonable to take thickness readings for periods where you don’t have temp readings, and infer temps. Thicker readings will lead to inference of higher temps.

    This is perfectly reasonable so far. You then decide that recently the relationship, for good physical reasons, no longer holds. Fine; that means you cannot use thickness to infer temps in recent times, but you have a reasonable account of why this factor could not have applied at any other period, so you can still infer temps from thicknesses outside the recent period. Again, higher thicknesses still mean higher temps; it’s just that since (e.g.) 1950, they are not the only or dominant cause, so you can’t use the fact that thicknesses have risen recently to make inferences about whether temps have.

    I don’t see how anyone can argue with this so far. It’s the next step that seems so strange that talk of denialism is not off the mark, if I understand it correctly. Can it be true? Is the next step that you conclude that, despite the well-founded physical theory you started out with, and despite your account of the factor present since 1950 which leads you to think the relationship no longer applies, you decide that higher temps in fact lead to lower, not higher, thickness readings? So you take the thickness readings in the past, and conclude that where the stuff is thinner, temps were higher? But you do this while neither having nor offering any reason to doubt your original account of the physical relationship, that is, how it happens that higher temps cause, and have always been followed by, higher thicknesses, except in the very recent past.

    Jeff, is this really what is in fact being done? Or have I got totally the wrong end of the proverbial stick? Is this the essential step in the argument? That we have no idea why it should be, and it conflicts with the physical theory which explains the phenomena we are working with, but the relationship is assumed to be the reverse of what we had reason to think it was?

    Imagine a country with periodic epidemics. There’s a death rate from them. Imagine that from time to time the country goes through periods of neglecting immunization. Hold virulence constant, and there should be a relationship: more immunization, less deaths, less immunization, more deaths. We now publish a complex study, and one of the key steps in the argument is to assume that more immunization means more, not less, deaths. So where more people died, we assume there had been more immunization.

    We call everyone who points out this step ‘denialists’. Is this what is going on?

  6. The dilemma for them is that if they admit to any mistake, a blow will be dealt to their True Believer following. For their sake, they must appear infallible. Privately, they know as well as everyone else, that their position is indefensible.

  7. Jeff Id said

    #5 As far as the sediment proxy goes. Yes, that’s exactly what’s going on.

  8. Jeff Id said

    #3 Sean,

    WRT your first point, scaling and weighting are the same math step: y = ax, where a is a constant, so there’s no difference between weighting and scaling.

    As far as the second point about flipping, you show the absolute value of the noise, which is different from what happens. In CPS-version reconstructions, the correlation of the proxy to temp is strongly negative. There is an absolute flip (vector y = -y style) of the data, not an assumed one, and the series is then averaged with the rest of the data.

    In the EIV method the math is more subtle but the effect is the same.

    As far as the de-emphasis goes, that is an assumption which is likely the opposite of the reality, IMO. These proxies had high negative correlation, so they are probably weighted heavily in the reconstructions. Whether they are or not, though, is not the point. If we say they received a weighting or scaling of zero and 4 other proxies have a value of 3 after weighting, you get (3 + 3 + 3 + 3 + 0)/5 = 2.4, so the de-weighting would tend to flatten the historic handle while the high-correlation portion adds to the HS.

    My very rough guess is that a 2-3% shift in HS amplitude is probably about where these 4 series landed WRT their effect on Mann08.
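The averaging arithmetic in comment #8 is easy to verify. The numbers here are the made-up ones from the comment, not Mann08 data:

```python
# Toy check of the averaging in comment #8: zeroing out one proxy's weight
# flattens the mean over the historic period (made-up values, not real proxies).
proxy_values = [3, 3, 3, 3, 3]          # five proxies, equal historic value
full_mean = sum(proxy_values) / len(proxy_values)

deweighted = [3, 3, 3, 3, 0]            # one proxy de-weighted to zero
deweighted_mean = sum(deweighted) / len(deweighted)

print(full_mean, deweighted_mean)       # 3.0 2.4
```

So zeroing one of five equal proxies drags the historic mean from 3.0 down to 2.4, which is the "flattened handle" effect described above.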

  9. amac78 said


    I’ve been a pretty frequent commenter on the three Tiljander threads, and I’m glad to hear that your comments have all been accepted as written. Most but not all of mine were accepted–but it is discouraging to write carefully, with the knowledge that part or all of the text may be disallowed, or broken up with interspersed remarks.

    Stoat his blog, Air Vent yours, you both do as you see fit. I’m not asking for a Waah-mbulance, but pointing out that these threads are not level playing fields. If I care that much, I’ll “get my own blog.”

    I might have made some of the arguments you present–less the statistics–but there seemed little point, under the circumstances.

    A view of 20-year-averaged varve measurements for three of the Tiljander proxies is juxtaposed with the new Figure S8a in an Excel file “Tiljander-cf-Mann-CPS.xls”, at this BitBucket archive (yeah, Excel for graphics; there might be other choices). No matter how sophisticated the statistical treatment one ends up choosing, I think it is often useful to visually inspect the data prior to its transformation.

  10. Jeff Id said

    #9 I just let out the strongest vent I’ve ever done. This is a bit of a switch; people don’t realize how much is put into those things.

    Basically, I don’t snip here unless things get really wild. I’ve said it enough times, but it’s probably happened 10 times in 9000 posts. Feel free to comment.

    I’ve run the Tiljander proxies through the log correction presented at CA, with plots before and after. I recommend you try it yourself b/c it’s free and easy. Steve has left all the code on CA. I agree that visualizing data is absolutely necessary; you’ll find that I plot the data first in most everything done here. There is much to be learned from plotting, far more than from reading. The next post on Yamal has had probably 1000 or more plots, of which only a few will be presented. It’s all about getting familiar with the data.

  11. Geoff Sherrington said

    Mike N

    “WC does do an amount of snipping, but it seems to be more out of ignorance than censorship.”

    Ponder that an ignorant person is incapable of effective censorship, not knowing what is important to censor.

    The excuses given above are fantasy. A necessary condition is a usable relationship between local temperature and a tree-ring property, confidently stripped of confounding variables if needed, that can be shown to exist through the entire study period, including the confounding-variable corrections.

    “Then you should say what you mean”, the March Hare went on. “I do”, Alice hastily replied; “at least – at least I mean to say what I say – that’s the same thing, you know.” “Not the same thing a bit!” said the Hatter. “Why, you might just as well say that ‘I see what I eat’ is the same thing as ‘I eat what I see!'”.

  12. AMac said

    Jeff Id —

    Speaking in the most general terms, something that fascinates me is the power of statistical treatment to show meaning where a casual inspection of the data doesn’t show any. And to assign significance (e.g. P) to the relationship being considered.

    This is a great thing, because correlations in large and complex data sets are typically difficult or impossible to see. Without statistical analysis, their significance is usually impossible to grasp.

    The other edge of this sword is that improper application of statistical analysis could highlight relationships within the data where none exist.

    In this regard, I looked at near-raw Lake Korttajarvi proxy data graphically, setting the X-axis (Time BP) the same as for Mann et al’s Fig. S8a. “Another Correction” thread, Comment #9.

    In Fig. S8a, the calculated temperature anomaly takes a pronounced dip from 810 to 850. The Green Trace and the Black Trace are essentially the same for the eighth through tenth centuries. This shows that the calculation of the anomaly during the dip is not affected by inclusion of the Lake Korttajarvi proxies in the input to the CPS algorithm.

    Yet there is no obvious “cooler” signal in the raw Lake Korttajarvi proxy data for 810-850, compared to preceding and subsequent decades.

    This is counterintuitive. Intuitively, if there’s no strong “cooler” signal in the four added-in proxies, the Black Trace should show a more modest dip than the Green Trace.

    It may be that correct application of signal-processing methods is bringing out non-obvious but real information. If you’re so inclined, could you offer your perspective on this question (or suggest a link)?

  13. AMac said

    One explanation would be if the regional grid temperature calculated by RegEM for Finland 810-850 was about the same as preceding and subsequent decades. In that case, the pronounced cooling that is calculated by the CPS algorithm to have taken place (elsewhere in the northern hemisphere) should not be reflected in the Tiljander varves.

  14. AMac said

    I have worked through the first of the three Stoat threads on the use of the Lake Korttajarvi (Tiljander) proxies by Mann et al (PNAS, 2008), “Oh dear oh dear oh dear oh dear.” It may be helpful for readers who aren’t completely familiar with the controversies pertaining to Mann et al’s use of these data series to see the major thrusts of the pro-AGW-Consensus arguments and the Skeptical arguments in this abridged and annotated form.

  15. windy said

    Sorry for the late input, but a small clarification. You wrote: “The physical meaning of the Tiljander proxy is warm weather melts more glacier and carries more sediment. Flipping it upside down changes the physical interpretation to warm weather freezes more water and carries less sediment.”

    There’s no glacier there. The idea is that colder winters lead to a more pronounced spring flood from snowmelt, and create thicker varves with more mineral matter. But human activity has also produced increasingly thick mineral varves recently. So the mistake would rather be not flipping it over, i.e., correlating the measured values to temperature directly instead of their inverse.

  16. Jeff Id said

    #15, Are you trying to say flipping is right?

    The original authors felt the contaminated part should be chopped off and the data read in the normal manner. My own opinion is that it’s not temp.

  17. windy said

    I’m saying that these particular proxies may have been misinterpreted without flipping, since several of the proxies were originally interpreted to be inversely correlated to temperature (if indeed they reflect temp at all), but looking at the recent spike in raw values can give the impression that they are positively correlated (as Kaufman has acknowledged). So, assuming that the recent contaminated period was used to calibrate, wouldn’t it just seize on the spurious positive correlation without flipping the sign? The interpretation would then be “flipped” as a result, but not the data.

  18. Jeff Id said


    Even Mann isn’t making that argument. What happened as I understand it, was farming caused the recent years to have no real meaning. Flipping the curve means warmer temps with more melt result in less sediment thickness. This is a nonsensical result as far as I know, unless you are proposing a new mechanism for varve formation.

    When you look at the math applied next to the proxy the complete insanity of the thing really becomes apparent. They don’t have any rational method for linearization of temp whatsoever.

    The point of this whole thing boils down to something pretty simple. If we cannot even agree on the basic mechanism proposed by the original authors for why this terrible proxy is temp, and it doesn’t match temp, then what we are looking at is just dirt.

  19. AMac said


    Imagine that a monastery in some Northern European country noted when the ice in the nearby river broke up every spring. Further imagine that the Abbot had a record of this annual occurrence dating back to 389.

    Suppose that in the decade of the 1220s, the ice broke up as early as Feb. 17 (1229), and as late as Feb. 28 (1225). Further suppose that in the 1390s, the earliest year of the ice breakup was 1395 (March 20), while the latest was 1391 (April 15).

    Somebody interested in paleoclimate reconstruction might be very interested in this record. While lots of factors could influence the date of ice breakup, winter temperature probably is an important influence. All else being equal, harsher winters would mean a later ice breakup.

    From that, one could reason that the decade of the 1390s was probably substantially colder than the decade of the 1220s.

    As you will have guessed, these figures from the 1220s and 1390s weren’t picked at random. I took Tiljander’s Lake Korttajarvi varve series on X-Ray Density–the same proxy used by Mann et al (PNAS, 2008) and Kaufman et al (Science, 2009)–and transposed her numbers into “day of the year.” So the Tiljander XRD value for 1229 is 48. The ice-breakup date for 1229 is Feb. 17 (the 48th day of the year).

    See here for details.

    Now, you could argue, “The Abbot’s interpretation is wrong: on this river, ice breaks up earlier in years with colder winters.”

    Likewise, Mike Mann is free to argue, “Tiljander’s interpretation is wrong: in these varves, X-Ray Density is lower in years with colder winters.”

    So far, the parallels are exact. The main difference is that common sense offers us guidance on the likely meaning of ice-breakup dates, while varve density is a distant, abstract phenomenon.

    If you asserted “Earlier ice breakup means colder winter,” you would expect to be challenged. What’s your reasoning? What’s your evidence?

    Mann calibrated Tiljander’s varve series such that “Lower X-Ray Density means colder winter.”

    I’ll quote Mann’s response to McIntyre and McKitrick’s challenge.

    McIntyre and McKitrick raise no valid issues regarding our paper…
    The claim that “upside down” data were used is bizarre. Multivariate regression methods are insensitive to the sign of predictors. Screening, when used, employed one-sided tests only when a definite sign could be a priori reasoned on physical grounds. Potential nonclimatic influences on the Tiljander and other proxies were discussed in the SI, which showed that none of our central conclusions relied on their use.
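The sign-insensitivity claim in that reply is easy to check numerically with ordinary least squares. This is a generic sketch of the linear-algebra fact, not the EIV/CPS code from the paper, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic least-squares check of "regression methods are insensitive to the
# sign of predictors": flipping a predictor's sign flips its fitted
# coefficient, leaving the fitted values (and thus skill scores) unchanged.
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)

X_flipped = X.copy()
X_flipped[:, 0] *= -1                              # flip one predictor's sign
beta_f, *_ = np.linalg.lstsq(X_flipped, y, rcond=None)

print(np.allclose(X @ beta, X_flipped @ beta_f))   # True: identical fit
print(np.isclose(beta[0], -beta_f[0]))             # True: coefficient flipped
```

Which is exactly the issue: the algebra is indifferent to the sign, but the physical interpretation of the proxy is not, and that is the substance of the "upside down" objection.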

    Bizarre or not, Mann uses the Tiljander XRD proxy in an upside-down fashion.
    The upside-down-ness of the proxy isn’t affected one whit by whether Mann used one-sided tests (etc.), or not.
    The claim that none of Mann et al’s central conclusions relied on the use of the Tiljander proxies is bizarre, and likely wrong.

    If Mann’s way of reading the Lake Korttajarvi proxy thermometer is correct, then Tiljander’s and Kaufman’s is necessarily wrong. Upside-down, to be exact.

    To date, neither Mann nor anybody else has offered a coherent explanation of his use of these proxies.

    The responses of PNAS’ editors, their peer reviewers, Gavin Schmidt (RealClimate), William Connolley (Stoat), and the rest of the AGW-Consensus climatology community to Mann’s bravado are quite informative. What this says about the state of paleoclimatology isn’t pretty.
