OLMC 10, What does it mean…
Posted by Jeff Id on December 10, 2010
Nic and Ryan both emailed their views to Andy Revkin and gave permission to post their emails here. Rather than post in their entirety, I’ll find the good parts so you don’t miss your hockey or football games. Nic’s letter first, my bold throughout.
Please note that if you quote any comparisons of our and Steig et al’s 2009 Nature paper (S09) regional temperature trends, to be comparable the S09 figures should be those quoted in our paper, which have been recomputed using the same natural geographical boundaries used in our study. The regional boundaries used in the original S09 study were slightly different, so the regional temperature trends stated therein are, without recomputation, not comparable to those per our study. (NB Eric Steig may prefer his own definition of the boundary between West Antarctica and the Antarctic Peninsula, but the boundary we use is supported by, for instance, Wikipedia’s page on the Antarctic Peninsula.)
Judging from a recent post at RealClimate, there appears to be an attempt to gloss over the differences between the results of our study (per the main RLS reconstruction) and those of S09 (per its main reconstruction), some of which are pretty fundamental. For instance:
- we show no statistically significant warming for the continent as a whole over 1957-2006 (our finding is 0.06±0.08 degrees C/decade, using a standard 95% confidence interval; I state all subsequent trends on this basis), whereas S09 showed statistically significant warming of 0.12±0.08. S09’s central estimate of the continental trend is double ours, and the difference between the central trend estimates is statistically significant (0.06±0.05).
- S09 showed that warming in West Antarctica was considerably greater than in the Peninsula; we show the contrary.
- S09 show fast warming in West Antarctica, with a central estimate over twice its lower 95% confidence limit (0.20±0.09, using our geographical definitions). Our central estimate is half S09’s and is only marginally statistically significant (0.10±0.09). Again, the difference in the two central trend estimates is statistically significant (0.09±0.06).
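To see why those intervals imply (or fail to imply) significance, recall that a trend quoted as b±h is significant at the 95% level exactly when the interval excludes zero, i.e. when |b| exceeds h. A minimal sketch in Python, using the trends quoted above and assuming each ± value is a 95% half-width (roughly 1.96 standard errors):

```python
Z95 = 1.96  # approximate 95% two-sided normal quantile

def significant(trend, half_width):
    """A trend is significant at 95% when the confidence interval
    trend ± half_width excludes zero, i.e. |trend| > half_width."""
    return abs(trend) > half_width

# Trends in degrees C/decade as quoted in Nic's email
trends = {
    "continent, our study":  (0.06, 0.08),
    "continent, S09":        (0.12, 0.08),
    "West Antarctica, S09":  (0.20, 0.09),
    "West Antarctica, ours": (0.10, 0.09),
}

for name, (b, h) in trends.items():
    z = Z95 * b / h  # approximate z-score implied by the interval
    print(f"{name}: {b:+.2f}±{h:.2f}, z~{z:.2f}, "
          f"significant={significant(b, h)}")
```

This reproduces the email's claims: the continental trend is not significant in our reconstruction but is in S09, and our West Antarctic trend is only marginally significant (0.10 barely exceeds 0.09). Note that the quoted intervals for the *differences* between trends (0.06±0.05, 0.09±0.06) come from the paper's own calculation, which accounts for correlation between the reconstructions; they cannot be recomputed from the marginal intervals alone.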
You can see that some have been missing the point from the beginning. I warned that the reconstructions are statistically significantly different, yet some can’t stop parroting the line that these results confirm S09. It’s half the trend in the majority of the area, folks, not very darned close. I wonder how many Joules that represents.
Nic writes further, making a point I had not considered.
As I see it, one of the most important points to come out of all this is that Nature’s peer review process completely failed to prevent a mathematically badly flawed paper from being published. And it took a bunch of amateur researchers to publish a paper that brought these flaws to light and corrected them – no mathematically competent professional climate scientist did so, perhaps because of fear it would do their careers no good. One has to wonder how many other papers with incorrect results have been published by authors who go along with ‘consensus’ views, and have never been corrected. Papers such as ours that question the status quo, on the other hand, are subject to a stringent peer review process, in our case involving one reviewer whose extensive critical comments (almost all of which were invalid) made clear he had a personal interest in preventing publication of a paper contradicting S09! IMHO, in the interests of science the peer review process needs to be made more transparent and even-handed.
There were other points but diluting Nic’s commentary is not in the cards today.
Ryan’s email was a little different, but he also made similar points. First, since this was an email to Andy, a little commentary regarding the two bloggers on the paper.
Unfortunately, some of my coauthors have been portrayed as trying to “dig” for evidence of “cooling”, as if by way of showing a cooling Antarctic, the demon of anthropogenic global warming would be banished. I believe this characterization is unfair, as Steve McIntyre publicly posted on Climate Audit early in this process that he felt it should not be surprising that the Antarctic was warming along with the rest of the world, and his goal was to determine if the method used by S09 was appropriate. Nicholas Lewis, Jeff Condon, and I have all publicly stated that we believe anthropogenic activities contribute to warming, though we may disagree with the consensus on the magnitude of that contribution. Finally, the “digging” we had to do was not in finding something wrong with the S09 method (it was rather easy to verify the S09 method improperly spreads Peninsula warming throughout the continent) but in designing a method that avoided many of the deficiencies of the S09 method.
Then, in his matter-of-fact style, Ryan followed up with this:
In my opinion, the important and significant differences between our paper and S09 include:
1. Improvements to the method, which include demonstrating that certain steps performed by S09 were not mathematically valid (regardless of whether they “worked” in terms of results)
2. Demonstrating that the S09 method does, indeed, cause the Peninsula warming to be geographically relocated to the rest of the continent
3. Demonstrating that the strong warming throughout West Antarctica shown in S09 – which was the primary claim in that paper – is an artifact, and the only statistically significant warming in West Antarctica is occurring in Ellsworth Land and the northern portion of Marie Byrd Land immediately adjacent
4. Demonstrating that the seasonal patterns of change in S09 (which are important for distinguishing between possible physical mechanisms for the changes in Antarctic climate) were strongly influenced by the Peninsula contamination, particularly in West Antarctica and the half of East Antarctica from the south pole to the Weddell Sea
See, the seasonal trends of the peninsula represent certain physical warming processes, and these trends appeared in S09 all across West Antarctica. This was simply evidence of peninsula station information spreading across the continent. Ryan follows up the above with this:
While both studies show statistically significant warming in Ellsworth Land (which is what RealClimate seems to be focused on right now, as a way of saying our work “confirms” S09), evidence that Ellsworth Land was warming rather significantly was already present in the literature (e.g., Shuman and Stearns, 2001; Kwok and Comiso, 2002; King and Comiso, 2003; Chapman and Walsh, 2007). Even a paper entitled “Antarctic Climate Cooling” (Nature, 2002, Doran et al.) shows warming in Ellsworth Land. The novelties of S09 were statistically significant warming throughout the rest of West Antarctica, a statistically significant continental average, and a seasonal pattern of change that differed from previous gridded reconstructions (Chapman and Walsh, 2007 and Monaghan et al., 2008). Our paper demonstrates that all of these novel results in S09 are artifacts.
Certainly other portions of S09 are confirmed by our paper (such as overall positive trends). However, we note that earlier studies also showed the same things, so these were not newly introduced with S09. The results that were newly introduced with S09, on the other hand, are all shown to be artifacts.
There you go. No matter how you spin that kind of confirmation of result, it is hard to separate from the fact that the S09 method simply smeared the peninsula information across the continent. This was demonstrated in the overly high trend in the east, the overly low trend in the peninsula, and more obviously in the seasonal trend distortions which caused the continent in S09 to match the peninsula.
All the authors have their own opinions; mine is that this is more than a simple improvement.
What is still missing from this discussion, though, is a description of the multiple novel methods Ryan came up with for solving the problem and the one finally settled on. There were definite improvements in the algorithms; Nic also contributed to these, and I hope that we will hear more on that after the paper is published. Ryan’s code is very clean and, despite the fact that R isn’t my favorite language, easy to follow. Can’t wait for that.