Maximum Triviality Reconstruction

I did this reconstruction of Antarctic temperatures some time ago. It is a true Voronoi area-weighted reconstruction using only surface station data. Instead of infilling NA values, this reconstruction simply uses the closest station, weighted by the area of its polygon. Many advocates have suggested that I am a denier, yet I’ve known about this simple evidence that the Antarctic isn’t warming at 0.12 C/decade +/- 0.07 for some time. The reason I haven’t presented it until now is the lack of complete data for the trend calcs at some stations. That missing data produces extreme slopes at certain stations, so I don’t like this recon as much as the others, despite the fact that it gives the lowest average trend of any of the reconstructions – denier food. Still, it’s not bad, simply because it is the least fooled-around-with reconstruction I know of.
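To show how little machinery this takes, here is a minimal sketch of the closest-station, area-weighted idea in Python. The station locations and trends are made up and the continent mask is skipped; the point is only that assigning each equal-area grid cell to its nearest station is the same thing as weighting each station by the area of its Voronoi polygon.

```python
import numpy as np
from scipy.spatial import cKDTree

# Made-up station coordinates (km on a polar stereographic plane) and
# least-squares trends in deg C/decade -- purely illustrative numbers.
stations = np.array([[0.0, 0.0], [800.0, 200.0], [-500.0, 900.0], [300.0, -700.0]])
trends = np.array([0.05, -0.02, 0.10, -0.08])

# Equal-area grid over the domain; every cell takes the trend of its
# nearest station, which is exactly the Voronoi (closest-station) assignment.
xs, ys = np.meshgrid(np.arange(-1500.0, 1500.0, 50.0),
                     np.arange(-1500.0, 1500.0, 50.0))
cells = np.column_stack([xs.ravel(), ys.ravel()])
_, nearest = cKDTree(stations).query(cells)

# With equal-area cells a station's area weight is just its cell count,
# so the continent-wide trend is the plain mean over cells.
print(trends[nearest].mean())
```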

[Figure: id-recon-total-trend-closest-station]

The spatial distribution of the Antarctic temperature trend shows the difference. Here’s the simplest reconstruction.

[Figure: id-recon-trend-1957-2007-spatial-closest-station]

It’s a lot more colorful than the closest-station reconstruction which infilled missing data. This next plot is from my previous post, which used the next closest station to infill missing values.

[Figure: id-recon-spatial-trend-by-distance-weight-1956-2006]

The larger number of polygons and the reduction of the extreme trends caused by missing data are apparent. That’s why I prefer that previous reconstruction despite its higher trend. Still, this recon may actually give a more correct trend, because infilling missing values introduces its own errors. Either way, we know to a high degree of certainty that the Antarctic is not warming as Steig et al. and RealClimate claim. A short sketch of the infilling step follows for comparison, and below that is the trend from 1967 onward.
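For anyone curious what that infilling amounts to, here is a rough sketch with made-up anomaly data and a hypothetical neighbor ranking; it simply borrows each missing month from the nearest station that reported, then fits the trend.

```python
import numpy as np

# Made-up monthly anomaly series (rows = stations, columns = months);
# np.nan marks missing data.
anoms = np.array([
    [0.3, np.nan, -0.1, 0.4, np.nan, 0.2],
    [0.2, 0.1, -0.2, 0.3, 0.0, 0.1],
    [np.nan, 0.0, -0.3, np.nan, -0.1, 0.0],
])

# Hypothetical ranking of the other stations by distance, nearest first.
neighbors = {0: [1, 2], 1: [0, 2], 2: [1, 0]}

# Borrow each missing month from the nearest station that has data.
filled = anoms.copy()
for s in range(anoms.shape[0]):
    for m in range(anoms.shape[1]):
        if np.isnan(filled[s, m]):
            for n in neighbors[s]:
                if not np.isnan(anoms[n, m]):
                    filled[s, m] = anoms[n, m]
                    break

# Least-squares slope for station 0, converted from per-month to per-decade.
months = np.arange(anoms.shape[1])
print(np.polyfit(months, filled[0], 1)[0] * 120)
```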

[Figure: id-recon-trend-1967-2007-closest-station2]

This is a pretty strong negative trend. You would think these baseline results would tend to slow down the AGW scientists, but apparently thermometers aren’t as reliable as RegEM.

The next plot is the spatial distribution of trends in the Antarctic.

[Figure: id-recon-trend-1967-2007-spatial-closest-station]

Now that plot is very colorful, with deep reds and blues adjacent to each other. Temperature trends certainly didn’t vary by that much; the extremes are an artifact of the missing data. In this case, if the data is missing at random from the trend’s perspective, the average trend could still be close.
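A toy check of that missing-at-random point, with invented numbers: give a few hundred synthetic series the same underlying trend, randomly knock out 40 percent of each one, and the individual fitted trends scatter widely while their average stays close to the truth.

```python
import numpy as np

# Synthetic series all share one true trend; each loses ~40% of its years
# at random, and we compare individual fitted trends to their average.
rng = np.random.default_rng(0)
years = np.arange(50)
true_trend = 0.01                        # deg C/year, i.e. 0.1 C/decade (assumed)
slopes = []
for _ in range(500):
    series = true_trend * years + rng.normal(0.0, 0.5, size=years.size)
    keep = rng.random(years.size) > 0.4  # drop ~40% of the years at random
    slopes.append(np.polyfit(years[keep], series[keep], 1)[0])

slopes = np.array(slopes) * 10           # convert to deg C/decade
print(slopes.min(), slopes.max())        # individual trends scatter widely
print(slopes.mean())                     # but the average stays near 0.1
```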

Next I’ll do RegEM with no islands or peninsula, then I think it’s time for a summary.

6 thoughts on “Maximum Triviality Reconstruction”

  1. You are not making any major claims for this reconstruction variation, but all these variations are important and, in my view, amount to a comprehensive sensitivity test that someone is going to have to summarize one of these days.

  2. #1 For sure. I’ve got to do the RegEM with no islands first. Then a summary. There are so many reconstructions now and none, not one, created a higher trend than Steig et al. That really bothers me.

  3. And it actually is kind of cool that the different reconstructions are fairly similar despite the difference in the amount of data used.

  4. The value of .0340 may not meet precision standards when using 51.1 C,
    as the most we can claim is 0.05 (not .04 or .06).
    I hope I don’t piss ya off, but this is getting more technical.

    Links for the science-challenged:
    This reminds me of significant figures (significant digits): http://en.wikipedia.org/wiki/Significant_figures
    “spurious digits introduced, for example, by calculations carried out to greater accuracy than that of the original data, or measurements reported to a greater precision than the equipment supports.” http://en.wikipedia.org/wiki/Accuracy_and_precision

    “A common convention in science and engineering is to express accuracy and/or precision implicitly by means of significant figures. Here, when not explicitly stated, the margin of error is understood to be one-half the value of the last significant place. For instance, a recording of 843.6 m, or 843.0 m, or 800.0 m would imply a margin of 0.05 m (the last significant place is the tenths place), while a recording of 8436 m would imply a margin of error of 0.5 m (the last significant digits are the units).”

  5. #4, No problem. It’s generally true but consider this series of measurements.

    .1
    .2
    .1
    .1
    .2
    .1
    .2

    Say there were 100 of these measurements both precise and accurate to +/- 0.05 with an average of .1124.

    Would you still say the absolute precision is +/- 0.05 or have we achieved a more precise measurement?
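    To put a number on that (treating the +/- 0.05 as independent random error with a standard deviation of 0.05, which is an assumption, not the only way to read an instrument spec), the scatter of a 100-reading average comes out roughly ten times smaller than the scatter of a single reading:

    ```python
    import numpy as np

    # Each reading is modeled (an assumption) as the true value plus
    # independent noise with sd 0.05; average 100 readings many times
    # over and look at how much those averages scatter.
    rng = np.random.default_rng(1)
    true_value = 0.1124
    means = [np.mean(true_value + rng.normal(0.0, 0.05, 100)) for _ in range(2000)]
    print(np.std(means))   # about 0.005, i.e. 0.05 / sqrt(100)
    ```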

  6. The average in scientific terms is .1.
    If half of the numbers are .1 and half are .2, then in precision terms it’s .15 +/- .05,
    which is also the answer.
    I can go to the college here and ask about this for clarification; it has been 25 years since I attended… LOL…

    thank you for the hard work on all of this.
