## Maximum Triviality Reconstruction

Posted by Jeff Id on April 27, 2009

I did this reconstruction of Antarctic temperatures some time ago. It is a true Voronoi area-weighted reconstruction using only surface station data. Instead of infilling NA values, this reconstruction simply uses the closest station, weighted by the area of its polygon. **Many** advocates have suggested that I am a denier, yet I’ve known for some time about this simple evidence that the Antarctic isn’t warming at 0.12 C/decade +/- 0.07. I haven’t presented it until now because some stations lack complete data for the trend calcs. That missing data produces extreme slopes at certain stations, so I don’t like this recon as much as the others. This is despite the fact that this recon presents the lowest average trend of any of the reconstructions – denier food. Still, it’s not bad, simply because it is the least fooled-around-with reconstruction I know of.
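The area weighting itself is simple arithmetic once each station’s Voronoi polygon area is known. Here is a minimal sketch; the station names, trends, and areas are made up for illustration and are not the actual station data:

```python
# Hypothetical sketch of an area-weighted average trend. Each station's
# trend is weighted by the area of its Voronoi polygon, so a station
# covering more of the continent counts proportionally more.
stations = {
    # name: (trend in C/decade, polygon area in 1000 km^2) -- illustrative values
    "A": (0.05, 1200.0),
    "B": (-0.10, 800.0),
    "C": (0.02, 2000.0),
}

total_area = sum(area for _, area in stations.values())
weighted_trend = sum(trend * area for trend, area in stations.values()) / total_area
print(weighted_trend)  # 0.005
```

Note how a large polygon with a small trend (station C) pulls the average toward itself, which is the whole point of area weighting over a plain station average.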

The Antarctic temperature distribution shows the difference. Here’s the simplest reconstruction.

It’s a lot more colorful than the closest-station reconstruction, which infilled missing data. This next plot is from my previous post, which used the next-closest station to infill missing values.

The number of polygons and the reduction of extreme trends caused by missing data are apparent. That’s why I prefer the previous reconstruction despite its higher trend. Still, this recon may actually give a more correct trend, because infilling missing values introduces its own errors. Either way, we know to a high degree of certainty that the Antarctic is not warming as Steig et al. and RealClimate claim. Below is the trend from 1967 onward.

This is a pretty strong negative trend. You would think these baselines would tend to slow down the AGW scientists, but apparently thermometers aren’t as reliable as RegEM.

The next plot is the distribution of temps in the Antarctic.

Now that plot is very colorful, with deep reds and blues adjacent to each other. Temperature trends certainly didn’t vary by that much; the extremes are an artifact of the missing data. In this case, if the data is missing at random from the trend’s perspective, the average trend could still be close.

Next I’ll do RegEM with no islands or peninsula, then I think it’s time for a summary.

## Kenneth Fritsch said

You are not making any major claims for this reconstruction variation, but all these variations are important and are, in my view, like a comprehensive sensitivity test — one that someone is going to have to summarize one of these days.

## Jeff Id said

#1 For sure. I’ve got to do the RegEM with no islands first, then a summary. There are so many reconstructions now, and none, not one, produced a higher trend than Steig et al. That really bothers me.

## Ryan O said

And it actually is kind of cool that the different reconstructions are fairly similar despite the difference in the amount of data used.

## Fluffy Clouds (Tim L) said

the value of .0340 may not meet precision standards when using 51.1C

as the most we can claim is 0.05( not .04 or .06)

I hope I don’t piss ya off but this is getting more technical.

Links for the science challenged

this reminds me of significant figures (significant digits) http://en.wikipedia.org/wiki/Significant_figures

“spurious digits introduced, for example, by calculations carried out to greater accuracy than that of the original data, or measurements reported to a greater precision than the equipment supports.” http://en.wikipedia.org/wiki/Accuracy_and_precision

“A common convention in science and engineering is to express accuracy and/or precision implicitly by means of significant figures. Here, when not explicitly stated, the margin of error is understood to be one-half the value of the last significant place. For instance, a recording of 843.6 m, or 843.0 m, or 800.0 m would imply a margin of 0.05 m (the last significant place is the tenths place), while a recording of 8436 m would imply a margin of error of 0.5 m (the last significant digits are the units).”
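The quoted convention is easy to check mechanically: the implied margin is half the value of the last recorded decimal place. A small sketch (the function name is my own, just for illustration):

```python
# Implied margin of error under the significant-figures convention quoted
# above: half the value of the last significant (decimal) place.
def implied_margin(recorded: str) -> float:
    """Half the value of the last decimal place of a recorded number."""
    if "." in recorded:
        decimals = len(recorded.split(".")[1])
        return 0.5 * 10 ** (-decimals)
    return 0.5  # no decimal point: last significant digits are the units

print(implied_margin("843.6"))  # 0.05
print(implied_margin("8436"))   # 0.5
```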

## Jeff Id said

#4, No problem. It’s generally true but consider this series of measurements.

.1

.2

.1

.1

.2

.1

.2

Say there were 100 of these measurements both precise and accurate to +/- 0.05 with an average of .1124.

Would you still say the absolute precision is +/- 0.05 or have we achieved a more precise measurement?
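Numerically, the point is the standard error of the mean: if the individual errors are independent and random, averaging N readings shrinks the uncertainty of the mean by roughly a factor of sqrt(N). A sketch using the numbers from the example above (and assuming independent errors, which is the crux of the disagreement):

```python
# Standard error of the mean: 100 independent readings, each good to
# +/- 0.05, give a mean that is good to roughly +/- 0.05 / sqrt(100).
import math

sigma_single = 0.05   # precision of one measurement
n = 100               # number of measurements
sigma_mean = sigma_single / math.sqrt(n)
print(sigma_mean)  # 0.005
```

So under that independence assumption, an average of .1124 from 100 such readings is not over-precise — the mean really is known more tightly than any single reading.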

## Fluffy Clouds (Tim L) said

the average in scientific terms is .1

if half of the # are .1 and .2 then in precision terms .15 +/- .05

which is also the answer.

I can go to the college here and ask about this for clarification — it has been 25 years since I attended….LOL…

thank you for the hard work on all of this.