I just learned something today that has left a sick feeling in my stomach. I learned how the latest Mann et al. hockey stick graph was created. So you don’t have to read the paper HERE, I’m going to summarize the result.
This time Mann avoided all the tree ring reconstructions from the old hockey stick because of a problem with what is termed divergence. Divergence means the data don’t correlate well in recent times (the last 130 years). Scientists know that a high percentage of tree “temperature” data don’t fit well when compared to the measured warming trend of the earth. To sort the data, they correlate each series, which is very noisy and doesn’t look like a temperature measurement at all, against the 130-year measured temperature record and test its statistical significance. Data which don’t match current temperatures to some minimal extent are rejected. The remaining data sets are averaged together (often using very strange techniques) to create a final signal.
As an exercise, assume you start with 1,000 sets of very noisy random data which swing up and down by 4 degrees C, and you average them. You should get a relatively flat line with wiggles of a magnitude much smaller than any of the individual peaks.
Now take the same random data, calibrate each series’ endpoint to today’s temperature (offset it so the end matches today’s temperature), and then sort it (throw data out) so that only data which correlate with a temperature rise over the last 5% of the dataset remain. If you then average the remaining data, you get a relatively flat line with an upward spike at the end. The end spike of the averaged data would almost certainly be of greater magnitude than the rest of the curve.
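Just to show I’m not making this exercise up, here is a quick sketch of it in Python. Everything in it is my own toy setup, not the paper’s actual method: a made-up noise level, a made-up linear “instrumental” record, and an arbitrary correlation threshold for the screening step (I also skip the endpoint-offset step, which doesn’t change the shape).

```python
# 1,000 purely random series are screened for correlation with a rising
# "instrumental" trend over the last 130 "years", then averaged.
# All numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_series, n_years, n_calib = 1000, 1000, 130
noise = rng.normal(0.0, 2.0, size=(n_series, n_years))  # swings of a few deg C

# Hypothetical "measured" record: a simple linear 1 deg C rise.
instrumental = np.linspace(0.0, 1.0, n_calib)

# Correlate each series' last 130 points with the instrumental rise.
calib = noise[:, -n_calib:]
r = np.array([np.corrcoef(row, instrumental)[0, 1] for row in calib])

# "Screening": keep only series that correlate (threshold is arbitrary).
survivors = noise[r > 0.1]

composite_all = noise.mean(axis=0)           # average of everything: flat
composite_screened = survivors.mean(axis=0)  # flat shaft, uptick at the end

recent = composite_screened[-30:].mean()
historic = composite_screened[:-n_calib].mean()
print(f"{len(survivors)} of {n_series} series survive screening")
print(f"screened composite: historic mean {historic:+.3f}, recent mean {recent:+.3f}")
print(f"unscreened composite recent mean {composite_all[-30:].mean():+.3f}")
```

Run it and you get a hockey stick out of pure dice rolls: the unscreened average stays flat everywhere, while the screened average stays flat for 870 “years” and then swings sharply upward in the calibration window.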
I had no idea that this is what these guys are doing. I am absolutely pissed off at this kind of junk science. It wouldn’t pass the smell test in any field except climatology, and it only passes here because there is motive.
OK, deep breath…
So recently, because so many trees don’t match the temperature trend, Mann and his buddies took some other very, very noisy data from sediments, pollen, and a dozen other reconstruction techniques and ran the same process. Again, he created the “hockey stick” using different data. No surprise, really.
So now suppose there actually was a noisy temperature signal in the data. You have filtered it all for a recent upward trend, which is clear enough, but what would happen to the noisy but real temperature signal in the past? It would of course be muted, because averaging with the noise dampens its actual magnitude!! This is pretty basic math for a scientist, so it’s not too difficult to figure out.
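Here is a toy version of that muting effect. Again, all the numbers are my own assumptions, not anything from the paper: suppose only 100 of 1,000 proxies actually record temperature and the rest are pure noise, and suppose the true history contains a past warm bump exactly as large as the modern rise. Screen on modern correlation, average the survivors, and watch what happens to the past.

```python
# Toy setup: 100 "responder" proxies carry the true signal plus heavy
# noise; 900 are pure noise. The true past bump (+1 deg C) equals the
# true modern rise (+1 deg C). All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

n_years, n_calib = 1000, 130
years = np.arange(n_years)

# True temperature: a +1 deg C bump centered at "year" 450, plus a
# +1 deg C linear rise over the last 130 years.
truth = np.exp(-((years - 450) / 60.0) ** 2)
truth[-n_calib:] += np.linspace(0.0, 1.0, n_calib)

n_resp, n_noise = 100, 900
responders = truth + rng.normal(0.0, 2.0, size=(n_resp, n_years))
nonresponders = rng.normal(0.0, 2.0, size=(n_noise, n_years))
proxies = np.vstack([responders, nonresponders])

# Screen on correlation with the modern record (threshold is arbitrary).
instrumental = truth[-n_calib:]
r = np.array([np.corrcoef(p[-n_calib:], instrumental)[0, 1] for p in proxies])
survivors = proxies[r > 0.1]

composite = survivors.mean(axis=0)

# Smooth the interior a little before reading off the bump amplitude.
smooth = np.convolve(composite, np.ones(21) / 21, mode="same")
bump = smooth[350:550].max() - smooth[:300].mean()
modern = composite[-20:].mean() - composite[:300].mean()
print(f"true past bump: 1.00, recovered: {bump:.2f}")
print(f"recovered modern rise: {modern:.2f}")
```

The recovered past bump comes out well below its true 1-degree size, because the noise-only survivors drag it toward zero everywhere except the calibration window, while the modern rise survives the screening by construction. Equal excursions in, unequal excursions out.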
I had to point out this problem at Real Climate, where they are currently in a circular celebration of this huge victory, saying things like “the best climate reconstruction ever,” “now we have proof,” na na na, and the like.
So I wrote my own post to RealClimate.org HERE. Although it was a perfect fit for the thread, it has not been shown. Apparently criticism of the conclusions is unacceptable. I don’t have the original post, but it went something like this —
I explained that if you took a random data set, screened it, and averaged it as above, it would create a flat line with a strong upward trend at the end, for the reasons above. I then stated that an average of the entire data set would be more useful in determining the magnitude of present temperatures compared to history. Statistically, any deviation from zero in actual temperature would be reduced by averaging with the rest of the data.
I then explained that I could see no basis for eliminating any of the data sets by correlation to measured temperature, as it would only serve to create an artificially strong spike at the end of otherwise flattened data, and that for a better trend analysis, all the data should be used. I could see no scientific basis for eliminating data by the single criterion that they don’t match current trends, other than the intent to make a temperature increase stand out.
I finished by saying that one thing is certain: we cannot make the irresponsible claim from this reconstruction that past temperatures were lower than present. Past temperatures would be muted in the signal, so a past high or low would have to be substantially greater than today’s to show as an equal value.
Clearly they don’t want people to know.
I re-posted, asking for an explanation of the deletion. So far, no response.
As an addendum, just so people understand: noisier data, more random and less accurate data, will produce a better hockey stick with this method than data with an actual signal. You would almost want bad data to prove global warming was caused by man. We might as well roll dice and run them through the algorithm. I think I might just do that to prove the point!
This is much worse than I realized.