Posted by Jeff Id on March 17, 2013
So I spent several hours today writing scripts which parse the emails. I was hoping for continuations of some of the more interesting conversations we are familiar with, but so far have found little more than a group of advocates for catastrophic climate change doing what they do. They fully believe that the fact that proxy data doesn't match temperature in no way calls into question the selection of that proxy data. Some question whether it is OK to paste data on the end of a series. Still others advocate more study, stating that the "act-now" advocates are not honest scientists. Again, I'm reminded of the organized and funded attacks against anyone who notices the problems with their work. It is really shocking to read how they followed through with attacks against those who don't fall in line. Mann in particular is thin skinned, and his angry attacks on other advocates not pushing his version of history pressure those with little backbone to play both sides of the fence.
If you want the meaning of the emails, you have to be able to read, and CG1 and CG2 contain most everything we need to know so far. Beyond a three-word "hide the decline," the average public has no interest. So far, I have found no new pithy quote with the kind of clarity that CG1 revealed. I did find a large number of emails which we have covered here before. Some have new replies, but I've noticed nothing which was tremendously interesting.
There were so many nuances in these emails. Remember this email from Michael Mann (my bold):
Date: Tue, 14 Oct 2003 17:08:49 -0400
Subject: Re: smoothing
correction ‘1)’ should read:
‘1) minimum norm: sets padded values equal to mean of available data beyond the
available data (often the default constraint in smoothing routines)’
sorry for the confusion,
At 05:05 PM 10/14/2003 -0400, Michael E. Mann wrote:
To those I thought might be interested, I’ve provided an example for discussion of
smoothing conventions. Its based on a simple matlab script which I’ve written (and
attached) that uses any one of 3 possible boundary constraints [minimum norm, minimum
slope, and minimum roughness] on the ‘late’ end of a time series (it uses the default
‘minimum norm’ constraint on the ‘early’ end of the series). Warming: you needs some
matlab toolboxes for this to run…
The routines uses a simple butterworth lowpass filter, and applies the 3 lowest order
constraints in the following way:
1) minimum norm: sets mean equal to zero beyond the available data (often the default
constraint in smoothing routines)
2) minimum slope: reflects the data in x (but not y) after the last available data
point. This tends to impose a local minimum or maximum at the edge of the data.
3) minimum roughness: reflects the data in both x and y (the latter w.r.t. to the y
value of the last available data point) after the last available data point. This tends
to impose a point of inflection at the edge of the data—this is most likely to
preserve a trend late in the series and is mathematically similar, though not identical,
to the more ad hoc approach of padding the series with a continuation of the trend over
the past 1/2 filter width.
The routine returns the mean square error of the smooth with respect to the raw data. It
is reasonable to argue that the minimum mse solution is the preferable one. In the
particular example I have chosen (attached), a 40 year lowpass filtering of the CRU NH
annual mean series 1856-2003, the preference is indicated for the “minimum roughness”
solution as indicated in the plot (though the minimum slope solution is a close 2nd)…
By the way, you may notice that the smooth is effected beyond a single filter width of
the boundary. That’s because of spectral leakage, which is unavoidable (though minimized
by e.g. multiple-taper methods).
I’m hoping this provides some food for thought/discussion, esp. for purposes of IPCC…
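Mann's attached script was Matlab, which we don't have here, but the three boundary constraints he describes are easy enough to sketch. The following Python version is my own reconstruction, not his code: it pads the late end of a series using each constraint (mean padding, reflection in time only, reflection in both time and value about the last point), runs a Butterworth lowpass through the padded series, and compares the mean square error against the raw data, as the email describes. The cutoff, filter order, and synthetic test series are my assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def pad_series(x, npad, method):
    """Pad the 'late' end of x by npad points using one of the three
    boundary constraints described in the email."""
    if method == "minimum_norm":
        # pad with the mean of the available data
        pad = np.full(npad, x.mean())
    elif method == "minimum_slope":
        # reflect in x (time) but not y: mirror the last npad values
        pad = x[-2:-npad - 2:-1]
    elif method == "minimum_roughness":
        # reflect in both x and y, about the last available data point;
        # this tends to carry an end trend forward into the padding
        pad = 2 * x[-1] - x[-2:-npad - 2:-1]
    else:
        raise ValueError(f"unknown method: {method}")
    return np.concatenate([x, pad])

def smooth(x, npad=40, method="minimum_norm", cutoff=1 / 40, order=4):
    """Lowpass-smooth x after padding; cutoff is in cycles per sample
    (1/40 approximates a 40-year filter on annual data)."""
    b, a = butter(order, 2 * cutoff)  # butter() wants a fraction of Nyquist
    padded = pad_series(np.asarray(x, dtype=float), npad, method)
    return filtfilt(b, a, padded)[: len(x)]

# Compare the MSE of each constraint on a trending noisy series.
rng = np.random.default_rng(0)
t = np.arange(150)
x = 0.005 * t + 0.2 * rng.standard_normal(150)
mses = {m: float(np.mean((smooth(x, method=m) - x) ** 2))
        for m in ("minimum_norm", "minimum_slope", "minimum_roughness")}
```

On a series with a trend right up to the endpoint, the minimum-roughness padding hugs the data best near the boundary, which is exactly why it wins the MSE comparison in Mann's example; the catch is that the padding is, in effect, a bet that the end trend continues.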
After reading these same people explain how well-funded "right-wing" skeptics with ties to industry are hopelessly biased, it is a little difficult to swallow that reflecting a trend off the end of a hockey stick "might" be proper science. Don't forget that this is a 2003 email, and we now know that temperatures have stayed relatively flat since then. The reflection Mr. Mann proposed is therefore ad hoc, and can now be shown to be inaccurate.
In the end, today's reading was 99.9 percent review of just how loose a game is being played. It shouldn't be overlooked that the purpose of the Enzyte filter Dr. Mike proposed was publication in the premier global warming report of all time.