the Air Vent

Because the world needs another opinion

Archive for July, 2009

Effects of Surface Trends on Antarctic Reconstruction

Posted by Jeff Id on July 15, 2009

This is a guest post by Nic L, in which he examines the effect of a variety of synthetic data on the Antarctic reconstruction. Nic has an interesting approach to his analysis, which is always a good thing when looking at a complex algorithm. I’ve uploaded a Word document at the bottom which contains the original formatting and code for this post.

===================================================

UPDATE: Due to new developments in the R version of RegEM used in this post, the results have been updated in a paper by Nic here.  Although the result has improved, it has changed very little and the conclusions are unchanged.  A link to the revised post is here.

====================================================================

The effect of surface station trends on Steig’s reconstruction of Antarctica temperature trends

In this article, I aim to show that RegEM, the Regularized Expectation Maximization algorithm used by Steig in its TTLS (truncated total least squares) variant, is highly sensitive to trends contained in its input data, even when the trending data series have little or no relationship with each other apart from both having trends; to propose and evaluate a method of counteracting this sensitivity; and, having demonstrated that the method works well, to show that using it produces much lower average reconstruction trends than Steig’s method. But before I start, I would like to set readers a puzzle: which surface station has the most impact on the average 1957-2006 trend in Steig’s reconstruction, and how much impact does it have? The answer, which I think will surprise most people and casts further doubt on Steig’s results, is given in the last section of this article.

The sensitivity of RegEM to trends in the data

There has been considerable discussion, in the context of Steig’s RegEM-based reconstruction of Antarctic temperatures from 1957-2006 using satellite AVHRR data from 1982 on, of the likelihood that RegEM may impute a long term trend from one data series to another based purely on short term (high frequency) correlation between the two series. I believe that is indeed a major concern. It is entirely possible for the temperatures at two sites to be affected similarly by short term factors but to exhibit entirely different trends over the long term. If, say, temperatures in the Antarctic peninsula exhibited a high frequency correlation with the data series representing the satellite temperature measurements, local temperature trends in the peninsula arising from oceanic causes could distort the satellite-based reconstruction of average Antarctic temperature trends.
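As a quick illustration of the converse concern (a hypothetical synthetic example, not taken from the analysis in this article), two series built from nothing but independent noise plus linear trends show a clear positive correlation that is driven entirely by the trends, and the apparent relationship largely disappears once each series is detrended:

set.seed(1)
n <- 50 * 12                               # 50 years of monthly data, 1957-2006
t <- (1:n) / 12                            # time in years
x <- 0.02 * t + rnorm(n, sd = 0.5)         # series 1: 0.2 deg C/decade trend plus independent noise
y <- 0.03 * t + rnorm(n, sd = 0.5)         # series 2: 0.3 deg C/decade trend plus independent noise
cor(x, y)                                  # noticeably positive, purely because both series trend upward
cor(residuals(lm(x ~ t)), residuals(lm(y ~ t)))   # near zero once the trends are removed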

However, I have seen little discussion or investigation of the sensitivity of RegEM to data series that exhibit long term trends but which are not otherwise correlated in any significant way with the remaining data series – where high frequency correlations are negligible. I have therefore carried out some investigation into this subject. In order to do so, I first modified Steve M’s R-port of the regem_pttls function so that it would, inter alia, produce and report the RegEM TTLS solution, in terms of principal component (PC) time series (equal in number to the regularization parameter regpar) and weights on those PCs for each of the data series, from which the “unspliced trend” can be derived.
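For readers who want to see how such a solution translates into an “unspliced” reconstruction, the sketch below shows one way the PC time series and weights could be recombined and a decadal trend of the spatial average extracted. The object and argument names (pcs, wts, col_means, col_sds) are hypothetical placeholders for illustration, not the actual return values of the modified regem_pttls function.

# Hedged sketch: recombine a TTLS solution (PC time series and per-series weights)
# into a reconstruction and take the trend of its simple spatial average.
unspliced_trend <- function(pcs, wts, col_means, col_sds, years) {
  # pcs: n x regpar matrix of PC time series; wts: p x regpar matrix of weights
  recon <- pcs %*% t(wts)                      # reconstruction in standardized units
  recon <- sweep(recon, 2, col_sds, "*")       # undo the scaling applied before RegEM
  recon <- sweep(recon, 2, col_means, "+")     # undo the centering
  avg <- rowMeans(recon)                       # unweighted spatial average series
  coef(lm(avg ~ years))[2] * 10                # trend per decade of the average
}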

Read the rest of this entry »

Posted in Uncategorized | 31 Comments »

COVARIANCE vs. CORRELATION

Posted by Jeff Id on July 14, 2009

A guest post by Ryan O.

As many of you know, we have been in contact with Dr. Beckers, primary author of several papers on using the DINEOF algorithm for infilling cloud-masked images. It was very gratifying to discover that the iterative TSVD method we developed turns out to be the same algorithm – with one important exception: DINEOF uses covariance, while ours uses correlation.

The difference between correlation and covariance has been the subject of much debate. Indeed, it was one of the issues in the Hockey Stick debate. In our case, the rescaling to unit variance artificially inflates the variance of the actual data at the beginning of the algorithm (though the effect disappears as the algorithm progresses). It also changes the spatial structure of the EOFs used for imputation.
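In R terms, the whole distinction comes down to whether each column is rescaled to unit variance before the decomposition. A minimal sketch, using a stand-in matrix X in place of real station data:

X <- matrix(rnorm(20 * 5), nrow = 20)          # stand-in for a complete time-by-station matrix
eof_cov <- svd(scale(X, center = TRUE, scale = FALSE))   # covariance-based EOFs: centre only
eof_cor <- svd(scale(X, center = TRUE, scale = TRUE))    # correlation-based EOFs: centre and rescale
# Equivalently, prcomp(X, scale. = FALSE) uses the covariance matrix,
# while prcomp(X, scale. = TRUE) uses the correlation matrix.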

While there are certain a priori arguments that can be made for using correlation, there are equally plausible arguments for using covariance. So, based on Dr. Beckers’ suggestion, I felt it was important to compare results obtained using covariance with the earlier results. The method used was the same as in the previous “You Can’t Get There From Here” post. On the left are the correlation results; on the right, the covariance results.


Fig. 1: Results of split verification experiments using ground stations only. Left – correlation. Right – covariance.

Read the rest of this entry »

Posted in Uncategorized | 15 Comments »

Sanity Check

Posted by Jeff Id on July 11, 2009

Since we’re considering trashing our global economy over a potential couple of degrees C of warming, here’s an interesting graphic. It assumes humans have added 30% of the 380 parts per million of CO2 in the atmosphere (the rise from 280 to 380 ppm). How much is that? Find the red dot at the tip of the arrow.
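For those who want the numbers behind the dot, here is the simple arithmetic under the post’s assumed 280 to 380 ppm rise; whether the human share works out nearer a quarter or a third depends on which baseline you divide by.

ppm_now <- 380                    # current CO2 concentration, parts per million
ppm_pre <- 280                    # assumed pre-industrial concentration
human   <- ppm_now - ppm_pre      # assumed human addition: 100 ppm
human / ppm_now                   # about 0.26 of today's CO2
human / ppm_pre                   # about a 0.36 increase over the pre-industrial level
human / 1e6                       # about 0.0001 of the whole atmosphere, i.e. roughly 0.01%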

Read the rest of this entry »

Posted in Uncategorized | 45 Comments »

Double O zero, Licensed to Speak

Posted by Jeff Id on July 10, 2009

Dr. James Hansen, hysteriologist and climatemongerer, wrote an article slamming the Obama Cap and Steal Ponzi scheme, calling it exactly what it is:

For all its “green” aura, Waxman-Markey locks in fossil fuel business-as-usual and garlands it with a Ponzi-like “cap-and-trade” scheme. Here are a few of the bill’s egregious flaws:

  • It guts the Clean Air Act, removing EPA’s ability to regulate CO2 emissions from power plants.
  • It sets meager targets — 2020 emissions are to be a paltry 13% less than this year’s level — and sabotages even these by permitting fictitious “offsets,” by which other nations are paid to preserve forests – while logging and food production will simply move elsewhere to meet market demand.
  • Its cap-and-trade system, reports former U.S. Undersecretary of Commerce for Economic Affairs Robert Shapiro, “has no provisions to prevent insider trading by utilities and energy companies or a financial meltdown from speculators trading frantically in the permits and their derivatives.”
  • It fails to set predictable prices for carbon, without which, Shapiro notes, “businesses and households won’t be able to calculate whether developing and using less carbon-intensive energy and technologies makes economic sense,” thus ensuring that millions of carbon-critical decisions fall short.

Hansen Arrested

Hansen Arrested for Science

The biggest scam in this bill may be the free pass for insider trading. Unfortunately, Hansen failed to mention the obvious potential for manipulation through cooperation with powerful politicians. What happens to carbon credit prices when the repeatedly self-labeled ‘Most Powerful Woman in the World’, Nancy Pelosi, makes a strong statement about tighter regulation, or surprises everyone with a conciliatory declaration of no intent to increase caps, or even a potential reduction? Just a few words and her campaign contributors have a big payday. This is Chicago/Moscow/Venezuela politics, brought to you by the Obaminator. Don’t be fooled by your friendly government people; taxation with representation ain’t much better.

Of course Hansen needs global warming to maintain his manly demeanor, expensive travel budget and powerful presence on the world stage. Apparently licensed by Michael Mann’s Real Climate ‘scientists who are allowed to speak’ program, Hansen’s article pontificates about the glorious advantages of economic stimulus by wealth redistribution.  Stalin would be proud.

Read the rest of this entry »

Posted in Uncategorized | 4 Comments »

Fantastic High Resolution Video of Sea Ice

Posted by Jeff Id on July 9, 2009

This link is provided by Bremen University and uses the AMSR-E sensors to compute the sea ice.  I recommend right-clicking and downloading to your hard drive for viewing.  The files are only 20 MB in size, so the download time isn’t that long.  The advantage of these videos over my own is a much higher resolution, which really reveals the flow patterns of the sea ice.  You can see the currents in the Arctic push in through the Bering Strait, melting away the ice in the summer.  Thanks to DeWitt Payne for emailing the link.

Arctic Sea Ice Video

The Antarctic video is below.

Read the rest of this entry »

Posted in Uncategorized | 4 Comments »

Off the Deep End

Posted by Jeff Id on July 8, 2009

My god, Mike Mann is full of himself. His rubbish math and media appearances have taken him right off the deep end. Now he’s demanding a new class of scientist – a super scientist who apparently can work in public policy, complete with a reward system (publicly funded, I assume).

Given that we (scientists) are part of the problem, it must stand to reason that we are also part of the solution. And indeed, this is a primary thesis advanced by Mooney and Kirshenbaum. The authors argue that we must fundamentally reinvent the way that scientists are trained, so as to encourage and reward those who choose to serve as much-needed science liaisons and science communicators. Indeed, the reward system must be reworked in such a way as to facilitate the establishment of a whole new class of scientists, so-called ‘science ambassadors’ who are rigorously trained in science, but have the proclivity and ability to engage in the broader discourse and to help bridge the growing rift between the ‘two cultures’. We can no longer rely on pure serendipity that figures such as Sagan will just come along. We must be proactive in establishing a pipeline of scientists who can fill this key niche.

Science Ambassadors:

Read the rest of this entry »

Posted in Uncategorized | 27 Comments »

Area Weighted TPCA Check

Posted by Jeff Id on July 7, 2009

I put these images together from Ryan’s latest post and my own closest-station post to show how well truncated PCA does at locating station information at the correct grid points. TPCA has no way of knowing exactly where each station is located. Ryan added eigenvector weights to each surface station during imputation, which improves the odds that, as the solution converges, the station information ends up in the correct grid area. We think of this as a sanity check to demonstrate that the method is working; a sketch of the kind of comparison involved follows the figure. My own opinion is that it’s doing an excellent job now, and the Antarctic reconstruction by expectation maximization is repaired thanks to Ryan’s huge efforts.


Figure 1: Ryan’s reconstruction, 28 PCs
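A minimal sketch of the kind of check behind these images: for each surface station, pick the nearest reconstruction grid cell and compare its trend with the station’s own trend. The inputs (recon, years, grid_lat, grid_lon, stn_lat, stn_lon) are hypothetical placeholders, not objects from Ryan’s actual script.

# Index of the grid cell nearest a station, by great-circle distance
nearest_cell <- function(lat, lon, grid_lat, grid_lon) {
  cosd <- sin(lat * pi/180) * sin(grid_lat * pi/180) +
          cos(lat * pi/180) * cos(grid_lat * pi/180) * cos((lon - grid_lon) * pi/180)
  which.min(acos(pmin(1, pmax(-1, cosd))))
}
# Decadal trend of the reconstruction at the grid cell nearest station i,
# to be compared against that station's own decadal trend
trend_at_station <- function(i, recon, years, grid_lat, grid_lon, stn_lat, stn_lon) {
  cell <- nearest_cell(stn_lat[i], stn_lon[i], grid_lat, grid_lon)
  coef(lm(recon[, cell] ~ years))[2] * 10
}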

Read the rest of this entry »

Posted in Uncategorized | 38 Comments »

tAV to REALCLIMATE: YOU CAN’T GET THERE FROM HERE

Posted by Jeff Id on July 6, 2009

A few notes from Jeff,

First, this is exciting work Ryan has done.  The Air Vent has been lucky to have great guest posts lately.  The work by Ryan and Nic has been excellent in improving my own understanding of the original Steig et al. reconstruction and of methods for improving it.

Ryan has been working on a concept he had for improving two aspects of the original reconstruction.  The first is that the imputation of Steig et al. is actually backwards: the Steig reconstruction is a reconstruction of satellite surface skin temperature (AVHRR) rather than the surface temperature reconstruction it is billed as, and it needs to be recalibrated to match the surface station trends to be considered a valid surface station reconstruction.  As we know from previous work here, the trends of the Steig reconstruction are substantially different from the surface station trends.  The second point has to do with RegEM converging to a local rather than a global minimum.

Ryan employs a weighting method in a TSVD reconstruction which involves two separate steps.

The method Ryan used to correct both of these problems involves pre-weighting the surface stations relative to the satellite information.  By applying a large, equal multiplier to the surface stations relative to the satellite PCs, Ryan gives a strong weighting to the surface stations versus the satellite data and removes the need for a post-reconstruction calibration.
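A minimal sketch of the pre-weighting idea, under my own assumptions about the data layout: X is a hypothetical combined matrix with the surface stations in columns stn_cols and the satellite PCs in the remaining columns, impute() stands in for the TSVD infilling routine, and the multiplier value is purely illustrative.

# Stand-ins so the sketch runs; in practice these come from the reconstruction data
X <- matrix(rnorm(50 * 12), nrow = 50)          # hypothetical combined matrix (stations + sat PCs)
stn_cols <- 1:8                                 # hypothetical columns holding the surface stations
impute <- function(M) M                         # placeholder for the TSVD infilling routine

w <- 100                                        # large, equal weight applied to every station column
Xw <- X
Xw[, stn_cols] <- Xw[, stn_cols] * w            # up-weight the stations relative to the satellite PCs
filled <- impute(Xw)                            # run the infilling on the weighted matrix
filled[, stn_cols] <- filled[, stn_cols] / w    # undo the weighting afterwards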

The second weighting is more clever and more important to climate science.  While Ryan explains it pretty well below, sometimes two explanations help, and this mathematical step should be very important in EM processing of similar data fields.  In EM infilling where a large quantity of data is missing, spurious correlations can occur and a non-global minimum can be reached.  In this case large portions of the data are missing.  Imputing one PC at a time, Ryan weighted the individual surface stations by the PCA eigenvector weighting of the original AVHRR data.  This means areas with low information content for the PC are less likely to accidentally become heavily weighted as the iterations progress – very important.  Think of it as an improved starting point for the iteration, or an improved station location in the imputation, rather than leaving RegEM to figure out where the stations belong.
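A minimal sketch of the eigenvector weighting for a single PC, again with hypothetical names: eig is the (grid cells x PCs) AVHRR eigenvector matrix, stn_cell maps each station to its nearest grid cell, and stations is the station anomaly matrix. This is my paraphrase of the idea, not Ryan’s actual code.

weight_stations_for_pc <- function(stations, eig, stn_cell, k) {
  w_k <- abs(eig[stn_cell, k])        # eigenvector magnitude at each station's grid cell, for PC k
  w_k <- w_k / max(w_k)               # normalize so the most heavily loaded station gets weight 1
  sweep(stations, 2, w_k, "*")        # down-weight stations where PC k explains little variance
}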

Ryan, please feel free to correct any details you see wrong with this description.

Of course Ryan couldn’t resist a little discussion with RC 😉 .  Read on, I think you’ll like it.

============================================================================

Guest post by Ryan O

Most of you are aware that Dr. Steig posted a response to our reconstructions over at RealClimate. The link is here: http://www.realclimate.org/index.php/archives/2009/06/on-overfitting/

There were two salient points in his post that we should look at. One point was that “someone called Ryan O” had obtained better verification statistics by first calibrating the satellite data to ground data. This point is easily addressed. To all RealClimate readers: everything that follows was done using the cloud masked AVHRR data provided by Dr. Steig as-is. No calibration. In fact, given the way our reconstructions are done, any such calibration would not affect the results. This, too, will be shown later.

The second point is that Dr. Steig claimed that the verification statistics were degraded as additional AVHRR PCs were included. This is certainly true, if you use the original tools (RegEM TTLS) and the original methodology (impute the whole mess at once). I think this point was lost at RealClimate. There are problems with the method and the math behind the method – so we changed the method to address these issues.

Here is a short summary of the issues:

1. TTLS assumes errors in both the predictors and predictands. However, an error in a PC (which is an abstract quantity) does not mean the same thing as an error in a temperature measurement at a specific location. Additionally, if the pre-1982 portion is calculated based on assuming errors in the ground stations and the PCs, then it is inappropriate to simply add the original, unmodified PCs onto the end. The post-1982 solution needs to be calculated the same way as the pre-1982 solution: assuming errors in both.

2. While Steig refers to the ground stations as the predictors and the satellite PCs as the predictands, in their method, this is not strictly true. Any existing values are the predictors and the missing values are the predictands. This means the ground stations and satellite PCs are both predictors and predictands. Not only that, but the satellite PCs affect each other’s imputation by interacting with each other and with the ground stations.

3. Because the solution is based on the truncated SVD of the correlation matrix, the pre-1982 portion of the AVHRR PCs is not truly an extrapolation of the PCs. It is a rotation of the PCs to the ground station data. This means that the original AVHRR PCs should not simply be tacked on to the end. The rotated PCs should be used from 1957 to 2006. The standard RegEM TTLS algorithm does not return the rotated (unspliced) solution (though Nic L’s modification does return that solution). This problem can be done as an extrapolation, but Steig’s method does not accomplish that.

4. The ground stations are used to predict PC values without regard to whether the PC explains any variance at the station location. This is not necessarily a problem – unless you subsequently recover gridded temperatures using the eigenvectors. Because Steig uses the eigenvectors to recover gridded temperatures, they must be used to constrain the imputation. RegEM TTLS has no means of doing this.

5. An insufficient number of PCs is used. The claim that 3 PCs can represent a land mass the size of Antarctica, when the ERSST reconstructions required 15+ PCs to represent open-ocean areas of equivalent size, defies belief.

Some of these issues cannot be resolved when using RegEM TTLS. To that end, we started using a different imputation tool based on a truncated SVD approach (originally written for R by Steve McIntyre). The benefits of the truncated SVD approach are that it is faster, allows direct access to the unspliced solution, and is simpler to understand. That last benefit is important because we will discover that we need to modify the truncated SVD approach to address some of the methodological problems.
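Since the truncated SVD infilling approach is central to what follows, here is a minimal sketch of the general idea (in the spirit of the routine described above, not the actual code used in the reconstructions): fill the missing values with a starting guess, take a rank-k SVD approximation, replace only the missing entries with the approximation, and iterate to convergence.

tsvd_infill <- function(X, k, tol = 1e-6, maxit = 500) {
  # X: anomaly matrix with NAs for missing values; k: truncation parameter
  miss <- is.na(X)
  Xf <- X
  Xf[miss] <- 0                                        # crude starting guess for the missing entries
  for (i in 1:maxit) {
    s <- svd(Xf, nu = k, nv = k)
    approx_k <- s$u %*% diag(s$d[1:k], k) %*% t(s$v)   # rank-k approximation of the filled matrix
    delta <- max(abs(approx_k[miss] - Xf[miss]))       # how much the infilled values moved
    Xf[miss] <- approx_k[miss]                         # update only the missing entries
    if (delta < tol) break
  }
  Xf
}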

Read the rest of this entry »

Posted in Uncategorized | 72 Comments »

Declaration of Independence

Posted by Jeff Id on July 4, 2009

It’s good to remember where we’ve been so we have perspective on where we’re going.

Declaration of Independence

The complete text.

Original spelling and capitalization have been retained.

Read the rest of this entry »

Posted in Uncategorized | 17 Comments »

Unprecedented Again

Posted by Jeff Id on July 3, 2009

On Science Daily, which I read fairly often, I found this headline.

Sea Ice At Lowest Level In 800 Years Near Greenland

The title makes you instantly skeptical because sea ice isn’t even at its lowest level of the last two years. Of course, we then have to wonder how they determined sea ice levels going back 800 years.

ScienceDaily (July 2, 2009) — New research, which reconstructs the extent of ice in the sea between Greenland and Svalbard from the 13th century to the present indicates that there has never been so little sea ice as there is now. The research results from the Niels Bohr Institute, among others, are published in the scientific journal, Climate Dynamics.

What do you know, it’s another tree ring study in noise blending.

There are of course neither satellite images nor instrumental records of the climate all the way back to the 13th century, but nature has its own ‘archive’ of the climate in both ice cores and the annual growth rings of trees and we humans have made records of a great many things over the years – such as observations in the log books of ships and in harbour records. Piece all of the information together and you get a picture of how much sea ice there has been throughout time.

If someone has a copy of the PDF for this paper, I would appreciate it. In the meantime, I’ve copied the abstract below. It sounds straight from the team bendahockeystick playbook. The abstract can be found HERE

Read the rest of this entry »

Posted in Uncategorized | 15 Comments »

Hubris

Posted by Jeff Id on July 2, 2009

On RC the entire team offers this statement in defense of the endless ‘faster and worse than expected’ with which we the unwashed are continually bombarded. LINK HERE

Some aspects of climate change are progressing faster than was expected a few years ago – such as rising sea levels, the increase of heat stored in the ocean and the shrinking Arctic sea ice.

Since I’ve only looked at sea ice myself and have gained some knowledge from several months of study, let’s look at the scientifically moot point of Arctic sea ice trend.


3. Arctic Sea Ice. The Synthesis Report states:

One of the most dramatic developments since the last IPCC Report is the rapid reduction in the area of Arctic sea ice in summer. In 2007, the minimum area covered decreased by about 2 million square kilometres as compared to previous years. In 2008, the decrease was almost as dramatic.

This decline is clearly faster than expected by models, as the following graph indicates.

This is one graph they present to show how wrong an accomplished scientist like Dr. Pielke is.

https://i0.wp.com/www.realclimate.org/wp-content/uploads/_45146192_ice_extent_466.gif

Figure 1 Arctic Ice vs Models

So the model sits above the observations by 1-2 million square kilometres from 1977ish until 2009, and they just now realize that perhaps the “model” may have a problem. I’m still a rookie at climate science (for about one more month), but you can’t sell me this nonsense and expect me to pay. Ain’t happenin’, Drs.

Read the rest of this entry »

Posted in Uncategorized | 22 Comments »

WUWT Cloud Creation

Posted by Jeff Id on July 1, 2009

Everyone who hasn’t seen the slide show on cloud aerosol should take a look at this post at WUWT.

http://wattsupwiththat.com/2009/07/01/message-in-the-cloud-for-warmists-the-end-is-near/

Posted in Uncategorized | 6 Comments »