the Air Vent

Because the world needs another opinion

Archive for August, 2009

Area Weighted Antarctic – Offset Reconstructions

Posted by Jeff Id on August 31, 2009

Ok, I know you guys are tired of area weighted reconstructions, but for those who say publish, this detail is important. What I’ve been attempting to do is verify that the anomaly based area weighted reconstruction is of good quality. As we discussed before, even two thermometers recording exactly the same values can have different anomalies when their timeframes differ. As an example, assume we have a noiseless linear upslope in temp for 20 years. Thermometer 1 measures for all 20 years and thermometer 2 measures for only the last 10, but both record exactly the same numbers. When anomalized, the mean of each record is zero, so the thermometer 2 anomaly will sit lower than thermometer 1 and the average will have a sudden step when the second thermometer is introduced.

These steps can be corrected by looking at the beginning of the new record and making sure it is offset to match the longer term record. This method implicitly assumes that both records are the same, even though we’re not sure what the heck thermometer two would have measured had it existed for the same period as thermometer 1. Confounding the issue is the fact that we’re looking at tenths or hundredths of a degree C in noisy data whose anomaly can vary by 10C from month to month.
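The step artifact is easy to reproduce (a toy illustration, not the reconstruction code: just the two noiseless upslopes described above, anomalized separately and averaged):

```r
# Two thermometers recording identical noiseless upslopes; #2 starts 10 years late.
temps <- seq(0, 1.9, by = 0.1)              # 20 "annual" values
t1 <- temps                                  # full 20 year record
t2 <- c(rep(NA, 10), temps[11:20])           # last 10 years only

anomalize <- function(x) x - mean(x, na.rm = TRUE)
avg <- rowMeans(cbind(anomalize(t1), anomalize(t2)), na.rm = TRUE)

# The averaged anomaly jumps down at year 11 when thermometer 2 enters,
# even though both instruments measured exactly the same temperatures.
step <- avg[11] - avg[10]                    # -0.15 instead of the true +0.10
```

The yearly increment should be +0.10, but the instant the second (identically-reading) thermometer enters, the average drops instead, purely because its anomaly is centered on a different mean.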

To calculate reasonable offsets for the shorter surface station records, an algorithm was created that starts from the earliest 1957 ground station records and works its way forward. When a new station is introduced, it finds the closest already-offset station and computes an offset for the new station, in the hope that we can remove the step. Unfortunately, the noise level of the data makes the whole process less simple than we might imagine. First, the anomaly data with no offsets looks like this.

Read the rest of this entry »

Posted in Uncategorized | 13 Comments »

The Best Laid Plans

Posted by Jeff Id on August 30, 2009

My job for the Antarctic paper so far has been simple: present an area weighted reconstruction of the surface stations as a simple sanity check on the serious math reconstructions Ryan and Nic have been doing, which I’ve been reading along on by email. So if you’ve been visiting once in a while you know I’ve presented several area weighted reconstructions. The algorithm I prefer so far is one which infills each satellite gridcell with the closest station having data for that year. I’ve created several ‘dozen’ versions; the latest one worth publishing was shown in Some Other Area Weighted Reconstructions, and others were presented in Area Weighted TPCA Check.

Since the latest reconstructions utilize the noisy automatic weather stations, which weren’t employed until after 1982, there are a lot of new series starting in the second half of the data. This creates some problems. First, we need to recognize that the manned stations are more consistent and of better quality than AWS stations, which can become buried in snow for extended periods of time. A second issue is also important: each anomaly is centered about its mean, i.e. average = 0. So suppose you have two thermometers which always measure exactly the same values, and one starts 10 years after the other. Both have a mean of zero over their own data, so if any trend exists, thermometer 2, which measures exactly the same temperature as thermometer 1, ends up with a slightly different anomaly. When the two are averaged this creates a sudden step in the data.

None of the plots presented so far correct for this factor. Since trends are fairly low, it isn’t a terrible method to use, and we can expect that whatever corrected version I come up with will give a similar result. I’ve had several ideas for fixing this annoying effect, some more stupid than others, and none of which were worth posting on.

One particularly misguided method I used was to start at 1957 – the beginning year of our reconstruction – and offset the existing Jan 1957 temps by zero. Then each time a new series came in, I took the mean of all the other stations and applied that as the offset for the new data. I got a trend of 0.12 C/Decade, which is completely out of whack and matches Steig et al. We know it’s out of whack by checking the surface station versions of the same algorithm, which typically come in at 0.04 or 0.05 C/Decade. By taking the mean, the algorithm ignored the fact that the peninsula and Ross ice shelf are heavily oversampled relative to the rest of the continent.

Ryan O made a great point that offsetting in this manner assumes that when a new station is introduced and the mean is used, the temperature record for the new station prior to its existence is the same as the continent’s. It’s an obviously incorrect assumption. In this post, I tried another method of correction. Since any offset makes an assumption about the trend of the new station prior to its own existence, the next best option is to choose the closest station with existing data to compute the offset!

Well, the algorithm I used was designed to calculate the offset between the new station and the next closest station by averaging the closest data points. I took the matrix of all ground station data on the satellite grid (63 manned and AWS stations) and ran it through a filtering algorithm. This matrix FYo (filtered Yo) was stored separately and used only for calculating offsets.

squarefilter = function(Yo, windowsize = 3)
{
  # Running square filter: each time step becomes the column means of the
  # surrounding (2 * windowsize + 1) points; window() truncates (with a
  # warning) at the ends of the series.
  FYo = Yo  # allocate memory
  for (j in time(Yo))
  {
    FYo[time(Yo) == j, ] = colMeans(window(Yo, start = j - windowsize,
                                           end = j + windowsize), na.rm = TRUE)
  }
  FYo
}

FYo = squarefilter(Yo, 2)
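To show how the filtered matrix might then be consumed, here is a hedged sketch of the offset step; the `offset_vs_neighbor` function and the toy matrix are illustrative assumptions, not the code actually used in the reconstruction:

```r
# Hypothetical sketch: compute a new station's offset against its nearest
# already-offset neighbor, averaging over the first n overlapping filtered
# data points. Columns of the matrix are stations.
offset_vs_neighbor <- function(FYo, new_col, near_col, n = 12) {
  common <- which(!is.na(FYo[, new_col]) & !is.na(FYo[, near_col]))
  use <- head(common, n)                        # earliest shared time steps
  mean(FYo[use, near_col] - FYo[use, new_col])  # add this to the new station
}

# Toy check: the neighbor runs 0.5 warmer over the overlap, so offset = 0.5
FYo_toy <- cbind(new = c(NA, NA, 1, 2, 3), near = c(0, 1, 1.5, 2.5, 3.5))
off <- offset_vs_neighbor(FYo_toy, "new", "near", n = 3)
```

Averaging filtered rather than raw values is the point of squarefilter above: it keeps a single noisy month at the start of a new record from dominating the offset.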

Read the rest of this entry »

Posted in Uncategorized | 6 Comments »

Climate Chains

Posted by Jeff Id on August 28, 2009

Being skeptical about the science is one thing but I’m a denier about the solutions. Here’s a trailer which was fun to watch. The real scam of global warming isn’t the exaggerated science, it’s the solution.


Climate Chains Trailer - Click for YouTube

Posted in Uncategorized | 6 Comments »

Ten Unprecedented Points

Posted by Jeff Id on August 28, 2009

1 – Historic temperature reconstructions are critical for determining how much energy to put into the all-important CO2 mitigation effort. Advocates like to redirect this point when a reconstruction fails quality control.

2 – There are no verified temperature proxies. NOT one proxy has been verified to be associated with temperature in an engineering-style test: ice cores, speleothems, tree ring widths, tree ring latewood density, boreholes, pollen, sediments, mollusk shells or historic records. They are uncalibrated and completely unverified, and several may be completely unrelated.

Read the rest of this entry »

Posted in Uncategorized | 24 Comments »

Artifacts

Posted by Jeff Id on August 27, 2009

Today a post from Curt brought my attention to the Von Storch 2004 paper, where some of the same effects I’ve been discussing are covered. Recently a new blog called climatesight clipped my post which addressed the specious claims about conservatism, skepticism and denial. The claimed reason was apparently a lack of references. When I inquired which points required references, those questions were clipped too. If I were a naive person, I would consider that the two links back to the hockey stick page at the top here were the problem, but actually she likely saw me as too dangerous to her cause. Tamino’s training her well for her future climatology career, where censorship and obfuscation go a long way.

Still, extra references to base your work on are often good, and several papers have been published demonstrating and discussing some of the effects that I’ve demonstrated here. I find most of them far too understated for my liking, but at least they make the point.

Here’s a quote from Von Storch and Zorita 2004 (VZ04), my bold.

Hints of the underestimation of low-frequency variability by empirical reconstruction methods have been found in previous studies, based either on short data sets (17) or climate simulations with fixed external forcing (23). In a study based entirely on an instrumental data set (17), the spectrum of the difference between the reconstructed and observed global mean annual temperature is, albeit consistent with a white noise assumption, slightly red. In a further analysis of an instrumental data set and data from a long control simulation with the Geophysical Fluid Dynamics Laboratory climate model (with constant external forcing) and a relatively short simulation of 143 years driven by varying external forcing (23), the spectra of the temperature differences from the analysis of control simulation are red (although again statistically compatible with white noise assumption). In this externally forced simulation, it was found that the temperature reconstructions are biased if the external forcing leads to nonstationary behavior in the verification period. In a long control simulation (1000 years) with the model ECHO-G (24), the spectrum of the reconstructed annual global temperature underestimates the spectrum of the simulated global temperature at very low frequencies.

Read the rest of this entry »

Posted in Uncategorized | 54 Comments »

More Hockey Mathmagic.

Posted by Jeff Id on August 27, 2009

In my last post we looked at the change in historic signal magnitude as it relates to the signal/noise ratio of proxies used in CPS. This post takes the next step and explores what happens to the signal quality as we search for ever increasing r values. I have a surface plot as well, but the surface is difficult to interpret. The best method I found to show this variance is a video of the 2D graphs at different r values. The video assumes a fixed signal to noise ratio matched to the Schweingruber MXD latewood proxies.

First, recall that the signal has a true peak amplitude of 1 (figure 2), which is added to 10,000 ARMA simulated proxies. The signal is shown below.


Figure 1, Artificial signal added to proxies

Figure 3 is actually a video linked on YouTube which starts when you click on it; the frame shown is CPS using a very low r value of 0.01. In this frame, the only proxies rejected are those which have a negative correlation to the signal we’re looking for. In this case, the temperature signal we’re looking for is a linear upslope from 0 to 1 in the last 100 years, which exactly matches the artificial signal placed in the proxy data shown in Figure 1.

Note that the amplitude of the recovered signal is reduced to about 0.3 from an initial value of 1. This is caused by scaling the standard deviation of each proxy over the calibration range to match the standard deviation of the 0 – 1 temperature signal we’re looking for. Adding noise to a signal always increases its standard deviation! Think about that: on average, no random signal can be added which decreases the standard deviation of the proxies.
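The de-amplification is easy to check numerically. Below is a minimal sketch of the CPS scaling step only (no correlation screening), with plain white noise standing in for the ARMA proxies used in the post:

```r
set.seed(42)
signal <- seq(0, 1, length.out = 100)   # 0-1 upslope "temperature"
nprox  <- 1000
# proxies = signal + unit-variance white noise
proxies <- replicate(nprox, signal + rnorm(100))

# CPS-style scaling: match each proxy's SD to the target signal's SD
scaled <- apply(proxies, 2, function(p) (p - mean(p)) * sd(signal) / sd(p))
recon  <- rowMeans(scaled)

# Noise inflates sd(p) above sd(signal), so the composite is de-amplified:
amplitude <- max(recon) - min(recon)    # well below the true amplitude of 1
```

Every proxy’s standard deviation is inflated by the noise, so dividing by sd(p) shrinks the embedded signal in each one, and averaging cannot restore the lost amplitude.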

Read the rest of this entry »

Posted in Uncategorized | 7 Comments »

More on CPS Signal Distortion

Posted by Jeff Id on August 26, 2009

I’ve been playing around with CPS some more, trying to figure out how to correct for the signal de-amplification created by correlation based proxy sorting. As I explained in Historic Hockey Stick – Pt 2 Shock and Recovery, correlation does not respond linearly to noise. I’ve re-written that post several times to improve it, so it reads completely differently than before. This post was created using the same 10,000 ARMA simulated proxies as noise, adding a new signal with a square wave in the historic portion. By averaging across the square wave, which was spread over 200 years, we get a nice calculation of the signal magnitude. Figure 1 is the shape of the signal used.


Figure 1, Artificial Signal

Now, to explore the effects of different noise levels, a multiplier from 0 to 1 in 0.01 steps was applied to the ARMA data. Each calculation used the Figure 1 signal + proxy * multiplier, which had the effect of adding noise at levels from 0 to 1 times the proxy data. Since the proxy data had a standard deviation very close to 1, you can think of the multiplier as the standard deviation of the noise.
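The sweep can be sketched like this (an illustration under stated assumptions: white noise stands in for the ARMA proxies, and the simple `cps_recon` composite here is a hypothetical stand-in for the full CPS step with correlation sorting):

```r
set.seed(1)
years  <- 600
signal <- c(rep(0, 200), rep(1, 200), rep(0, 200))   # square wave in the historic portion
nprox  <- 500
noise  <- matrix(rnorm(years * nprox), years, nprox) # stand-in proxies, sd ~ 1

# Simple composite: scale each proxy to the signal's SD, then average
cps_recon <- function(prox) {
  scaled <- apply(prox, 2, function(p) (p - mean(p)) * sd(signal) / sd(p))
  rowMeans(scaled)
}

multipliers <- seq(0, 1, by = 0.01)                  # 101 noise levels
recons <- sapply(multipliers, function(m) cps_recon(signal + noise * m))
# recons is years x 101; column 1 (zero noise) recovers the full square wave,
# later columns show progressively de-amplified versions.
```

Plotting the 101 columns against time gives exactly the kind of surface described next, with the recovered square wave flattening as the multiplier grows.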

Since we have 101 individual CPS reconstructions of varying quality, a surface plot does a good job depicting the shape change in the recovered signal. The RMS axis is noise/signal because the plotting program needed ascending values. Each individual reconstruction is plotted along the time axis.


Figure 2, Surface plot of CPS reconstructions at different noise levels

Read the rest of this entry »

Posted in Uncategorized | 26 Comments »

Your Glorious Government

Posted by Jeff Id on August 25, 2009

Tom Fuller has a follow-up letter from Alan Carlin, the government employee who was disciplined for pointing out potential issues with the EPA document. Alan’s views differ from my own on several technical details, but I share his desire for impartial review of climate science for the purpose of policy making, and we are in agreement about the true nature of the EPA.

Rather than take excerpts out and post them I’ll simply provide a link.

Update on Alan Carlin’s refusal to bow to the EPA brass.

Please consider giving him some support in the way of comments on Tom’s blog; the man has basically risked any future career at the EPA simply for trying to tell the truth as he sees it. Your support holds those in power back through fear of backlash and lets them know we’re still paying attention.

It will be interesting to see how they play this out.

Thanks to JAE for this link.

Posted in Uncategorized | 11 Comments »

Often Wrong

Posted by Jeff Id on August 25, 2009

Anthony Watts has a post on this article at WUWT but it deserves a bit of venting here.

U.S. Chamber of Commerce seeks trial on global warming

I wanted to highlight some of the leftist attempts at defining and marginalizing anyone skeptical of government funded science. The article was written by Jim Tankersley in the never right but often wrong LA Times.

The U.S. Chamber of Commerce, trying to ward off potentially sweeping federal emissions regulations, is pushing the Environmental Protection Agency to hold a rare public hearing on the scientific evidence for man-made climate change.

Sounds fine to me; there are valid reasons to believe government funded claims of settled science are false. From my viewpoint, the recent attempts to declare consensus only add weight to the skeptics’ case. There are many things to be skeptical about within the defined boundaries of climate science, and the fact that these details are largely ignored by the declaration of consensus cannot be overstated; it should raise red flags everywhere. Three critical items off the top of my head which are key to the global warming case and yet poorly defined are moisture feedback, solar forcing (past and present) and natural variability. There are many others as well.

As an example, what is wrong with calculating or discussing the magnitude of natural variability? Has it truly been defined? So far the math I’ve seen says not just no but HELL no! What should we do if we find natural variability is far greater than CO2 warming can reasonably be? Would we be capable of controlling the sun? My own opinion is we would then not be capable of stopping the predicted disasters and should instead work on better quantifying and coping with them. From my reading there is good cause to believe nature is still in charge: hockey sticks and proxies are mostly garbage, and previous ice melts are well documented. This cannot be ignored! Well, it shouldn’t be anyway.

Still that doesn’t prevent people from trying to marginalize reasonable debate.

“It would be evolution versus creationism,” said William Kovacs, the chamber’s senior vice president for environment, technology and regulatory affairs. “It would be the science of climate change on trial.”

Read the rest of this entry »

Posted in Uncategorized | 29 Comments »

Not Developing Nations

Posted by Jeff Id on August 24, 2009

One of the most frustrating terms invented by the progressive movement is ‘Developing Nation’. It implies the obviously false assumption that undeveloped nations have somehow been repressed or unlucky. Nothing could be further from the truth. In fact, undeveloped nations almost always have the most totalitarian governments in existence: Cuba, Venezuela, Zimbabwe, Egypt, Iran. Success is proportional to the freedom of the people. You can’t sell cars to poor farmers making less than $1/day.

Well, Africa finally stepped up to the climate change handout bag looking for 67 billion dollars a year, equivalent to 216 dollars from every American man, woman and child to support the cause of climate change.

Africa wants $67 bln a year in global warming funds

ADDIS ABABA, Aug 24 (Reuters) – African leaders will ask rich nations for $67 billion per year from 2020 to cushion the impact of global warming on the world’s poorest continent, according to a draft resolution seen by Reuters on Monday.

Environment and agriculture ministers from several nations are meeting at African Union (AU) headquarters in the Ethiopian capital Addis Ababa to try to agree a common stance before a U.N. summit on climate change in Copenhagen in December.

Experts say Africa contributes little to the pollution blamed for warming, but is likely to be hit hardest by the droughts, floods, heatwaves and rising sea levels forecast if climate change is not checked.

Read the rest of this entry »

Posted in Uncategorized | 20 Comments »

A Challenge to RC

Posted by Jeff Id on August 24, 2009

I just wanted to call some attention to a blog post by Dr. Pielke Sr. requesting a reply from Gavin Schmidt regarding the sensitivity of nighttime temperature measurements. Apparently they have concluded that the measurements’ sensitivity to cloud cover may be imparting as much as 30% of the trend measured from a single level (altitude) in the atmosphere.

“The stable nocturnal boundary layer does not measure the heat content in a large part of the atmosphere where the greenhouse signal should be the largest (Lin et al. 2007; Pielke et al. 2007a). Because of nonlinearities in some parameters of the stable boundary layer (McNider et al. 1995), minimum temperature is highly sensitive to slight changes in cloud cover, greenhouse gases, and other radiative forcings. However, this sensitivity is reflective of a change in the turbulent state of the atmosphere and a redistribution of heat not a change in the heat content of the atmosphere (Walters et al. 2007). Using the Lin et al. (2007) observational results, a conservative estimate of the warm bias resulting from measuring the temperature from a single level near the ground is around 0.21°C per decade (with the nighttime minimum temperature contributing a large part of this bias). Since land covers about 29% of the Earth’s surface, extrapolating this warm bias could explain about 30% of the IPCC estimate of global warming. In other words, consideration of the bias in temperature could reduce the IPCC trend to about 0.14°C per decade; still a warming, but not as large as indicated by the IPCC.”

Since Gavin is a climate modeler, Dr. Pielke went to him requesting a response of some kind to these papers. Apparently they have gone unaddressed to date, despite the implications of the conclusions.

Read the rest of this entry »

Posted in Uncategorized | 11 Comments »

Arctic Sea Ice Video Update

Posted by Jeff Id on August 23, 2009

As we approach the Arctic sea ice minimum, a lot of eyes are watching and projecting what the minimum will be. In a previous post I calculated the centroid of the sea ice as a method for determining how weather patterns were affecting the data. About a month ago it seemed the weather pattern would support a leveling off of the sea ice shrink rate, so that’s what I predicted, and that’s what happened. The curve cut across the 2008 line and reached over until it touched the 2005 line.

Unfortunately, from this centroid video, it looks like the winds from the southeast which created the huge reduction in sea ice in 2007 have restarted this year. They’re already accelerating the melt, which caused this year’s red line to dip below the 2005 green line.

The shift in weather pattern is most visible in the shadows on the ice, which are actually clouds blowing through. The shadows indicate the 29 GHz microwave data is sensitive to clouds, which is part of the noise in the long term signal.

Read the rest of this entry »

Posted in Uncategorized | 14 Comments »

Confirmation Bias

Posted by Jeff Id on August 22, 2009

Recently at the inappropriately named Open Mind, Tamino produced a ‘two box’ solution set of equations representing the atmosphere’s response to forcings from the NASA sanctioned dataset. He opened his comments with an incredibly pompous and, in my opinion, foolish paragraph on computer models, which I’ll be happy to reproduce here.

Denialists love to denigrate computer models of earth’s climate. In my opinion they only do this because they’re in denial of the result, not because of any valid evidence. They also love to make the false claim that without computer models there’s no reason to believe that global warming is real, and is going to get worse.

The term “computer model” refers to an actual simulation of earth’s climate, often in remarkable detail. Such models are (of course!) not able to predict, or even post-dict, the chaotic aspects of the system (the weather), but they do an outstanding job of post-dicting the global statistical characterization of the system (the climate).

Besides the fact that Tamino is deliberately ignoring the known uncertainties in computer models, he’s calling anyone who doesn’t believe in their accuracy a denier. Astounding, considering the models don’t even agree with each other; perhaps the models are in denial?

Read the rest of this entry »

Posted in Uncategorized | 40 Comments »

Times Taken to Task

Posted by Jeff Id on August 21, 2009

From Climate Depot, Dr. Hertzberg takes the leftist NYT to task for their intentionally faulty reporting on global warming. It’s good to see this rag get beat up; you don’t have to think too hard to figure out why they’re going bankrupt.

———

Dr. Hertzberg’s August 19, 2009 Letter To The New York Times is Reprinted Below:

Distortions and misrepresentations of your coverage of global warming/climate change

I am a scientist who has studied the theory of human caused global warming for over 20 years, and it is both saddening and offensive to me as a scientist to see the Times continuously regurgitating the fear-mongering, anecdotal clap trap it is being fed by know-nothing environmentalists and global warming propagandists in the Gore-IPCC-Hansen camp. As an example, consider the latest article in today’s Times by Cornelia Dean and her regurgitation from NOAA’s Climate Change Center:


“The agency also said that, on average, Arctic sea ice covered 3.4 million square miles in July, 12.7 percent below the 1979-2000 average and the third lowest on record after 2007 and 2006”.

That description is a distortion and a complete misrepresentation of the actual data. For your benefit, I have attached the comprehensive, latest data record from Ole Humlum’s web site under the heading of “Climate4you June 2009.” From the data on page 11 of that site, one obtains the following record for ice coverage for the months of July from 2002 until 2009 (after converting square kilometers to square miles):
Read the rest of this entry »

Posted in Uncategorized | 9 Comments »

SteveM Got a Little Press

Posted by Jeff Id on August 21, 2009

On the 18th, Kevin Libin of the National Post picked up the HadCrut idiocy. As a good reporter should, he tried to make sense of the claims at face value. It’s fun to watch him struggle with it, because I sure have.

Kevin Libin: You’ll just have to take our word on the global warming stuff

But probably nothing could damage the credibility of climate change believers more than the recent revelation by the Climatic Research Unit (CRU) that it has lost or destroyed all the original data used to construct historic global temperature records. The CRU, at the University of East Anglia in the UK, which has been using information collected from weather stations across the globe for decades, is probably the most widely cited source worldwide for those mounting a case that the earth has exhibited an inexorable warming trend: its website boasts that CRU’s research has “set the agenda for the major research effort in, and political preoccupation with, climate research.” The critical raw climate data responsible, which scientists of all climate-creeds have a natural interest in, is now gone, apparently, forever. With the exception of a handful of countries that the CRU has agreements with to sell its data, all that remains for the bulk of the statistics are “value added” versions, which is to say, consolidated, homogenized data. Actually, the CRU says it doesn’t even have all the data for countries it has data-sharing agreements with. “We know that there were others, but cannot locate them, possibly as we’ve moved offices several times during the 1980s,” the CRU writes in a rather embarrassing explanation for all this posted on its website.

It’s good to see the press going after this ridiculous covering up of data. It’s very clear that HadCrut has the data; after all, they post global averages from it. Claims that they don’t have the raw data are untrue in my opinion; they are acting like people with something to hide. Interesting, considering they have the highest warming trend of any temperature dataset. Read the rest of this entry »

Posted in Uncategorized | 5 Comments »

 