the Air Vent

Because the world needs another opinion

Archive for March, 2014

The Truth Will Out

Posted by Jeff Id on March 28, 2014

Richard Tol

“It is pretty damn obvious that there are positive impacts of climate change, even though we are not always allowed to talk about them,” he said.

Article here: Simple reality just the way normal people like it


Posted in Uncategorized | 30 Comments »

Good news for stocks folks – The End of the World is Moved Back – again!

Posted by Jeff Id on March 22, 2014

It’s unfortunate that I don’t have time for blogging, because I’m sure there is some easy math to play with in this article by Michael Mann. The article is essentially “climate porn” for believers who are still praying for the end of the world. In it, even he admits that temperatures aren’t rising, although he stays away from recognizing that the climate model gods have failed him. The intent of the article seems to be to move the bar of absolute global doom, apparently because the last bar was missed.

[Image: sandwich board man warns us of impending doom]

Who could have guessed that on pre-apocalyptic Earth, the end-of-the-worlders would have created such lucrative and easy jobs for themselves.

The title of the article was originally – False Hope. The text of the article leads us to the understanding that we have false hope that the world WON’T end, that hope being given to us by the fact that the world ISN’T warming as the hundred-plus-billion-dollar climate sandwich board industry predicted. None of those problems have reached the level where “Scientific American” is concerned about its own credibility, as the magazine has prostrated itself before that particular climate god so many times in the past that publishing climate-based nonsense from the right people is reflexive at this point.

The new signpost of doom is planted by Mann, firmly in the soil at 2036, with this improved title:

Earth Will Cross the Climate Danger Threshold by 2036

Interestingly, Mann was able to write in climate sensitivity levels far lower than I would have expected without himself being called a “denier.” There are plenty of controversial statements in the article, starting with this one, which I challenge anyone to provide proof of:

Most scientists concur that two degrees C of warming above the temperature during preindustrial time would harm all sectors of civilization—food, water, health, land, national security, energy and economic prosperity.

I believe most actual “scientists” would say the opposite: rather, two C would be beneficial from food, water, health, and land perspectives. The national security nonsense exists for the purpose of collecting money from that government channel and for no other reason. However, I don’t have proof that more scientists claim 2 C is good for humans than Mann has for his fabricated claim that MOST scientists believe it will be bad. At least in my case I admit it.

The two C by 2036 claim is most interesting because that is only 22 years away and we have only warmed about 0.8 C since 1950, i.e. THE global warming years. Despite the pause, which he admits actually exists in this article, Mann is predicting a MASSIVE increase in the global warming trend over the next 22 years, on the order of 0.6 C/decade! Of course he does it by using a 3 C climate sensitivity that other climate scientists have found to be over 2X what observations show. I think even 3 C puts him toward the lower middle of the range of the now known to be defective climate models.

To my wonder, I found that for an ECS of three degrees C, our planet would cross the dangerous warming threshold of two degrees C in 2036, only 22 years from now. When I considered the lower ECS value of 2.5 degrees C, the world would cross the threshold in 2046, just 10 years later [see graph on pages 78 and 79].
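For what it is worth, here is the back-of-envelope arithmetic behind the roughly 0.6 C/decade figure above, as a minimal R sketch. The 0.8 C of realized warming and the 2014 starting date are my own round numbers, not Mann’s exact inputs.

# Rough check of the warming rate implied by "2 C by 2036" (round-number assumptions)
threshold  <- 2.0          # degrees C above preindustrial, the claimed danger level
warmed     <- 0.8          # degrees C of warming assumed already realized
years.left <- 2036 - 2014  # years remaining as of this post

rate <- (threshold - warmed) / (years.left / 10)
rate   # about 0.55 C per decade, consistent with the ~0.6 C/decade figure above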

The article is far more entertaining than what I have written here, but that is all I have time for at the moment. Feel free to copy your favorite bits below.

Posted in Uncategorized | 48 Comments »

Strike Out

Posted by Jeff Id on March 20, 2014

Bishop Hill blog has posted a fun thread on our favorite climate scientist wannabe, Dr. Stephen Lewandowsky. Apparently the moon landing follow-up psycho-paper was removed from publication entirely. A rare and embarrassing event to be sure, especially for such strong pro-science individuals as Dana Nuccitelli of Skeptical Science™ (SkS) fame.

We are told in the BH article that the post vanished from their blog shortly after publication, probably because the blatant lack of objectivity didn’t quite reach the bar for a typical SkS post. The whole mess is stuck in the Google cache for us to read here.

My favorite quote in the deleted article is this:

Lewandowsky, known for his creative publication titles, came up with another doosey for the follow-up paper:

I’m glad Lew is known for his titles, cause it ain’t his sciency skills that are going to put food on the table.

Dana writes:

Frontiers Bails Out

However, nobody likes being called a conspiracy theorist, and thus climate contrarians really didn’t appreciate Recursive Fury.  Very soon after its publication, the journal Frontiers was receiving letters from contrarians threatening libel lawsuits.  In late March 2013, the journal decided to “provisionally remove the link to the article while these issues are investigated.”  The paper was in limbo for nearly a full year until Frontiers finally caved to these threats.

In its investigation, the journal found no academic or ethical problems with Recursive Fury.  However, the fear of being sued by contrarians for libel remained.  The University of Western Australia (UWA: Lewandowsky’s university when Recursive Fury was published – he later moved to the University of Bristol) also investigated the matter and found no academic, ethical, or legal problems with the paper.  In fact, UWA is so confident in the validity of the paper that they’re hosting it on their own servers.

After nearly a year of discussions between the journal, the paper authors, and lawyers on both sides, Frontiers made it clear that they were unwilling to take the risk of publishing the paper and being open to potential frivolous lawsuits.  Both sides have finally agreed to retract Recursive Fury.

It’s unfortunate that the Frontiers editors were unwilling to stand behind a study that they admitted was sound from an academic and ethical standpoint, especially since UWA concluded the paper would withstand a legal assault.  Nobody wants to get caught up in a lawsuit, but by caving in here, Frontiers has undoubtedly emboldened climate contrarians to use this tactic again in the future to suppress inconvenient research.  Academics also can’t be confident that the Frontiers staff will stand behind them if they publish research in the journal and are subjected to similar frivolous attacks.  Frontiers may very well be worse off having lost the confidence of the academic community than if they had called the bluffs of the contrarians threatening frivolous lawsuits.

Hopefully editors of other climate-related journals will learn from this debacle and refuse to let climate contrarians bully them into suppressing valid but inconvenient research.

So it was those evil, well-funded skeptics who beat up on the poor government-funded science team, the same team that brazenly accused a bunch of people of saying and believing things they didn’t, used blatantly fraudulent statistics, and made complete asses of themselves, all in an attempt to “discredit” those who recognize that OBSERVATIONS ARE NOT WARMING AS MUCH AS CLIMATE MODELS!

Not even close.

In the non-government world where people need to produce something functional to make a living, we have a word for non-productive people like Lewandowsky and Dana.

Idiots!

And then we fire them.

Posted in Uncategorized | 9 Comments »

Proxy Hammer

Posted by Jeff Id on March 16, 2014

This idea has been in my head for some time. I think RomanM deserves the credit for the concept; I have simply applied it to tree latewood density data. The idea is to use the actual “thermal hammer” least squares functions to provide a fit to tree data that minimizes the sum of squared errors. For readers who don’t recall the method, I have several posts on it:

http://statpad.wordpress.com/2010/02/19/combining-stations-plan-b/

http://statpad.wordpress.com/2010/02/25/comparing-single-and-monthly-offsets/

http://statpad.wordpress.com/2010/03/04/weighted-seasonal-offsets/

http://statpad.wordpress.com/2010/03/18/anomaly-regression-%E2%80%93-do-it-right/

https://noconsensus.wordpress.com/2010/03/21/demonstration-of-improved-anomaly-calculation/

https://noconsensus.wordpress.com/2010/03/24/thermal-hammer/

https://noconsensus.wordpress.com/2010/03/25/thermal-hammer-part-deux/

Since it is unlikely that many will read those links, the idea behind them was to handle the anomaly calculation and the offsetting of temperature station data from different altitudes and regions in a single-step least-squares calculation. So when looking for a global average, a colder mountain-based station can be combined with a valley station in a highly optimized manner.
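To make the idea concrete, here is a toy R sketch of my own with made-up numbers (not the full RomanM machinery, which appears in the code at the end of this post): a cold mountain station and a warm valley station seeing the same regional signal are knitted together by solving for the offset that minimizes the squared error over their overlap. With only two equally weighted series, that offset reduces to the mean difference in the overlap window.

# Toy example: two stations recording the same regional signal, offset by altitude
set.seed(1)
regional <- cumsum(rnorm(120, 0, 0.1))                 # fabricated regional temperature signal
mountain <- regional[1:80]  - 5 + rnorm(80, 0, 0.05)   # cold mountain station, years 1-80
valley   <- regional[60:120] + 8 + rnorm(61, 0, 0.05)  # warm valley station, years 60-120

# With only two series, the least squares offset is the mean difference over the overlap
offset <- mean(valley[1:21] - mountain[60:80])

# Shift the valley record onto the mountain record's level and average where they overlap
valley.adj <- valley - offset
combined <- c(mountain[1:59],
              (mountain[60:80] + valley.adj[1:21]) / 2,
              valley.adj[22:61])
plot(combined, type = "l", xlab = "Year", ylab = "Anomaly (C)",
     main = "Two stations combined with an overlap offset")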

Well, some years ago, Roman left a comment about applying the methods to proxy data. I found the comment interesting enough that it hasn’t disappeared from my thoughts in all of that time. One of the main concerns in dendroclimatology is finding an optimal method for combining series of tree data such that long term signals are not suppressed.

Example:

Temperature is rising from 0-1 C over 100 years.

You have two thermometer trees (trees supposedly responding linearly to temperature with their growth), and they measure such that the first tree grows from years 0-55 and the second tree from years 45-100, so the two trees have an overlap period from years 45-55.

Results of different methods:

First, the actual temperature which occurred and the two perfect trees which recorded it. Between years 45 and 55 the black line is covered by the red line.

[Figure: temperature that occurred]

Now the typical assumption is that each nearby tree may respond a little differently to temperature due to factors such as soil condition or local competition. This makes the calibration of each tree a separate calculation. In my previous post, I used a mean of the tree ring size and got a very flat looking reconstruction with no substantial deviations from an average growth rate. The calculations subtracted the mean of each tree and scaled it by its standard deviation before averaging. Subtracting the mean is standard practice in the field of dendrochronology, followed by a variety of ad-hoc methods to restore the long term variance. If we don’t work to restore long-term variance, the following two plots show what happens.

First there are two overlapping thermometer trees with means subtracted.

[Figure: temperature that was recorded]

This is what the mean looks like after centering:

[Figure: simple reconstruction from means]

Of note is that while the series is now centered, the original temperature experienced by these thermometers was 0 to 1 degrees, and this curve is from -0.27 to 0.27. We can see that the long term variance (signal) is suppressed by the method.

Using the least-squares offset calculation functions introduced in the thermal hammer method, we can take the two offset treemometer series and calculate the ideal offset for each series.   The result is shown in the next graph:

[Figure: temperature from least squares reconstruction]

The data runs from -0.5 to 0.5, or exactly 1 degree, and has no steps in the center where the data from the two series overlaps. The great bit about this multivariate least squares approach is that you can literally ‘stuff’ as many series as you want into it and find the minimum least squares error for every single one.
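For anyone who wants to reproduce the toy example above, here is a minimal R sketch of my own construction (not the code behind the figures) showing both the variance suppression from simple centering and the recovery of the full range with an overlap offset.

# Two perfect "thermometer trees" sampling a 0 to 1 C ramp over 100 years
temp  <- seq(0, 1, length.out = 100)
tree1 <- rep(NA, 100); tree1[1:55]   <- temp[1:55]    # grows years 1-55
tree2 <- rep(NA, 100); tree2[45:100] <- temp[45:100]  # grows years 45-100

# Standard dendro-style treatment: subtract each tree's own mean, then average
t1c <- tree1 - mean(tree1, na.rm = TRUE)
t2c <- tree2 - mean(tree2, na.rm = TRUE)
centered.recon <- rowMeans(cbind(t1c, t2c), na.rm = TRUE)
range(centered.recon)   # roughly -0.27 to 0.28, the 1 degree trend is suppressed

# Least squares style fix: shift tree2 so the overlap (years 45-55) matches tree1
shift    <- mean(t2c[45:55] - t1c[45:55])
ls.recon <- rowMeans(cbind(t1c, t2c - shift), na.rm = TRUE)
ls.recon <- ls.recon - mean(ls.recon)   # re-center for plotting
range(ls.recon)         # roughly -0.5 to 0.5, the full 1 degree span is recovered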

—–

Growth curve:

The meat of the post in this case is long and takes some explanation. I’m going to brush by a few concepts without much explanation; those familiar with dendro issues won’t have trouble with this, but others may.

First, I piled the polar.mxd latewood density data into a matrix by tree age and calculated a growth curve from the mean at each age. This curve has annual-level noise which is clearly not related to the age of the tree. I calculated the curve several different ways, but since it is an ad-hoc correction for something we know exists, my preferred correction is the simple one of just low-pass filtering the signal; splines or other fits make little difference. For this data, we did demonstrate previously that there is a signal in the growth curve that should be corrected for. Still, some may not like the correction I used, as it can cause unwanted effects, so I performed the following calculations both with and without the age-based correction.

Method:

I provide software below so readers can check whether my methods are what I claim.

Trees (123 core samples) are arranged into a 400 x 123 timeseries matrix, with each tree starting at year zero.

An age curve is calculated by averaging the tree density values at each year of age.

The filtered age curve is subtracted from each tree to form the no-growth-signal matrix, ngsalltrees.

Trees are compiled into a larger matrix with the start year being the actual year in which each tree grew. Other values are NA.

No-offset simple reconstructions are done by taking row means.

Least squares reconstructions are then done to minimize the least squares error of the overlapping data across the entire matrix.

Results:

First,  we have the simple mean reconstructions without and with growth curve corrections.   Note the flatness of the long term signal.

[Figure: simple mean reconstruction, no growth curve correction]

The next graph includes the age based growth curve correction.

[Figure: simple mean reconstruction, filtered growth curve correction]

Then we start to get to the important part of this post: the first least-squares reconstruction from tree data that I have seen anywhere. To be clear, this is the same data which produced the previous two graphs.

[Figure: least squares offset reconstruction, no growth curve correction]

This next plot has the growth signal removed from each tree and is offset by least squares.

[Figure: least squares offset reconstruction, filtered growth curve correction]

Conclusions:

I think there is plenty to discuss about the shape of this curve and whether it has anything to do with temperature at all. From our previous posts, we know there is a good quality high frequency match to regional temperature data. I will need to take some time to see if trends match better. While I am still highly skeptical that this is a representation of long term temperature, this method is far more objective than the ad-hoc iterative approach used in the Briffa 2013 “no-signal” dendrochronology paper.

While I’ve noticed the propensity for people to look at these graphs and assume they are temperature, I must warn that the limitations of this method for recreating long term trends are significant. Each series knits to the other series with residual error, so as we move further in time from the calibration period, a “walk” in the series occurs. Since calibration is in recent years, when thermometers exist, growth curve accuracy is lost going back in history. I think some sensitivity analysis of the method to added noise and different growth curve strategies is in order. If other changes don’t generate significant differences though, and I don’t think they will, the last curve is simply the data, and either we accept that it represents temperature or we conclude it is something else.
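As a rough starting point for that sensitivity analysis, the sketch below (my own, with an arbitrary noise level, and it assumes the tsdat matrix, calcx.offset and ff from the code at the end of this post are already loaded) simply adds white noise to every core and watches how far the reconstruction walks between runs.

## Rough noise sensitivity sketch, assumes tsdat, calcx.offset and ff are already defined
noise.runs <- 10
recons <- matrix(NA, nrow = 1141, ncol = noise.runs)      # years 870 to 2010
for (k in 1:noise.runs)
{
	noisy <- tsdat + rnorm(length(tsdat), sd = 0.5)       # arbitrary noise level for illustration
	a <- calcx.offset(noisy, rep(1, ncol(noisy)))
	noisyoffset <- t(t(noisy) - as.vector(a))
	recons[, k] <- window(ts(rowMeans(noisyoffset, na.rm = TRUE), start = 870), end = 2010)
}
matplot(870:2010, recons, type = "l", lty = 1, col = "gray40",
	main = "Least Squares Reconstructions with Added Noise", xlab = "Year", ylab = "Unitless Growth")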

Each method used in proxy reconstructions seems to have its own disadvantages. The no-signal dendrochronology of Briffa 2013 represses the effect by removing the average signal as the growth curve and then employing an average. While the no-signal method will regularly produce a less dramatic looking chronology, it cannot capture long term variance from tree growth in significantly different altitude or soil conditions.

Although this is the first time I have ever seen this method used in proxy based reconstructions, I cannot take credit for it as RomanM of Climate Audit fame came up with the foundational concept.

I will post the code later tonight!


#ROMAN M LEAST SQUARES OFFSET FUNCTIONS
psx.inv = function(mat,tol = NULL)
{
    if (NCOL(mat)==1) return( mat /sum(mat^2))
    msvd = svd(mat)
    dind = msvd$d
    if (is.null(tol))
    {
        tol = max(NROW(mat),NCOL(mat))*max(dind)*.Machine$double.eps
    }
    dind[dind < tol] = 0                # zero out singular values below the tolerance
    dind[dind > 0] = 1/dind[dind > 0]   # invert the remaining singular values
    inv = msvd$v %*% diag(dind, length(dind)) %*% t(msvd$u)
    inv
}
### subfunction to do offsets
calcx.offset = function(tdat,wts)
{
    ## new version
    nr = length(wts)
    delt.mat = !is.na(tdat)
    delt.vec = rowSums(delt.mat)
    row.miss= (delt.vec ==0)
    delt2 = delt.mat/(delt.vec+row.miss)
    co.mat = diag(colSums(delt.mat)) - (t(delt.mat)%*% delt2)
    co.vec = colSums(delt.mat*tdat,na.rm=T) - colSums(rowSums(delt.mat*tdat,na.rm=T)*delt2)
    co.mat[nr,] = wts
    co.vec[nr]=0
    psx.inv(co.mat)%*%co.vec
}

#### load external filtering functions
source("http://www.climateaudit.info/scripts/utilities.txt")  #Steve McIntyre

### Gaussian filter
ff=function(x)
{
	filter.combine.pad(x,truncated.gauss.weights(51) )[,2]#31
}

### load briffa data
loc="c:/agw/briffa 2013/data/raw/polar/polar.mxd"
wd=c(12,6,6,6,6,6,6,6,6,6,6)
dat=read.fwf(loc,widths=wd)

treename=substr(dat[,1],1,8)
treeid=levels(factor(treename))
year = as.numeric(substr(dat[,1],9,12))
allyear=levels(factor(year))

alltrees=array(NA,dim=c(400,length(treeid)))
treemin=rep(NA,length(treeid))

### align trees by age rather than year
for (i in 1:length(treeid))
{
	mask= treename==treeid[i]
	da=dat[mask,]
	yr=year[mask]
	treemin[i]=min(yr)
	for(j in 1:length(yr))
	{
		ageindex=yr[j]-treemin[i]+1
		alltrees[ageindex:(ageindex+9),i]=as.numeric(da[j,2:11])
	}
}

mask= alltrees== -9999 | alltrees== -9990
alltrees[mask]=NA
alltrees=ts(alltrees,start=1)

#plot(alltrees[,1:10])

### center and normalize all trees by standard deviation
alltrees=t(t(alltrees)-colMeans(alltrees,na.rm=TRUE)) #center
alltrees=t(t(alltrees)/apply(alltrees,2,sd,na.rm=TRUE)) #normalize each tree to unit standard deviation
alltrees=ts(alltrees,start=1)

## calculate growth curve
## this curve is an age based mean value of each tree's growth
## The curve is low-pass filtered to remove high frequency non-age related noise

par(bg="gray90")
growthcurve=ff(ts(rowMeans(alltrees,na.rm=TRUE),start=1))
plot(growthcurve,main="Growth Signal from Briffa 2013 Polar MXD Data",xlab="Age (years)",ylab="Unitless",ylim=c(-1,1))

##no growth signal version of alltrees
ngsalltrees=alltrees-growthcurve

## create year based matrix from no growth trees start year is 870
tsdat= ts(matrix(nrow=1800, ncol=length(treeid)),start=870)
ngtsdat=ts(matrix(nrow=1800, ncol=length(treeid)),start=870)

for(i in 1:dim(ngsalltrees)[2])
{
	yearindex=treemin[i]-870
	print(yearindex)
	tsdat[yearindex:(yearindex+399),i]=alltrees[,i]
	ngtsdat[yearindex:(yearindex+399),i]=ngsalltrees[,i]
}

#################
# no growth calculations
ngaveragereconstruction=window(ts(rowMeans(ngtsdat,na.rm=TRUE),start=870),end=2010)
averagereconstruction=window(ts(rowMeans(tsdat,na.rm=TRUE),start=870),end=2010)
plot(ngaveragereconstruction,main="Simple Mean Reconstruction\nFiltered Growth Curve Correction",xlab="Year",ylab="Unitless Growth")
lines(ff(ngaveragereconstruction),col="red",lwd=3)

plot(averagereconstruction,main="Simple Mean Reconstruction\nNo Growth Curve Correction",xlab="Year",ylab="Unitless Growth")
lines(ff(averagereconstruction),col="red",lwd=3)

plot(ngaveragereconstruction)
plot(averagereconstruction)

# Least squares offset reconstruction
a=calcx.offset(tsdat,rep(1,dim(alltrees)[2]))  #.1711838
tsdatoffset=t(t(tsdat)-(as.vector(a)))
lsreconstruction=window(ts(rowMeans(tsdatoffset,na.rm=TRUE),start=870),end=2010)
plot(lsreconstruction,main="Least Squares Offset Reconstruction\nNo Growth Curve Correction",xlab="Year",ylab="Unitless Growth")
lines(ff(lsreconstruction),col="red",lwd=3)

# Least squares offset reconstruction
a=calcx.offset(ngtsdat,rep(1,dim(alltrees)[2]))  #.1711838
ngtsdatoffset=t(t(ngtsdat)-(as.vector(a)))
nglsreconstruction=window(ts(rowMeans(ngtsdatoffset,na.rm=TRUE),start=870),end=2010)
plot(nglsreconstruction,main="Least Squares Offset Reconstruction\nFiltered Growth Curve Correction",xlab="Year",ylab="Unitless Growth")
lines(ff(nglsreconstruction),col="red",lwd=3)

Posted in Uncategorized | 14 Comments »

Climate Sensitivity

Posted by Jeff Id on March 6, 2014

I want to urge everyone interested in climate change science to take the time to read this paper by Nic Lewis and Marcel Crok. Nic was a blog-famous coauthor of the O’Donnell Antarctic correction to Steig 09. His role in that article was reviewing the mathematics and software developed to do the corrected reconstruction. From that time, his publications, and some email conversations since, I happen to know that Nic is probably the most underrated scientist/mathematician working in the climate field. He has time and patience beyond most, and his work is vetted at a level we currently don’t see anywhere in climate science, in any article, skeptic or otherwise.

Oversensitive – Final

A Sensitive Matter – Final

Now Nic follows data, so the fact that he is probably considered by Real Climate types to be a skeptic is only the fault of the data.   If the data isn’t going your way, you can be sure that Nic will go that way as well.

Judith Curry has a post here

Anthony Watts has a post here

If you are a climate scientist reading these articles, open your mind and look deeply.  Ask questions of the authors.  Find the flaw.

Posted in Uncategorized | 55 Comments »