the Air Vent

Because the world needs another opinion


Posted by Jeff Id on January 25, 2014

To compare local temperature data with the Briffa 2013 polar.mxd data, I downloaded both the RSS and UAH gridded lower-troposphere satellite data. Unlike ground temperature data, the lower-troposphere product represents a layer of air extending miles above the surface of the earth (blue curve in Figure 1).

Figure 1 – From World Climate Report

UAH and RSS regional trends are shown in Figures 2 and 3.

Figure 2

Figure 3

As discussed in past posts here, UAH estimates the polar regions whereas RSS leaves them out. For fun, I plotted the difference of the two series globally and placed a green marker over the Yamal area, where the data for Briffa 2013 were taken.

Figure 4

Data from that single gridcell for each series is shown in Figures 5 and 6.

Figure 5

Figure 6

The two series are built largely from the same underlying satellite data. UAH decided to switch to station-keeping satellites in recent years, which eliminates some of the corrections needed for satellites whose orbits decay over the years. Still, the series are very highly correlated, with an r^2 of 0.93. Next, I will get the ground data series for the region, and then we can see how well things match up with the tree rings.
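The r^2 quoted above is just the squared Pearson correlation between the two monthly gridcell series. A minimal sketch of that calculation (in Python for illustration; the numbers below are invented, not the actual UAH/RSS data):

```python
# Squared Pearson correlation between two overlapping anomaly series.
# The series here are made up for illustration; the post's 0.93 comes
# from the actual UAH and RSS Yamal gridcell data.
def r_squared(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov * cov / (va * vb)

uah = [0.1, -0.2, 0.4, 0.0, 0.3, -0.1]   # invented monthly anomalies
rss = [0.12, -0.15, 0.35, 0.05, 0.28, -0.05]
r2 = r_squared(uah, rss)
```

Values near 1 mean the two series move almost in lockstep, which is what you would expect since both are processed from the same instruments.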

I almost forgot: the R source code is below and should be turnkey.

```
plotTrend = function (x, st = 1978, en = c(2013, 12), y.pos = NA, x.pos = NA, main.t = "Untitled") {

### Get trend
trend = lm(window(x, st, en) ~ I(time(window(x, st, en))))

### Initialize variables
N = length(window(x, st, en))
I = seq(1:N) / frequency(x)
rmI = ssq(I - mean(I))

### Calculate sum of squared errors
SSE = ssq(residuals(trend)) / (N - 2)

### Calculate OLS standard errors
seOLS = sqrt(SSE / rmI) * 10

### Calculate two-tailed 95% CI
ci95 = seOLS * 1.964

### Get lag-1 ACF
resids = residuals(trend)
r1 = acf(resids, lag.max=1, plot=FALSE)$acf[[2]]

### Calculate CIs with Quenouille adjustment for autocorrelation
Q = (1 - r1) / (1 + r1)
ciQ = ci95 / sqrt(Q)

### Plot data
plot(window(x, st, en), main=main.t, ylab="Anomaly (Deg C)")
grid()

abline(h = 0, col = 2)
abline(trend, col = 4, lwd = 2)

### Annotate text
text(x = ifelse(is.na(x.pos), st, x.pos), y = ifelse(is.na(y.pos), max(x, na.rm = TRUE) - 0.2 * max(x, na.rm = TRUE), y.pos), paste("Slope: ", signif(trend$coef[2] * 10, 3), "+/-", signif(ciQ, 3), "Deg C/Decade\n(corrected for AR(1) serial correlation)\nLag-1 value:", signif(r1, 3)), cex = 0.8, col = 4, pos = 4)

###  Return trend (Deg C/decade) and CI
list(trend$coef[2] * 10, ciQ)
}

cal.Isig=function(dat, st=0, en=2020)
{
###  Get trend
fm=lm(window(dat, st, en)~I(time(window(dat, st, en))))

###  Initialize variables
N=length(window(dat, st, en))
I=seq(1:N)/frequency(dat)
rmI=ssq(I-mean(I))

###  Calculate sum of squared errors
SSE=ssq(residuals(fm))/(N-2)

###  Calculate OLS standard errors
seOLS=sqrt(SSE/rmI)

###  Calculate two-tailed 95% CI
ci95=seOLS*1.964

###  Get lag-1 ACF
resids=residuals(lm(window(dat, st, en)~seq(1:N)))
r1=acf(resids, lag.max=1, plot=FALSE)$acf[[2]]

###  Calculate CIs with Quenouille (Santer) adjustment for autocorrelation
Q=(1-r1)/(1+r1)
ciQ=ci95/sqrt(Q)

return(c(fm$coef[2],ciQ))
}

ssq=function(x) {sum(x^2)}

plt.avg=function(dat, st=1957, en=c(2006, 12), y.pos=NA, x.pos=NA, main.t="Untitled") {

### Get trend
fm=lm(window(dat, st, en)~I(time(window(dat, st, en))))

### Initialize variables
N=length(window(dat, st, en))
I=seq(1:N)/frequency(dat)
rmI=ssq(I-mean(I))

### Calculate sum of squared errors
SSE=ssq(residuals(fm))/(N-2)

### Calculate OLS standard errors
seOLS=sqrt(SSE/rmI)*10

### Calculate two-tailed 95% CI
ci95=seOLS*1.964

### Get lag-1 ACF
resids=residuals(lm(window(dat, st, en)~seq(1:N)))
r1=acf(resids, lag.max=1, plot=FALSE)$acf[[2]]

### Calculate CIs with Quenouille (Santer) adjustment for autocorrelation
Q=(1-r1)/(1+r1)
ciQ=ci95/sqrt(Q)

### Plot data
plot(window(dat, st, en), main=main.t, ylab="Anomaly (Deg C)")

abline(h=0, col=2); abline(fm, col=4, lwd=2)

### Annotate text
text(x=ifelse(is.na(x.pos), st, x.pos), y=ifelse(is.na(y.pos), max(dat,na.rm=TRUE)-0.2*max(dat,na.rm=TRUE), y.pos), paste("Slope: ", round(fm$coef[2]*10, 4), "+/-", round(ciQ, 4), "Deg C/Decade\n(corrected for AR(1) serial correlation)\nLag-1 value:", round(r1, 3)), cex=.8, col=4, pos=4)
}
library(maps)
library(mapproj)

plt.map=function(dat, coord=sat.coord,maxdat=10,mindat=-10,tit="None")
{
### Pick map colors (diverging palette, 51 steps)
col=colorRampPalette(c("darkblue","white","darkred"))
mapcolors <- col(51)
### Scale data so zero anomaly maps to the middle color, clamped to the palette range
colors=round(dat*51/(maxdat-mindat))+26
colors=pmax(1,pmin(51,colors))

### Set up Antarctic projection and lat/lon lines
par(mar=c(3, 2, 2, 2))
map("world", proj="mollweide", ylim=c(-90, 90),wrap=TRUE)
title(tit)
#map("world", proj="gilbert", ylim=c(-90, 90),wrap=TRUE)

### Plot points
xy=mapproject(coord[, 1], coord[, 2])
points(xy$x, xy$y, pch=15, cex=1, col=mapcolors[colors])

### Overlay map
map("world", proj="", ylim=c(-90, 90),wrap=TRUE, fill=FALSE, add=TRUE)
legend("right",legend=c(seq(mindat,maxdat,length.out=11)),fill=mapcolors[seq(1,51,length.out=11)])
}

######################################
# Setup lat/lon coordinates for UAH and RSS data
setupcoord=function()
{
lo=seq(-178.75,178.75,2.5)
lt=seq(-88.75,88.75,2.5)
lon=rep(lo,72)
lat=rep(lt,144)
dim(lat)=c(72,144)
lat=t(lat)
crd=c(lon,lat)
dim(crd)=c(72*144,2)
crd
}

#########################################
uahts=function()
{
uahgts=ts(array(NA,dim=c((2014-1978)*12,144*72)),start=1978,deltat=1/12)

for(jj in 1978:2013)
{
loc=paste("http://vortex.nsstc.uah.edu/data/msu/t2lt/tltmonamg.",jj,"_5.6",sep="")
### Read the fixed-width yearly file (16 values of width 5 per line) and
### flatten it row-wise into one numeric stream
UAHG=read.fwf(loc, widths=rep(5,16))
UAHG=as.numeric(t(as.matrix(UAHG)))

montleng=144*72+16
mont=length(UAHG)/montleng

for(i in 0: (mont-1))
{
strt=montleng*i
yr = UAHG[strt+2]
mon= UAHG[strt+3]
ind=(yr-1978)*12+mon
strt=strt+16
uahgts[ind,]=UAHG[strt:(strt+144*72-1)]
print(jj)
}
}

uahgts=uahgts/100
uahgts
}

###################################################
#ftp://ftp.ssmi.com/msu/data/uah_compatible_format/channel_tlt_tb_anom_1978.v03_3.txt

### Read the RSS series the same way; the files use a UAH-compatible
### fixed-width format, so this mirrors the UAH reader above
rssts=function()
{
rssgts=ts(array(NA,dim=c((2014-1978)*12,144*72)),start=1978,deltat=1/12)
wd=rep(5,16)
for(jj in 1978:2013)
{
loc=paste("ftp://ftp.ssmi.com/msu/data/uah_compatible_format/channel_tlt_tb_anom_",jj,".v03_3.txt",sep="")
RSSG=read.fwf(loc, widths=wd)
RSSG=as.numeric(t(as.matrix(RSSG)))

montleng=144*72+16
mont=length(RSSG)/montleng

for(i in 0:(mont-1))
{
strt=montleng*i
yr = RSSG[strt+2]
mon= RSSG[strt+3]
ind=(yr-1978)*12+mon
strt=strt+16
rssgts[ind,]=RSSG[strt:(strt+144*72-1)]
}
print(jj)
}
### scale as with the UAH files (assumes the same /100 encoding)
rssgts=rssgts/100
rssgts
}

###############################################

crd=setupcoord()

plt.map(uahgts[360,],crd,maxdat=5,mindat=-5,tit="UAH Temp Anomaly Dec 07")

slpuah=rep(NA,10368)


for(i in 1:10368)
{
if(sum(!is.na(uahgts[,i]))>1)
{
slpuah[i]=coef(lsfit(time(uahgts),uahgts[,i]))[2]
}

}

#plot yamal
# (66.52N, 65.38E, 250 m a.s.l.) -- mapproject takes longitude first
xy=mapproject(65.38, 66.52)
points(xy$x, xy$y, pch=16, cex=2, col="green")

#find yamal in matrix
##a=which(crd[,1]==66.25)
##b=which(crd[,2]==66.25)
## crd 9027 === yamalia

uahyamal=uahgts[,9027]

plotTrend(uahyamal,main.t="UAH Yamal Temperature Anomaly Trend")
### 'recon' is the Briffa 2013 polar.mxd reconstruction, loaded elsewhere
plotTrend(window(recon,start=1978))
### 'ff' is a filter/helper from previous posts on this blog
lines(ff(uahyamal),col="green")

```

1. timetochooseagain said:

The chart above of weighting profiles appears to label the MT weighting function as “tropospheric temperatures.” The LT weighting function has more weight lower in the atmosphere and is (theoretically at least) more tightly coupled to surface temperature. This chart:

From Remote Sensing Systems is a bit more comprehensive. I’ve found it useful for a lot of purposes.

At any rate, I have been working for a while now on an “empirical amplification factor” technique to estimate potential bias in surface temperature data, based on regression of detrended anomalies. The suggested amplification coefficient is then used to convert satellite anomalies into estimated surface temperature anomalies. It would be interesting to see whether there is an amplification factor in this particular region and, if so, what that might imply about the possibility of trend bias.

As an example, over the continental US the surprising result was that the surface varies more than the atmosphere, but basically in sync with it, and after accounting for this the trends were almost exactly the same. This was surprising to me because it was the *opposite* of the result I got with global data: the atmosphere varies more than the surface, and once this is accounted for, surface temperature increases appear exaggerated, at least if we take the satellite data seriously.
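The regression step described above can be sketched roughly as follows (in Python for illustration; the series are invented, and this is one reading of the described technique, not the commenter's actual code):

```python
# Toy sketch of the "empirical amplification factor" idea: regress
# detrended surface anomalies on detrended satellite anomalies, then
# rescale satellite anomalies by the fitted coefficient. All numbers
# are invented for illustration.
def detrend(y):
    n = len(y)
    t = list(range(n))
    mt, my = sum(t) / n, sum(y) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
             / sum((ti - mt) ** 2 for ti in t))
    return [yi - my - slope * (ti - mt) for ti, yi in zip(t, y)]

def amplification(surface, satellite):
    ds, dsat = detrend(surface), detrend(satellite)
    # OLS slope through the origin on the detrended anomalies
    return sum(a * b for a, b in zip(ds, dsat)) / sum(b * b for b in dsat)

sat = [0.0, 0.3, -0.1, 0.4, 0.1, 0.5, 0.2, 0.6]    # invented satellite anomalies
sfc = [0.0, 0.35, -0.15, 0.5, 0.1, 0.6, 0.25, 0.7]  # invented surface anomalies
amp = amplification(sfc, sat)
# Estimated surface-equivalent anomalies from the satellite series:
est = [amp * x for x in sat]
```

Detrending first matters: the coefficient is then driven by month-to-month covariation rather than by the trends themselves, so the rescaled satellite trend becomes an independent check on the surface trend.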

Here is the post where I explain this:

http://devoidofnulls.wordpress.com/2013/11/25/temperature-data-internally-consistent-beliefs-and-data-bias/