Predictions Of Global Mean Temperatures & IPCC Projections


By Girma Orssengo, B. Tech, MASc, PhD

orssengo@lycos.com

April 2010

The Intergovernmental Panel on Climate Change (IPCC) claims that human emission of CO2 causes catastrophic global warming. When such an extraordinary claim is made, everyone with a background in science ought to look at the data and verify whether the claim is justified. In this article, a mathematical model is developed that agrees with the observed Global Mean Temperature Anomaly (GMTA), and its prediction shows global cooling by about 0.42 deg C until 2030. Also, comparison of the observed increase in human emission of CO2 with the increase in GMTA during the 20th century shows no relationship between the two. As a result, the IPCC’s claim of climate catastrophe is not supported by the data.

Fossil fuels have allowed man to live his life as a proud human, but the IPCC asserts that their use causes catastrophic global warming. Fortunately, the IPCC’s global warming claim that “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios” [1] is not supported by observations, as shown in Figure 1, where the global mean temperature trend has been a plateau for the last decade.

Figure 1. Observed temperatures are less than all IPCC projections. The observed temperatures are from the Climatic Research Unit and the Hadley Centre [2].

Figure 1 also shows that the observed temperatures are even lower than the IPCC projection for emissions held constant at the 2000 level.

As a result, the statement we often hear from authorities like UN Secretary-General Ban Ki-moon that “climate change is accelerating at a much faster pace than was previously thought by scientists” [3] is incorrect.

Thanks to the release of private emails of climate scientists, we can now learn from their own words whether global warming “is accelerating at a much faster pace” or not. In an email dated 3-Jan-2009, Mike MacCracken wrote to Phil Jones and Chris Folland [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong. Otherwise, the Skeptics will be all over us–the world is really cooling, the models are no good, etc. And all this just as the US is about ready to get serious on the issue.

We all, and you all in particular, need to be prepared.

Similarly, in an email dated 24-Oct-2008, Mick Kelly wrote to Phil Jones [5]:

Just updated my global temperature trend graphic for a public talk and noted that the level has really been quite stable since 2000 or so and 2008 doesn’t look too hot.

Be awkward if we went through a early 1940s type swing!

The above statements from the Climategate emails conclusively prove that the phrase widely used by authorities in public, that global warming “is accelerating at a much faster pace”, is supported neither by climate scientists in private nor by the observed data.

Thanks are also due to the Climatic Research Unit (CRU) and the Hadley Centre for daring to publish global mean temperature data that is “quite stable since 2000”, which is contrary to IPCC projections of 0.2 deg C warming per decade. If the CRU had not done this, we would have been forced to swallow the extremely irrational concept that CO2, a plant food, i.e. a foundation of life, is a pollutant because it causes catastrophic global warming.

As IPCC’s “models are no good”, it is the objective of this article to develop a valid mathematical global mean temperature model based on observed temperature patterns.

Mathematical Model For The Global Mean Temperature Anomaly (GMTA) Based On Observed Temperature Patterns

The Global Mean Temperature Anomaly (GMTA) data from the Climatic Research Unit (CRU) and the Hadley Centre shown in Figure 2 will be used to develop the mathematical model. In this article, the observed GMTA data from the CRU are assumed to be valid.

Examination of Figure 2 shows that the globe is warming at a linear rate, as shown by the central least-squares trend line given by the equation

Linear anomaly in deg C = 0.0059*(Year-1880) - 0.52                           Equation 1
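Readers who want to reproduce this fit outside of a spreadsheet can do so in a few lines of Python. The sketch below is illustrative only: it assumes the HadCRUT3 annual anomalies have been saved to a two-column text file (year, anomaly in deg C); the file name is hypothetical.

    # Minimal sketch: least-squares linear trend of the annual anomalies,
    # in the form of Equation 1 (assumed file layout: year, anomaly).
    import numpy as np

    years, anomaly = np.loadtxt("hadcrut3_annual.txt", unpack=True)
    slope, intercept = np.polyfit(years - 1880, anomaly, 1)
    print(f"Linear anomaly = {slope:.4f}*(Year-1880) {intercept:+.2f}")
    # The fit quoted in Equation 1 has slope ~0.0059 deg C/year and intercept ~-0.52 deg C.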

Figure 2 also shows that superimposed on this linear anomaly line there is an oscillating anomaly that gives the Global Mean Temperature Anomaly (GMTA) the characteristics summarized in Table 1.

Table 1. Characteristics of the observed Global Mean Temperature Anomaly (GMTA) shown in Figure 2.

From 1880s to 1910s: end of warming, plateau at -0.2 deg C, then cooling trend
From 1910s to 1940s: end of cooling, plateau at -0.6 deg C, then warming trend
From 1940s to 1970s: end of warming, plateau at +0.1 deg C, then cooling trend
From 1970s to 2000s: end of cooling, plateau at -0.3 deg C, then warming trend
From 2000s to 2030s: end of warming, plateau at +0.5 deg C, then ? trend

A mathematical model can be developed that satisfies the requirements listed in Table 1. If the model gives a good approximation of the GMTA values at its turning points (plateaus) and of the GMTA trends between successive turning points, as summarized in Table 1, the model may be used for prediction.

Figure 2. Observed Global Yearly Mean Temperature Anomaly (GMTA) from the Climatic Research Unit (CRU) and the Hadley Centre [2].

For the oscillating anomaly, the cosine function meets the requirements listed in Table 1. From Figure 2, the amplitude of the oscillating anomaly is the vertical distance in deg C from the central linear anomaly line to either the top or the bottom parallel line, and it is about 0.3 deg C. From Figure 2, the oscillating anomaly was at its maximum in the 1880s, 1940s & 2000s; it was at its minimum in the 1910s and 1970s. The number of years between successive maxima (or minima) of the oscillating anomaly is the period of the cosine function, and it is about 1940-1880 = 1970-1910 = 60 years. With the amplitude of 0.3 deg C and the period of 60 years determined, the equation for the oscillating anomaly, for years starting from 1880, can be written as

Oscillating anomaly in deg C = 0.3*Cos(((Year-1880)/60)*2*3.1416)                            Equation 2

In the above equation, the factor 2*3.1416 expresses the argument of the cosine function in radians, as required for computation in Microsoft Excel. If the angle is required in degrees, replace 2*3.1416 with 360.
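The 0.3 deg C amplitude need not be read off the chart by eye. With the 60-year period fixed, the combined model is linear in its unknown coefficients, so they can be estimated by ordinary least squares. A hedged Python sketch, using the same assumed data file as above:

    # Minimal sketch: estimate trend and oscillation amplitude by linear
    # least squares, with the 60-year period held fixed.
    import numpy as np

    years, anomaly = np.loadtxt("hadcrut3_annual.txt", unpack=True)  # assumed file
    t = years - 1880
    w = 2 * np.pi / 60.0                                # fixed 60-year period
    # Design matrix: intercept, linear trend, cosine and sine terms.
    X = np.column_stack([np.ones_like(t), t, np.cos(w * t), np.sin(w * t)])
    coef, *_ = np.linalg.lstsq(X, anomaly, rcond=None)
    amplitude = np.hypot(coef[2], coef[3])
    print(f"trend = {coef[1]:.4f} deg C/yr, amplitude = {amplitude:.2f} deg C")
    # The article reads ~0.0059 deg C/yr and ~0.3 deg C off Figure 2.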

Combining the linear anomaly given by Equation 1 and the oscillating anomaly given by Equation 2 gives the equation for the Global Mean Temperature Anomaly (GMTA) in deg C for the years since 1880 as

GMTA = 0.0059*(Year-1880) - 0.52 + 0.3*Cos(((Year-1880)/60)*2*3.1416)               Equation 3
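Equation 3 is also easy to evaluate outside of Excel; here is a minimal Python version (math.cos takes radians, which is why the 2*pi factor appears, exactly as discussed above for Excel):

    import math

    def gmta(year):
        """Global Mean Temperature Anomaly (deg C) from Equation 3."""
        linear = 0.0059 * (year - 1880) - 0.52                          # Equation 1
        oscillating = 0.3 * math.cos((year - 1880) / 60 * 2 * math.pi)  # Equation 2
        return linear + oscillating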

The validity of this model may be verified by comparing its estimate with observed values at the GMTA turning points as summarized in Table 2.

Table 2. Comparison of the model with observations for GMTA in deg C at its turning points.

Turning point                 | Observed (Table 1) | Model (Equation 3)
Warming plateau for the 1880s | -0.2               | -0.22
Cooling plateau for the 1910s | -0.6               | -0.64
Warming plateau for the 1940s | +0.1               | +0.13
Cooling plateau for the 1970s | -0.3               | -0.29
Warming plateau for the 2000s | +0.5               | +0.48

Table 2 shows excellent agreement between observation and model for the GMTA values at all observed turning points.
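The Table 2 comparison can be reproduced with the gmta() function sketched above:

    # Model values at the observed turning points (compare with Table 2).
    for year, observed in [(1880, -0.2), (1910, -0.6), (1940, 0.1),
                           (1970, -0.3), (2000, 0.5)]:
        print(year, observed, round(gmta(year), 2))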

A graph of the GMTA model given by Equation 3 is shown in Figure 3, which includes the observed GMTA and the short-term IPCC projections for GMTA from 2000 to 2025. In addition to the verification shown in Table 2, Figure 3 shows good agreement for the GMTA trends throughout the observed temperature record, so the model may be used for prediction. As a result, Figure 3 includes GMTA predictions until 2100, where the year and the corresponding GMTA value are given in parentheses for all GMTA turning points.

As shown in Figure 3, discrepancies exist between observed and model GMTA values at the end of the 1890s, when the observed values were significantly warmer than the model pattern, and in the 1950s, when the observed values were significantly colder than the model pattern.

Figure 3. Comparison of observed Global Yearly Mean Temperature Anomaly (GMTA) with models.

From the model in Figure 3, during the observed temperature record, there were two global warming phases. The first was from 1910 to 1940 with a warming of 0.13+0.64=0.77 deg C in 30 years. The second was from 1970 to 2000 with a warming of 0.48+0.29=0.77 deg C in 30 years. Note that both warming phases have an identical increase in GMTA of 0.77 deg C in 30 years, which gives an average warming rate of (0.77/30)*10=0.26 deg C per decade.

From the model in Figure 3, during the observed temperature record, there were two global cooling phases. The first was from 1880 to 1910 with a cooling of 0.64-0.22=0.42 deg C in 30 years. The second was from 1940 to 1970 with a cooling of 0.13+0.29=0.42 deg C in 30 years. Note that both cooling phases have an identical decrease in GMTA of 0.42 deg C in 30 years, which gives an average cooling rate of (0.42/30)*10=0.14 deg C per decade.

The above results for the normal ranges of GMTA determined from the model can also be calculated using simple geometry in Figure 2. In this figure, almost all observed GMTA values are enveloped by the two parallel lines that are 0.6 deg C apart. Therefore, as a first approximation, the normal range of GMTA is 0.6 deg C. From Figure 2, the period for a global warming or cooling phase is about 30 years. Therefore, as a first approximation, the normal rate of global warming or cooling is (0.6/30)*10=0.2 deg C per decade.

The above approximation of 0.6 deg C for the normal range of GMTA should be refined by including the effect of the linear warming anomaly given by Equation 1 of 0.006 deg C per year, which is the slope of the two enveloping parallel lines in Figure 2. As the oscillating anomaly changes by 0.6 deg C in 30 years between its turning points, the linear anomaly increases by 0.006*30 = 0.18 deg C. Due to this persistent warming, instead of the GMTA increasing or decreasing by the same 0.6 deg C, it increases by 0.6+0.18 = 0.78 deg C during its warming phase and decreases by 0.6-0.18 = 0.42 deg C during its cooling phase. As a result, the refined normal ranges of GMTA are about 0.78 deg C in 30 years during its warming phase and 0.42 deg C in 30 years during its cooling phase, in close agreement with the 0.77 and 0.42 deg C obtained from the model in Figure 3.
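The same bookkeeping can be written out in a few lines (the 0.3 deg C amplitude and 0.006 deg C/year slope are read off Figure 2):

    swing = 2 * 0.3       # peak-to-trough of the oscillating anomaly, deg C
    drift = 0.006 * 30    # linear warming accumulated over one 30-year phase, deg C
    print(swing + drift)  # warming phase: 0.78 deg C (0.77 from the model)
    print(swing - drift)  # cooling phase: 0.42 deg C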

Correlation of Model and Observed Global Mean Temperature Anomaly (GMTA)

In Table 2, data points for only five years were used to verify that Equation 3 models the observed data. However, it is important to verify how well the observed GMTA is modeled for any year.


Figure 4. Correlation between model and observed GMTA values. The model GMTA values are from Equation 3, and the observed GMTA values are from the Climatic Research Unit, shown in Figure 2.

How well the observed data are modeled can be established from a scatter plot of the observed and model GMTA values, as shown in Figure 4. For example, for the year 1998, the observed GMTA was 0.53 deg C and the model GMTA was 0.47 deg C. In Figure 4, for the year 1998, the pair (0.47, 0.53) is plotted as a dot. In a similar manner, the paired model and observed GMTA values for all years from 1880 to 2009 are plotted in Figure 4.

Figure 4 shows a strong linear relationship (correlation coefficient r = 0.88) between the model and observed GMTA. With this high correlation coefficient, Figure 4 shows the important result that the observed GMTA can be modeled by the combination of linear and sinusoidal patterns given by Equation 3. The positive slope of the trend line indicates a positive relationship between model and observed GMTA: global cooling in the model indicates observed global cooling, and global warming in the model indicates observed global warming.
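The correlation itself is a one-line computation once the two series are in hand. A sketch, again assuming the hypothetical data file used earlier:

    # Minimal sketch: Pearson correlation between model and observed GMTA, 1880-2009.
    import numpy as np

    years, observed = np.loadtxt("hadcrut3_annual.txt", unpack=True)  # assumed file
    model = 0.0059 * (years - 1880) - 0.52 + 0.3 * np.cos((years - 1880) / 60 * 2 * np.pi)
    r = np.corrcoef(model, observed)[0, 1]
    print(f"r = {r:.2f}")  # the article reports r = 0.88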

Global Mean Temperature Prediction Calculations

The following patterns may be inferred from the graph of the Global Mean Temperature Anomaly (GMTA) model shown in Figure 3 for the data from the Climatic Research Unit and the Hadley Centre [2]:

1. Year 1880 was the start of a cooling phase and had a GMTA of –0.22 deg C.

2. During the global cooling phase, the GMTA decreases by 0.42 deg C in 30 years.

3. Global cooling and warming phases alternate with each other.

4. During the global warming phase, the GMTA increases by 0.77 deg C in 30 years.

The patterns in the list above are sufficient to estimate the GMTA values at all of its turning points since 1880.

For example, as year 1880 with GMTA of -0.22 deg C was the start of a cooling phase of 0.42 deg C in 30 years, the next GMTA turning point was near 1880+30 = 1910 with GMTA of -0.22-0.42 = -0.64 deg C. This GMTA value for 1910 is shown as (1910, -0.64) in Figure 3.

As year 1910 with GMTA of -0.64 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1910+30 = 1940 with GMTA of 0.77-0.64 = 0.13 deg C. This GMTA value for 1940 is shown as (1940, 0.13) in Figure 3.

As year 1940 with GMTA of 0.13 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point was near 1940+30 = 1970 with GMTA of 0.13-0.42 = -0.29 deg C. This GMTA value for 1970 is shown as (1970, -0.29) in Figure 3.

As year 1970 with GMTA of -0.29 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1970+30 = 2000 with GMTA of 0.77-0.29 = 0.48 deg C. This GMTA value for 2000 is shown as (2000, 0.48) in Figure 3.

As the GMTA values calculated above using the global temperature patterns listed at the beginning of this section give good approximations of the observed GMTA values at all GMTA turning points (1880, 1910, 1940, 1970 & 2000), it is reasonable to assume that the patterns may also be used for prediction.

Accordingly, as year 2000 with GMTA of 0.48 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point will be near 2000+30 = 2030 with GMTA of 0.48-0.42 = 0.06 deg C. This GMTA value for 2030 is shown as (2030, 0.06) in Figure 3.

In a similar manner, the GMTA values for the remaining GMTA turning points for this century can be calculated, and the results are shown in Figure 3.
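The hand calculations above amount to a simple alternating recurrence, which the following Python sketch runs out to 2090:

    # Alternate cooling (-0.42 deg C) and warming (+0.77 deg C) phases of 30 years,
    # starting from the 1880 turning point at -0.22 deg C.
    year, value = 1880, -0.22
    cooling = True
    while year < 2090:
        year += 30
        value += -0.42 if cooling else 0.77
        cooling = not cooling
        print(year, round(value, 2))
    # Output: 1910 -0.64, 1940 0.13, 1970 -0.29, 2000 0.48, 2030 0.06,
    # 2060 0.83, 2090 0.41 (the last value is the 2090 figure cited below).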

Figure 3 shows the striking result that, for the 20th century, the global warming from 1910 to 2000 was 0.48+0.64 = 1.12 deg C. In contrast, for the 21st century, the change in GMTA from 2000 to 2090 will be only 0.41-0.48 = -0.07 deg C. This means that there will be little change in the GMTA for the 21st century! Why?

Why Does The Same Model Give A Global Warming Of About 1 deg C For The 20th Century But Nearly None For The 21st Century?

According to the data shown in Figure 3, it is true that the global warming of the 20th century was unprecedented. As a result, it is true that the corresponding sea level rise, melting of sea ice, and the corresponding climate change in general were unprecedented. However, this was because the century started when the oscillating anomaly was at its minimum near 1910, with GMTA of -0.64 deg C, and ended when it was at its maximum near 2000, with GMTA of 0.48 deg C, giving a large global warming of 0.48+0.64 = 1.12 deg C. This large warming was due to the rare occurrence of two global warming phases of 0.77 deg C each but only one cooling phase of 0.42 deg C in the 20th century, giving a global warming of 2*0.77-0.42 = 1.12 deg C.

In contrast to the 20th century, from Figure 3, there will be nearly no change in GMTA in the 21st century. This is because the century started when the oscillating anomaly was at its maximum near 2000, with GMTA of 0.48 deg C, and will end when it is at its minimum near 2090, with GMTA of 0.41 deg C, giving a negligible change in GMTA of 0.41-0.48 = -0.07 deg C. This negligible change is due to the rare occurrence of two global cooling phases of 0.42 deg C each but only one warming phase of 0.77 deg C in the 21st century, giving 0.77-2*0.42 = -0.07 deg C. Note that this small change in GMTA for the 21st century is identical to that from 1880 to 1970, which makes the global warming of 0.77 deg C from 1970 to 2000 appear abnormally high.
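The century bookkeeping can be checked in a few lines, using the turning-point values from Figure 3:

    print(round(0.48 - (-0.64), 2))  # 1910 to 2000: 1.12 deg C of 20th-century warming
    print(round(2 * 0.77 - 0.42, 2)) # two warming phases, one cooling phase: 1.12 deg C
    print(round(0.41 - 0.48, 2))     # 2000 to 2090: -0.07 deg C change in the 21st century
    print(round(0.77 - 2 * 0.42, 2)) # one warming phase, two cooling phases: -0.07 deg C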

If a century were 120 years long, we would not have this conundrum of nearly 1 deg C of warming in the 20th century but nearly none in the next!

Ocean Current Cycles

One of the most important variables that affect global mean surface temperature is ocean current cycles. The rising of cold water from the bottom of the sea to its surface results in a colder global mean surface temperature; weakening of this movement results in a warmer global mean surface temperature. Various ocean cycles have been identified. The most relevant to the global mean temperature turning points is the 20-to-30-year ocean cycle called the Pacific Decadal Oscillation (PDO) [6]:

Several independent studies find evidence for just two full PDO cycles in the past century: “cool” PDO regimes prevailed from 1890-1924 and again from 1947-1976, while “warm” PDO regimes dominated from 1925-1946 and from 1977 through (at least) the mid-1990’s (Mantua et al. 1997, Minobe 1997).

These cool and warm PDO regimes correlate well with the cooling and warming phases of GMTA shown in Figure 3.

The model in Figure 3 predicts global cooling until 2030. This result is also supported by a shift in the PDO that occurred at the end of the last century, which is expected to result in global cooling until about 2030 [7].

Effect Of CO2 Emission On Global Mean Temperature

Examination of Figure 3 shows that the Global Mean Temperature Anomaly (GMTA) for 1940 of 0.13 deg C is greater than that for 1880 of –0.22 deg C. Also, the GMTA for 2000 of 0.48 deg C is greater than that for 1940 of 0.13 deg C. This means that the GMTA value, when the oscillating anomaly is at its maximum, increases in every new cycle. Is this global warming caused by human emission of CO2?

The data required to establish the effect of CO2 emission on global mean temperature already exist. The global mean temperature data are available from the Climatic Research Unit and the Hadley Centre, shown in Figure 3, and the CO2 emission data are available from the Carbon Dioxide Information Analysis Center [8]. For the period from 1880 to 1940, the average emission of CO2 was about 0.8 G-ton per year, and the increase in GMTA was 0.13+0.22 = 0.35 deg C. For the period from 1940 to 2000, the average emission of CO2 was about 4 G-ton per year, but the increase in GMTA was the same 0.48-0.13 = 0.35 deg C. This means that a 4/0.8 = 5-fold increase in CO2 emission had no effect on the increase in GMTA. This conclusively proves that the effect of 20th century human emission of CO2 on global mean temperature is nil.
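A sketch of this comparison in Python is below; it assumes the CDIAC record [8] has been reduced to a two-column text file of year and total annual emissions (the file name and layout are hypothetical, and the units follow the source file):

    # Minimal sketch: average emissions over the two 60-year periods.
    import numpy as np

    year, emissions = np.loadtxt("cdiac_global_emissions.txt", unpack=True)
    for lo, hi in [(1880, 1940), (1940, 2000)]:
        mask = (year >= lo) & (year < hi)
        print(lo, hi, emissions[mask].mean())
    # The article quotes averages of about 0.8 and about 4 G-ton for the two periods.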

Note that the increase in GMTA of 0.35 deg C from 1880 to 1940 (or from 1940 to 2000) over a 60-year period gives a warming rate of 0.35/60 = 0.0058 deg C per year, which is the slope of the linear anomaly given by Equation 1. As a result, the linear anomaly is not affected by CO2 emission. Obviously, as the oscillating anomaly is cyclic, it is not related to the 5-fold increase in human emission of CO2.

Figure 4, with its high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by the combination of linear and sinusoidal patterns given by Equation 3. This single GMTA pattern, valid in the period from 1880 to 1940, remained valid in the period from 1940 to 2000 after an approximately 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil.

Further evidence of the non-existent relationship between CO2 and GMTA is IPCC’s projection of a global warming of 0.2 deg C per decade while the observed GMTA trend was “quite stable since 2000” [5]. The evidence will be “unequivocal” if global cooling by about 0.42 deg C starts soon and continues until about 2030, as shown by the model in Figure 3. The IPCC projection for the GMTA for 2020 is 0.8 deg C, while the prediction from the model is 0.2 deg C, a large discrepancy of 0.6 deg C. If this global cooling is confirmed, it will then be time to bury the theory that CO2, a plant food, causes catastrophic global warming. Fortunately, we do not have to wait long for the burial: less than ten years. It will be cheering news!

IPCC Projections

According to the IPCC [1], “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios.”

The IPCC explains this projection as shown in Figure 5, where GMTA trend lines were drawn for four periods, all ending in 2005 and beginning in 1856, 1906, 1956 & 1981. These trend lines give an increasing warming rate: from a low of 0.045 deg C per decade for the red trend line for the period from 1856 to 2005, to 0.074 deg C per decade for the purple trend line for the period from 1906 to 2005, to 0.128 deg C per decade for the orange trend line for the period from 1956 to 2005, and to a maximum of 0.177 deg C per decade for the yellow trend line for the period from 1981 to 2005. The IPCC then concludes, “Note that for shorter recent periods, the slope is greater, indicating accelerated warming” [9].
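This construction is easy to reproduce: fit an ordinary least-squares trend over each window, with all windows ending in 2005 but starting at different years. A sketch, assuming an annual anomaly file covering at least 1856 to 2005 (layout as before; the file name is hypothetical):

    # Minimal sketch: OLS trends over windows that share an end year.
    import numpy as np

    years, anomaly = np.loadtxt("hadcrut_annual_1856_2005.txt", unpack=True)
    for start in (1856, 1906, 1956, 1981):
        mask = (years >= start) & (years <= 2005)
        slope = np.polyfit(years[mask], anomaly[mask], 1)[0]
        print(start, f"{slope * 10:.3f} deg C per decade")
    # Shorter windows that end on the warm 2000s plateau give steeper slopes,
    # which is the “accelerated warming” pattern of Figure 5.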

If this IPCC interpretation is correct, catastrophic global warming is imminent, and it is justified for the world to be gripped by fear of global warming. However, is IPCC’s “accelerated warming” conclusion shown in Figure 5 correct?

What the GMTA pattern in Figure 3 shows is that it has cooling and warming phases. As a result, in Figure 5, comparing the warming rate of a period that contains only a warming phase with that of a period containing a combination of warming and cooling phases will obviously show a higher warming rate for the first period. This is comparing apples to oranges.

Comparing apples to apples is to compare two periods that have the same number of cooling and/or warming phases.

Figure 5. Accelerated global warming according to the IPCC [9].

One example of comparing apples to apples is to compare one period that has one warming phase with another that also has one warming phase. From Figure 3, two 30-year periods that have only one warming phase are the periods from 1910 to 1940 and from 1970 to 2000. For the period from 1910 to 1940, the increase in GMTA was 0.13+0.64=0.77 deg C, giving a warming rate of (0.77/30)*10=0.26 deg C per decade. Similarly, for the period from 1970 to 2000, the increase in GMTA was 0.48+0.29=0.77 deg C, giving an identical warming rate of 0.26 deg C per decade. Therefore, there is no “accelerated warming” in the period from 1970 to 2000 compared to the period from 1910 to 1940.

A second example of comparing apples to apples is to compare one period that has one cooling and one warming phase with another that also has one cooling and one warming phase. From Figure 3, two 60-year periods that each have one cooling and one warming phase are the periods from 1880 to 1940 and from 1940 to 2000. For the period from 1880 to 1940, the increase in GMTA was 0.13+0.22 = 0.35 deg C, giving a warming rate of (0.35/60)*10 = 0.06 deg C per decade. Similarly, for the period from 1940 to 2000, the increase in GMTA was 0.48-0.13 = 0.35 deg C, giving an identical warming rate of 0.06 deg C per decade. Therefore, there is no “accelerated warming” in the period from 1940 to 2000 compared to the period from 1880 to 1940.

From the above analysis, IPCC’s conclusion of “accelerated warming” is incorrect, and its graph shown in Figure 5 is an incorrect interpretation of the data.

Based on the observed GMTA pattern shown in Figure 3, a global warming phase lasts for 30 years and is followed by global cooling. As a result, the recent global warming phase that started in the 1970s ended in the 2000s, as shown by the current GMTA plateau, and global cooling should follow. Therefore, IPCC’s projection of global warming of 0.2 deg C per decade for the next two decades is incorrect. Also, the divergence between IPCC projections and observed GMTA values has been “discernible” since 2005, as shown in Figure 3.

According to Occam’s Razor, given a choice between two explanations, choose the simpler one, the one that requires the fewest assumptions. Instead of applying Occam’s Razor and assuming the cause of the GMTA turning points to be natural, the IPCC assumed the cause to be man-made [9]:

From about 1940 to 1970 the increasing industrialisation following World War II increased pollution in the Northern Hemisphere, contributing to cooling, and increases in carbon dioxide and other greenhouse gases dominate the observed warming after the mid-1970s.

Like in the 1880s & 1910s, what if the causes of the GMTA turning points in the 1940s and 1970s were also natural?

Figure 4, with its high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by the combination of linear and sinusoidal patterns given by Equation 3. This single GMTA pattern, valid in the period from 1880 to 1940, remained valid in the period from 1940 to 2000 after an approximately 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil. Also, IPCC’s conclusion of “accelerated warming” shown in Figure 5 is incorrect.

What is the cause of the GMTA turning point from warming to plateau in the 2000s? Here is the suggestion by Mike MacCracken [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong.

According to the IPCC and the above suggestion, the 1940 GMTA turning point from global warming to cooling was caused by sulfates, the 1970 GMTA turning point from cooling to warming was caused by carbon dioxide, and the 2000 GMTA turning point from warming to plateau was caused by sulfates. It is interesting to note that sulfate and carbon dioxide between them gave the globe alternating 30-year cooling and warming phases from 1940 to 2000. This is just absurd.

Instead of saying “Be awkward if we went through a early 1940s type swing!” in private but that global warming “is accelerating at a much faster pace” in public, please release the world from the fear of climate catastrophe from the use of fossil fuels, as this catastrophe is not supported by your own data. It is extremely callous not to do so.

Is the theory that “human emission of CO2 causes catastrophic global warming” one of the greatest blunders of “science”, or something worse? We will find the unambiguous answer within the next ten years. I hope they do not succeed in calling the plant food a pollutant and taxing us before then.

For any criticism, please contact me at orssengo@lycos.com

Girma J Orssengo

Bachelor of Technology in Mechanical Engineering, University of Calicut, Calicut, India

Master of Applied Science, University of British Columbia, Vancouver, Canada

Doctor of Philosophy, University of New South Wales, Sydney, Australia

REFERENCES

[1] IPCC Fourth Assessment Report: Climate Change 2007

“a warming of about 0.2°C per decade is projected”

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/spmsspm-projections-of.html

[2] Observed Global Mean Surface Temperatures from the Climatic Research Unit and the Hadley Centre.

http://www.woodfortrees.org/plot/hadcrut3vgl/compress:12/from:1880/plot/hadcrut3vgl/from:1880/trend/plot/hadcrut3vgl/from:1880/trend/offset:0.3/plot/hadcrut3vgl/from:1880/trend/offset:-0.3

[3] Climate Change Science Compendium 2009

“is accelerating at a much faster pace”

http://www.unep.org/pdf/ccScienceCompendium2009/cc_ScienceCompendium2009_full_en.pdf

[4] Climategate Email from Mike MacCracken to Phil Jones and Chris Folland

“that explanation is wearing thin”

http://www.eastangliaemails.com/emails.php?eid=947&filename=1231166089.txt

[5] Climategate Email from Mick Kelly to Phil Jones

“Be awkward if we went through a early 1940s type swing!”

http://www.eastangliaemails.com/emails.php?eid=927&filename=1225026120.txt

[6] The Pacific Decadal Oscillation (PDO)

http://jisao.washington.edu/pdo/

[7] Pacific Ocean Showing Signs of Major Shifts in the Climate

http://www.nytimes.com/library/national/science/012000sci-environ-climate.html

[8] Carbon Dioxide Information Analysis Center

Global CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring

http://cdiac.ornl.gov/ftp/ndp030/global.1751_2006.ems

[9] Climate Change 2007: Working Group I: The Physical Science Basis

How are Temperatures on Earth Changing?

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faq-3-1.html


85 thoughts on “Predictions Of Global Mean Temperatures & IPCC Projections”

  1. A projection of an over-fit empirical model into the future is almost certainly useless. It’s equivalent to fitting a high order polynomial to the S&P 500. Don’t bet the farm that the projected future behavior will in any way relate to the model. ENSO, PDO, etc. are not clockwork sine waves.

  2. DeWitt Payne,

    I agree 100%. There are pseudo-cyclical components like ENSO, AMO, PDO, etc. that for sure contribute to the global average trend, but there is no way to quantify their contribution accurately, nor any way to project what these will be in the future. To ignore the possibility of aerosol contributions over the past 100 years and into the future (very poorly defined of course, but likely real at some level) adds even more doubt to this kind of projection.

    Have AMO and PDO contributions led to an overestimate of warming caused by GHG’s in the late 20th early 21st centuries? Could very well be, but the line of argument presented here is not very convincing.

  3. It is not difficult to determine the contribution of the ENSO and the AMO to the global average temperature trend. Simple regression provides pretty reasonable results.

[The PDO is just an extension of the ENSO, and more accurate contributions to global temperature can be obtained from the ENSO alone.]

    Hadcrut3 modelled back to 1871 on a monthly basis.

    The error term left over is mostly white noise and mostly within 0.2C.

    In terms of incorporating Aerosols, this is dangerous to include because we have no solid data to rely on and mainly because including “estimated” figures results in fudged climate reconstructions.

    GISS Model E Aerosols forcing versus GHG forcing from 1880 to 2003 is ridiculous.

Aerosols completely (or more than) offset all of the GHG forcing until 1970, and then, after that, there is very little net GHG forcing to provide the warming post-1970. Throw in land use and GISS Model E has the 20th century warming resulting from natural forcing only. They don’t even know they did this.

#1 – I think the better question is: is it any worse than the GCMs, which fit their models by adjusting aerosol/cloud forcings and ocean coupling?

    i.e. can this overfitted model refute the claim that ‘nothing other than CO2 can explain the warming’?

  5. oh my god. you really reposted what Girma wrote?

    this is insane.

    there is no physical basis for this model. he does curve fitting, then finds high correlation. (oh really?)

The linear fit added to the sine wave is a good fit over the quoted time period. There is probably a reasonable chance the compound curve fit will still do fairly well for part of the next cycle, but all bets are off as time progresses. This analysis is not based on physics but just curve fitting. The fit showed trends over the last 150 years better than the IPCC models, but this is not a model. The overall temperature level may tend either up more or start down more, probably due to a combination of solar activity, volcanoes, long-term ocean cycles, and other factors we do not yet understand well. The longer-term ice core data indicates that we are likely approaching the end of the Holocene. Fortunately, the ends of interglacial periods are more gradual than the starts, so I don’t think it is going to be a big problem too rapidly, but this is the issue we should pay more attention to, as cooling is much more of a problem than slight warming.

  7. Sod,
    The IPCC and folks like you are far more into curve fitting and jumping to wrong conclusions than skeptics. One major point in the article was to show how the concept of accelerating warming stated by CAGW supporters was a bad case of misunderstanding curve fitting. I do not agree that we can project a curve fit out very far, but look at your own positions before making a big issue.

  8. DeWitt Payne said, “A projection of an over-fit empirical model into the future is almost certainly useless.”

    Not quite useless.

    Failure of the temperatures to continue to correlate with atmospheric CO2 shows that the earlier correlation did not arise from a “cause-and-effect” relationship.

    That finding is important because it falsifies a basic assumption of CO2-induced global warming.

  9. Oliver #9:

    Failure of the temperatures to continue to correlate with atmospheric CO2 shows that the earlier correlation did not arise from a “cause-and-effect” relationship.

    Or because the actual prediction is not a simple linear relationship to begin with.

    Bill Illis is spot on when he comments that all of the warming prior to 1970 is associated (in the models) to natural forcings. It is my belief a substantial percentage (as much as 2/3s) of the post 1970 warming is natural as well.

Given the enormous uncertainties in other forcings, specifically AGW sulfates (which tend to cool the Earth) and what may be so far a very weak net anthropogenic effect, it is “too soon” to be trying to extract climate sensitivity from the measurements. Meaning there is an enormous uncertainty in the result.

    That said, I think the models are being constrained to eliminate some of the really high climate sensitivities already. (I think anything over 4.0°C/doubling of CO2 is ruled out at the 5% level).

    But given small effects and large natural variation, overfitting is almost inevitable in these circumstances.

An interesting analysis, which shows that there have been only five climate shifts during the last 150 years, and, what seems astonishing, two of the shifts are closely related to the two world wars, WWI & WWII, which saw a lot of naval activity all over the Northern Hemisphere; references here: http://www.what-is-climate.com

  11. Is it just me, or did others stop reading when ‘CO2, a plant food, i.e. foundation of life’ came up? Somewhat irrelevant?

  12. Carrick,

    “I think the models are being constrained to eliminate some of the really high climate sensitivities already. (I think anything over 4.0°C/doubling of CO2 is ruled out at the 5% level).”

    From where do you get this information?

  13. I agree with the critics of the ‘fitting of a curve to the data’ feel of this work. Does it have any predictive power? Time will tell. Is it any worse than the mega million dollar computer games which Gavin et al play? It’s probably got as much chance of being right – if not more, in my humble opinion.

    Cyclical patterns are caused by things going round other things in regular periods of time. Like a moon around a planet, a planet around a sun, a stellar system around a galactic centre…

  14. Bill Illis,
    “GISS Model E Aerosols forcing versus GHG forcing from 1880 to 2003 is ridiculous.”

    Agreed. The aerosol effects used by each modeling group are whatever makes each model yield hindcasts that are consistent with the instrument temperature record. Nothing more than a somewhat comical arm-waving kludge. High climate sensitivity (Hansen’s sacred 3 C per doubling) can only be correct if there are huge aerosol effects and huge ocean heat accumulations. So we can’t expect GISS to make any changes in their estimates of these any time soon.

    However, almost certainly there is some effect from aerosols, especially in the period before the dramatic reductions in sulfur emissions from coal fired plants, due to European and US regulations (done mostly to reduce ‘acid rain’ downwind of coal fired plants). Since we really have very little idea what the aerosol effect was, is, or what it may be in the future, it just adds uncertainty to any analysis based on a regression against temperatures (as you and I have both done in the past!). But note that this uncertainty due to aerosols goes only in the upward direction. If you do the regression without any included aerosol effects, but there were in fact some aerosol effects, then the regression will always yield a diagnosed climate sensitivity to GHG forcing that is lower than correct. How much lower than correct, we do not know. A regression that does not include aerosol effects yields the lower bound for climate sensitivity.

The images for Figures 2, 3 and 4 cannot be seen!

    Mathematicians seek out patterns. Based on the observed GMTA data, the pattern is a combination of linear and sinusoidal functions. As this pattern was valid for the last 129 years, it is reasonable to assume it will be valid for the next 20 years.

    Otherwise, how are you going to tell me whether our globe is going to have further warming or cooling in the coming 20 years?

    If it worked for the last 129 years, why should it break for the next 10, 20 years?

  16. One question:

    there is no physical basis for this model. he does curve fitting, then finds high correlation. (oh really?)

High correlation when fitting a curve which does not have an acceleration doesn’t mean there is no acceleration?

  17. The fit showed trends over the last 150 years better than the IPCC models, but this is not a model

    that is what we expect. it has been FITTED to that curve.

    Bill Illis is spot on when he comments that all of the warming prior to 1970 is associated (in the models) to natural forcings.

Illis is wrong.

    I agree with the critics of the ‘fitting of a curve to the data’ feel of this work. Does it have any predictive power? Time will tell.

    so if i throw out 500 predictions, time will tell?
    this is a stupid approach!

    Mathematicians seek out patterns. Based on the observed GMTA data, the pattern is a combination of linear and sinusoidal functions. As this pattern was valid for the last 129 years, it is reasonable to assume it will be valid for the next 20 years.

    Girma, you know about as much about math as our chickens do.

you look at two cycles and claim that this is a sine function. with the same argument, we could come to any conclusion.

    anyone who wants to waste an afternoon and have some fun, please take a look at the about 2000 comments in this deltoid discussion:

    http://scienceblogs.com/deltoid/2009/08/matthew_england_challenges_the.php

    can anyone explain to me, how Girma got that PhD?

  18. just to put this post by Girma into some context:

    this was his last piece of “analysis”:

    The energy expended in doing physical work and that keeps us warm are used up and there is no way to get it back. So matter from food is finally converted to energy. As a result, the mass of the earth must always decrease.

    Posted by: Girma | September 27, 2009 11:24 AM

    http://scienceblogs.com/deltoid/2009/08/matthew_england_challenges_the.php#comment-1965891

    please stop eating guys. you are destroying the planet.

  19. sod

    How do IPCC, with all the world scientists, come up with the “accelerated warming” in Figure 5?

    Look at the divergence of the IPCC’s prediction of 0.2 deg C per decade compared to the observed plateau. How did they get their PhD?

    At least, my result agrees with the current plateau.

  20. Sod:

    Bill Illis is spot on when he comments that all of the warming prior to 1970 is associated (in the models) to natural forcings.

    Illis, is wrong.

    That wouldn’t be Bill in this case. It would be all of the global climate models. It is apparent you were unclear about what was actually being discussed because your graphic was completely unrelated to this question.

    GISS MODEL E Forcings

    Note anthropogenic CO2 and sulfates nearly cancel till the mid 1970s. I’m not disputing that the Earth is warming, or that humans are playing a role.

    As to the rest, you can keep the infantile attacks on another blog where you get cheered for your juvenile behavior. If you don’t mind.

  21. Sod,

    Do yourself (and everyone else) a favor and stop the ad-homs. You really give people a terrible impression of yourself…. or is that what you want to do?

  22. How do IPCC, with all the world scientists, come up with the “accelerated warming” in Figure 5?

Girma, i could do exactly the same thing that you did. i would fit a sine wave and an exponential function to the curve. it would be extremely similar up till today. but future temperatures would be very different from your “analysis”.

    or we could fit a 5+ order polynomial. again, very good fit, absolutely insane prediction.

    what you do is rubbish.

    Look at the divergence of the IPCC’s prediction of 0.2 deg C per decade compared to the observed plateau.

    i love the plateau stuff. i hope you figured the current high temperatures into your “model”.

you do understand that you need to allow similar errors for both models? your “model” has temperatures pretty far from your curve at several places. you need to show that the current “plateau” has a significantly bigger difference from IPCC projections than yours. good luck.

  23. Note anthropogenic CO2 and sulfates nearly cancel till the mid 1970s. I’m not disputing that the Earth is warming, or that humans are playing a role.

    Carrick, you do not know what you are talking about. the claim attributed to Illis was this:

    Bill Illis is spot on when he comments that all of the warming prior to 1970 is associated (in the models) to natural forcings.

the idea is about a distinction between natural and anthropogenic forcings. that is exactly what the graph i showed above shows:


(blue bars are temperatures modeled from the natural forcings, red from the anthropogenic forcings)

    you make a distinction between greenhouse gas forcings and other forcings. this is something completely different!

    GHG forcings can be natural. other forcings (like land use or aerosols) can be natural or NOT natural.

    That wouldn’t be Bill in this case. It would be all of the global climate models. It is apparent you were unclear about what was actually being discussed because your graphic was completely unrelated to this question.

    GISS MODEL E Forcings

    my graphic was highly relevant. yours was not.

    ———————-

    Sod,

    Do yourself (and everyone else) a favor and stop the ad-homs. You really give people a terrible impression of yourself…. or is that what you want to do?

    you are joining the club of people, who do not know what ad hom is. i did not use any ad hom above.

why not simply ask me “stop the insults”, but use a term that you don’t understand?

    ps: as i found the “mass of the earth must always decrease” link after my first post, i really have to take back that insult. i hereby formally ask all chickens to forgive my outburst above.

22-Carrick, I have heard you say now, many, MANY times, “the aerosol and GHG forcings cancel almost entirely before ~1970” (paraphrased a bit). I think you should know that, whether that is what the GISS model ASSUMES or not, there is simply no justification for asserting it as true! The time history and magnitude of aerosol forcing are completely unknown before the seventies and barely known better up to now. That’s simply because no measurements exist. That modelers take what aerosol forcing they need to cancel their excess warming, then say their models “match” history, is ludicrous, really. If it does, it is almost certainly an accident (or confirmation bias). At most, one of the many different models and many different forcing histories that multiple groups use can be right, so that means that at least the others must all be wrong. The probability that GISS happens to have the forcings right is then, I would guess, less than 5% (there are at least twenty different models).

  25. I’ve noticed that all of the surface temperature charts from 1880 onward have this upward trend in spite of the decadal oscillations. Does this (if it is) “trend” have the potential of being consistent with what might have occurred at the start of the MWP?

  26. Sod,
    I think my understanding of “ad hom” is reasonably accurate, even if I avoid the use of this tactic. According to http://www.wisegeek.com/what-is-an-ad-hom-attack.htm:

    “In the modern sense of an ad hom attack, the issue at hand may be completely irrelevant. Many veterans of message board flame wars consider any personal attacks against a poster to be ad hom in nature.”

    Do you suggest that your attacks were not personal in nature and irrelevant to the subject? (eg. “can anyone explain to me, how Girma got that PhD?”) However you wish to classify them (personal insults, ad homs, or any other term you think is “more correct”), you routinely make personal attacks on other people in your posts, simply because you disagree with what they say. This avoids addressing the substance of the disagreement, and IMO makes most people think very poorly of you. Of course, I have no way of knowing if you just want people to think poorly of you.

  27. See my comments on WUWT. This is largely a useless exercise. You are merely fitting a model to the data. That model is not physical. It does a horrible hindcast back to 1850 and would be even worse going back to the MWP. The model has no physical realization. If you want to see how to build a simple model ( that is somewhat physical) then have a look at Lucia’s Lumped parameter model.

    WRT the people who want to compare this to a GCM, you miss the point entirely. A GCM makes a huge number of predictions. It does this by executing physics. It makes a prediction of SST in time and space, of air temp in time and space, ice, clouds, oceanic cycles, precipitation. The results, while uncertain, cannot be matched by a simple curve fit. period.
    the simple curve fit is not “understanding” or useful knowledge. It is numerology.

This can be illustrated by asking the author a simple question: what does his model predict for temperature in the troposphere?
    right.

I suppose though that people enjoy seeing a 60 year cycle in the data and they comfort themselves that they have discovered a “natural cycle”. well, we do see cycles in the data; the question is can you give a physics based explanation of them that allows you to predict them. Until you do that you haven’t “understood” anything. you’ve merely noted what you regard as a regularity that needs explaining.

  28. Andrew:

    22-Carrick, I have heard you say now, many, MANY times “the aerosol and GHG forcings cancel almost entirely before ~1970″ (paraphrased a bit. I think you should know that, whether that is what the GISS model ASSUMES or not, there is simply no justification for asserting it as true!

    Of course I agree with you here. I am aware of the limitations of the models.

    What I have said is “this is what the models say.” The models may of course be wrong, but what are our alternatives?

    If we are to test a science or a theory, we must go with what the science says when testing it. We can’t make up our own versions of what we think the “science should say”, test that, then say we’ve tested the original theory. You can “make up your own version” and test that and say that your version works better or worse, but you still need to differentiate Andrew’s theory from the base theory.

    The probability that GISS happens to have the forcings right is then, I would guess, less than 5% (there are at least twenty different models).

First, “having the forcings right” isn’t even a meaningful statement. “Right” in what sense?

    Right to within 5% of the actual values, 1% of the actual value, the exact values?

    Secondly, within the experimental constraints, their forcings (generally) give output that are consistent with the measurements. In the sense that I as a physicist would use “right”, I would mean it in the sense of “consistent with available data.” That is certainly the case (with a caveat that prior to 1950 the quality of the data is poor enough that consistency with the data itself has an uncertain interpretation).

    I’ve looked at several (hardly all) models, they all produce the same result, namely to explain the warming since 1980 you have to match the sulfate with the CO2 anthropogenic forcings prior to 1980.

    If you are aware of a model for which that is not true, this would be interesting too.

I’ll take the number 5% as “pulled from the air”. At least their forcings are based on some physical constraints. They have a lot higher chance of being right than you would, were you to pull your own set of forcings from the air, whatever that confidence level really is.

  29. Girma

    “Mathematicians seek out patterns. Based on the observed GMTA data, the pattern is a combination of linear and sinusoidal functions. As this pattern was valid for the last 129 years, it is reasonable to assume it will be valid for the next 20 years.

    Otherwise, how are you going to tell me whether our globe is going to have further warming or cooling in the coming 20 years?

    If it worked for the last 129 years, why should it break for the next 10, 20 years?”

    1. What mathematicians do has nothing to do with the question. The question is how do physicists create models of natural processes. CURVE FITTING is very low on the totem pole. Numerologists do what you did. it has no explanatory value.

    2. The Validity of your pattern.
    A. whats your measure of validity
    B. the validity is meaningless as your model cannot hindcast
C. If a major volcano happens tomorrow, what does your model predict? nothing

3. It worked for 129 years so it’s reasonable to assume it will work for the next 20?

    A. reconstruct your model using only data from 1850 to 1940. see how well you do.
    B. use your model to forecast temperatures 1000 years from now. see how well you do.
    C. use it to hindcast.
    D. what does your model predict if TSI falls

The thing is your model hasn’t been shown to WORK. It’s been shown to fit 129 years of data. That’s not “working”,
that’s a pet trick. Build your model with half the data and see how it works on the other half. That’s a measure
of “working”, and one suitable for numerology.

4. How are you going to tell me whether our globe is going to have further warming or cooling in the coming 20 years?

    That’s easy. I would take the best physical models we have ( GCMs) and then I would do this:

    A. make an estimate of the concentration of GHGs for the next 20 years:
    1. Low estimate: values stay the same as today:
    2. High estimate: values increase according to historical trends+ any acceleration in the past 10 years.

    crunch those cases: look at the spread:
    lemme guess:

if it is X today then in 20 years it will be… X+0.3 ±0.3 C

    wont be cooler unless you get big ass changes in aerosols ( volcano etc)

    GCM’s are not perfect, but they are the best tool today, flawed, but somewhat useful.

S Mosher, surprised you didn’t point out that GCMs could have a bias that explained the current 10 years, or past cycles, and still be useful. Perhaps I missed it.

    More useful, if we can get to determine the bias, if any, but useful none the less.

  31. Steve F,

Sod, like others, is engaging in the etymological fallacy. The term ad hom has widened in reference over the years
and so refers to something broader than when it entered the language. This is one of the ways terms evolve.
Today we take “ad hom” to be any personal criticism leveled against a person you are debating with.

‘classically’ an ad hom has two steps:

    1. your argument is wrong
    2. because you are a such and such

    http://www.nizkor.org/features/fallacies/ad-hominem.html

But today we reckon any attack on the person as an ad hom: “Mann misused PCA; where did he get his PhD, online?”

    so anytime anyone accuses you of not understanding an ad hom, just accuse them of being a moron who
    just committed the etymological fallacy

    http://en.wikipedia.org/wiki/Etymological_fallacy

  32. RE33.

    I think there may be a fundamental reason why a GCM cannot capture short term cycles, except by luck.

If you think about it, a GCM is spun up to a steady state with no drift (<2% drift), so if you initialize at say 1850
you initialize with a steady-state SST. not sure what the state of ocean current vectors would be, but I’m assuming zero.
hard to get your first cycle right from that state.. but I’m just thinking conceptually about how the models work.

    Generally speaking some of the models do better than others. best tool we have even if it is pounding nails with a rock

    carrick?

  33. Steve Mosher,
    “just accuse them of being a moron who
    just committed the etymological fallacy ”

    Nah… that would be an ad hom attack.

Steve Mosher,
    “best tool we have even if it is pounding nails with a rock ”

I think the rock in this case is not really the GCM’s but the “optimization” of the models with whatever level of aerosol effect and ocean heat accumulation you need to get a reasonable hindcast. As practiced, this turns the GCM’s into some bizarre combination of a physical model and curve fitting, adjusted to match the historical data.

So I don’t really think they are much of a useful tool at all in terms of accurate temperature predictions. Heck, a curve fit based on combined radiative forcings (immediate and lagged by 12 years), AMO and Nino3.4 for 1880 to 1975 makes a very accurate prediction of 1976 to 2010 global average temperatures, including the 1998 El Nino peak and the relatively level temperatures after 2001.

  35. @Steven Mosher

    > The term ad hom has widened in reference over the years and so refers to something broader than when it entered the language.

    No, it has not. Just because you misuse it, and everyone in your particular social circle misuses it, does not make you correct. You no doubt disagree with this – in which case I suggest you argue your case on the relevant wikipedia talk section, under the heading “Common misconceptions about ad hominem”, and see how far you get.

    Back to the current subject… I think that Deltoid thread is required reading TBH. In it you will see the embryo of this “theory” (which was further developed in a few subsequent threads before Girma was banned) with all of the objections that have so far been levelled against it here. Has this deterred Girma? Not in the slightest.

  36. Dave,

    The meanings of expressions really do change. There is never a fixed meaning. Consider: Rule of thumb. He struck out with her. Jump in the deep end. etc.

  37. Steven Mosher,

    Question: Is the globe going to have further warming or cooling in the coming 10, 20 years? How will you find out that? What is available now?

  38. Stephen Mosher, good discussion as always.

    It is my opinion that the models don’t do a good job of explaining short-period climate fluctuations. I believe this opinion is supported by Lucia’s Student’s t-test analysis. Briefly the Student’s t-test is being used by Lucia to determine if the distribution from a given model (or set of models) belongs to the same distribution as the observed data.

    It doesn’t require the stronger statement that people sometimes suggest that it is testing the particular observed pattern of temperature fluctuations. (It is sometimes framed as impossible for a model to reproduce a particular observed ENSO. I don’t know why this is true, in fact, I suspect it is NOT true, and that future “good” models will be characterized by how well they nail the observed pattern of atmospheric-ocean oscillations.)

    I think this failure of the models is tied into their limitations, which include their finite resolution (a 250 km grid means you’ll have trouble resolving features smaller than 10x that = 2500 km), but may also include some of the other approximations involved (e.g., one approximation that may play a role is cloud physics).

    So, if I’m right, the models aren’t generating “parallel” Earths in their simulations, they’re generating oscillatory patterns that have nothing to do with real climate at all (at least on these time scales). What this would mean in practice is they aren’t very good at predicting the distribution associated with short-period temperature trends.

    Hope this helps.

  39. Mosher:

    Totally agree that the curve fitting abortion presented above should be flushed.

    However

    You are all wrong with your best-tool argument. First of all, the hammer was invented before the nail. Look it up. Second, a rock will beat a nail home into wood. It’s not efficient, but it produces a 100% accurate final result.

    GCMs cannot simulate the physics of decadal ocean cycles and feedback loops. They are now running off the rails. Where is the “missing” heat? I’d say the GCMs currently drive the nail about half-way, but in no way do they hold the two boards together. I’m not walking on that balcony. In this way, a half-assed “tool” is worse than nothing. It builds a house of cards that the big bad wolf blows down, and eats your bacon.

    How about another analogy:

    You have cancer symptoms. The best cancer test (our “best” tool) has a 50% chance of a false positive. The cure has a 50% chance of killing you. The cancer is fatal in 25% of cases after 20 years. (Hey, this kinda sounds like prostate cancer.) Are you going to rush in for surgery after getting a positive test result? How does one apply the precautionary principle in such a case?

  40. @Steve Fitzpatrick

    I’m well aware that language changes, and that dictionaries etc. are just a representation of current usage. However, as far as I am aware, Ad Hominem has a specific meaning that is *different* from the one expressed above – in fact, the usage here is explicitly listed as a common misinterpretation and incorrect usage.

    I’m sure that with continued incorrect usage the new meaning will eventually have to be accepted as an alternative one. I don’t personally think that this is something to actively strive for, and until the day comes that more official bodies than this blog recognise your usage as “acceptable/alternative/colloquial”, I will continue to point out that it is incorrect.

    Indeed, purposely misusing a term in an active attempt to encourage a wider change to its accepted meaning, rather than simply accepting that the term was initially misused, strikes me as a little pig-headed.

  41. 31: 5% is the inverse of 20 (1/20 = 5%). There are about 20 models used by the IPCC, and they would all need different forcing histories to match the temperature record. I am using a very strict standard here of “exactly right”, but I am also making the conservative assumption that one of the forcing histories used by one modeling group must be correct.

    With regard to whether we should be testing “the science”: I agree that we shouldn’t make up what the AGW claims are; we should deal with the claims that are actually made. But GISS’s forcings are just ONE of a set of claims with large uncertainty that is embraced by the modelers. Why only take GISS as representing the official story? Maybe some other modeling group would object that their forcing history is the one people should believe and the one that supports “the science” of AGW.

  42. Do you suggest that your attacks were not personal in nature, or that they were relevant to the subject? (e.g. “can anyone explain to me, how Girma got that PhD?”) However you wish to classify them (personal insults, ad homs, or any other term you think is “more correct”), you routinely make personal attacks on other people in your posts, simply because you disagree with what they say.

    i can't stop you from using a term the wrong way.
    my question about the Girma PhD is a real one. how can a guy who thinks that we are literally eating up the earth have a PhD? it is just unreal.

    i don't make personal attacks because i disagree, and nothing that i said was irrelevant to this subject. Girma has demonstrated some serious lack of understanding of the most basic stuff in the past. i did provide quotes, links and counterexamples.

    if you insist on using the term “ad hom” differently, then please don't use it for potentially insulting phrases that are the complete opposite of an “ad hom”.

    The meanings of expressions really do change. There is never a fixed meaning. Consider: Rule of thumb.

    now it is obvious that the expression (expression, not logical argument!) comes from the use of the thumb as a measuring tool.
    but i am curious: what do you think the term “rule of thumb” meant in the past?

  43. TTCA:

    But GISS’s forcings are just ONE of a set of claims with large uncertainty that is embraced by the modelers. Why only take GISS as representing the official story? Maybe some other modeling group would object that their forcing history is the one people should believe and supports “the science” of AGW.

    The models are not all equal, so one chooses the best models to compare against, not the average one.

    I’ll ask the question again, are you aware of a model that doesn’t arrive at the conclusion that prior to 1980 anthropogenic CO2 and sulfate emissions were nearly balanced?

  44. Sod:

    i can t stop you, from using a term the wrong way.

    Nonetheless, a refresher course in linguistic analysis would help you here; you are wrong on this account.

    i don t make personal attacks because i disagree and nothing that i said, was irrelevant to this subject.

    ORLY?

    Name one example where you attacked a person you agreed with.

  45. ORLY?

    Name one example where you attacked a person you agreed with.

    logic isn't your strong point either. (why not admit your errors above? my graph was important, yours were not!)

    i sometimes use strong words when people continue to make the same obviously false claims.

    anyone who has taken a look at the claims made by Girma in the past will struggle to find words that are suitable for print. everything i said was an attempt at description, not a baseless personal attack.

    that you folks give him a pass on the “eating removes mass” claim is beyond belief!

  46. just take a look at this graph from Girma:

    he really thinks that the space between the linear trend and the overlaid sine function has some special meaning.
    (weirdest “error bars” ever)

    shouldn't this tell us something about his math skills?

  47. 46-I don’t know the exact forcing histories that each and every model uses. Given the range of uncertainty, it isn’t unreasonable that at least a few models look pretty different in the earlier period.

  48. Sod:

    logic isn't your strong point either. (why not admit your errors above? my graph was important, yours were not!)

    Sod, I’m pretty sure the word “logic” means something different from its ordinary use in the language. LOL.

    As to the figures… heh.

    large view

    Note that the pink curve doesn’t diverge (by more than 0.1°C) until circa 1975.

    It doesn’t say anything different. Mine is one model; I believe this is a composite of many. They appear to agree.

    There’s the full link.

    (I’ll assume Sod’s URL kung fu is as good as his graph reading and linguistic skills. And his sense of humor. 🙂 )

    i sometimes use strong words, when people continue to make the same obviously false claims.

    You “sometimes” use strong words.

    In the sense of the sun “sometimes” rises in the East!

  49. TTCA:

    46-I don’t know the exact forcing histories that each and every model uses. Given the range of uncertainty, it isn’t unreasonable that at least a few models look pretty different in the earlier period.

    Well if you’re going to make claims… you need to back them up. You guys shouldn’t leave it to me to carry your water for you all of the time.

    In the meantime, Sod has helped us by “finding” a reference that combines the models. See my link in the figure above.

    It does indeed appear ubiquitous. See also my comment on “good” versus “bad” models. Since is not a democracy of course. Bad models don’t get a vote.

  50. Dammit. I blew a perfectly good sound bite. Should have said “Science is not a democracy of course. Bad models don’t get a vote.”

  51. Steven Mosher is correct that we need to provide a physical/energy mechanism to explain the curve fitting. It cannot be magic, there has to be a physical/physics rationale behind the temperature changes/cycles.

    For the ocean cycles, it is not hard to demonstrate that for an El Nino, additional energy is held in the atmosphere and for a La Nina, extra energy is lost from the Earth system to space.

    Outgoing Longwave Radiation (OLR) actually falls by a massive 50 W/m2 to 90 W/m2 over the Nino 4 and Nino 3.4 regions, lagging about 2 months behind an El Nino (tropical convection storms hold the heat in). Combined with the prevailing atmospheric circulation patterns, this explains why the equatorial region and northern North America warm up/cool down in a lagged fashion with the ENSO.

    It is not a stretch then to imagine that the AMO also provides a similar energy exchange with the atmosphere. [The AMO itself seems to respond to large ENSO events with a lag of 4 to 6 months].

    So we have a physical explanation for the temperature impact of the ocean cycles. In the long term, these changes are just a plus/minus and should average out to an impact of zero over 50 to 100 years, but in the short term (1 year for the ENSO and 60 years for the AMO), the ocean cycles are going to affect the global temperature trends.
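
    As a rough sketch of how one could check the lags quoted above: correlate a Nino3.4 index against an OLR series at a range of lags and look for the most negative correlation (OLR falls after an El Nino). The monthly arrays here are placeholders for the real data:

    ```python
    # Rough illustration: find the lag (in months) at which OLR drops most
    # strongly after a Nino3.4 peak. `nino34` and `olr` are placeholder arrays.
    import numpy as np

    def strongest_olr_lag(nino34, olr, max_lag=12):
        corrs = []
        for lag in range(max_lag + 1):
            x = nino34[: len(nino34) - lag]  # the index leads
            y = olr[lag:]                    # OLR lags by `lag` months
            corrs.append(np.corrcoef(x, y)[0, 1])
        return int(np.argmin(corrs))         # lag of the most negative correlation
    ```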

    —————–

    Over at Roger Pielke Sr.’s blog, Josh Willis noted a new paper is coming out that tracks the ocean heat content down to the bottom. Only 0.1 W/m2 is going into the deep, deep oceans …

    http://pielkeclimatesci.wordpress.com/2010/04/19/further-feedback-from-kevin-trenberth-and-feedback-from-josh-willis-on-the-ucar-press-release/

    … so Trenberth’s Missing Energy is not missing after all – it is NOT there – it is escaping into space.

    Trenberth also published a paper in 2009 …

    Click to access energydiagnostics09final.pdf

    … which contains this curious graph: the usual radiative forcing diagram we are used to seeing from the IPCC/Hansen, but with an additional box at the end which, for the first time, also includes the “feedbacks” that are expected from water vapour and ice albedo (bigger than the net GHG forcing, of course), plus a new term, “Negative Radiative Feedback”, which is supposed to cover the Missing Energy / Forcing that can be found / A Mysterious Negative Feedback / Error in the Global Warming Theory: a huge -2.7 W/m2.

    ———–

  52. Steve Mosher has my vote on the models issue. Getting a wonderful fit to past data does not mean very much unless there are some physical realities underlying the equations. Bill Illis demonstrated this point very clearly early in this thread (#3).

    Even though the GCMs look really bad compared to some of the cute curve-fitting exercises, GCMs can be improved to the point that they make useful predictions. Roy Spencer explains this rather well at:
    http://www.drroyspencer.com/2009/07/how-do-climate-models-work/

    Even when our understanding has improved to the point that we have GCMs that can model the climate on a “business as usual” basis, there are still many events that are essentially unpredictable over the long term. For example, can anyone say when the next volcanic eruption to match Mount Tambora (1815) will occur?

  53. Carrick

    In the IPCC, bad models do get a vote: in the mails it’s referred to as the democracy of models. For attribution studies they use a subset; that’s because they want tighter CIs (I suppose).

  54. Girma said

    “Question: Is the globe going to have further warming or cooling in the coming 10 to 20 years? How will you find that out? What is available now?”

    There are several ways to estimate this. Doing a curve fit to the last 129 years is NOT your best tool.

    If I had to bet today, there would be two tools I would consider, depending on your budget and time frame:

    You want an answer today? I would use Lucia’s lumped parameter model.

    I would have to estimate the future forcings; I’d use a spread of values. That would give me an estimate and a plus-or-minus.

    You got a bigger budget and more time? You’d have to run some GCMs. Same as if you asked me to calculate a really hard problem in protein folding: time, money, iron. It’s not clear that your estimate of temperature would be any better than Lumpy’s,
    but you’d have a lot more predictands.

    Or you can just take the IPCC estimate of 0.2C per decade and throw reasonable CIs on it. Models are running a bit hot now,
    for which there are several explanations. You’ll see that I guess 0.15C per decade. Lukewarmer.
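
    For what it’s worth, a minimal one-box (“lumped parameter”) sketch of the kind of tool I mean is below. The parameter values and forcing scenarios are illustrative assumptions, not Lucia’s calibration:

    ```python
    # One-box energy balance: C * dT/dt = F(t) - lambda * T.
    # Parameters and forcing paths are illustrative assumptions only.
    import numpy as np

    def one_box(forcing, lam=1.3, heat_cap=8.0, dt=1.0):
        """forcing in W/m2 (one value per year), lam in W/m2/K,
        heat_cap in W yr/m2/K; returns temperature anomaly in K."""
        T = np.zeros(len(forcing))
        for i in range(1, len(forcing)):
            T[i] = T[i - 1] + dt * (forcing[i - 1] - lam * T[i - 1]) / heat_cap
        return T

    # A spread of assumed future forcing paths gives an estimate plus a plus-or-minus:
    years = np.arange(2010, 2031)
    paths = [rate * (years - years[0]) for rate in (0.02, 0.04, 0.06)]  # W/m2 per year
    warming_2030 = [one_box(f)[-1] for f in paths]                      # K above 2010
    ```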

  55. Sod,

    The worst thing he did on his chart was draw the 0.2C-per-decade line from a single point on the temperature curve. I suppose it could be worse; he could have pegged it onto 1998.

  56. Bill:

    Steven Mosher is correct that we need to provide a physical/energy mechanism to explain the curve fitting. It cannot be magic, there has to be a physical/physics rationale behind the temperature changes/cycles.

    There has to be MORE than a rationale. The rationale merely says it would be good to WRITE AN EQUATION for this. In the end, physics is writing an equation. That equation has variables. Variables are ontological commitments.

    When we write F=MA we are not just specifying a function that fits experimental data. We are making ontological commitments: we are positing the properties force, mass, acceleration. At its heart, that is what makes a model physical:
    ontological commitments. So yes, having that physical rationale (err, heat is moving around in some pattern) is the START of an explanation. And seeing the REGULARITY tells us that it may be amenable to mathematical representation. Those two things tell us it’s not a waste of time to try to write equations for this.

  57. Howard

    I’m no fan of the models, but the easy skepticism about them (the models are all wrong) is just that: easy.

    Should they be used for policy? Good question.

  58. Steven Mosher, you probably are aware that I don’t favor combining the models as a method of statistical analysis. The 0.1°C that I mention above is, by the way, my eyeball estimate of the error bar from that figure (I could have used 0.2°C).

    In my opinion, it’s a meaningless statistical treatment, but again, my hardline view is that this is where the consensus in the field wants us to start from. So we start from there.

  59. All of you linguistic nazis who persist in denying the fact that language changes can kiss my butt.
    You can't fight linguistic change. I bet you morons think you shouldn't end a sentence with a preposition.
    None of you even know when that rule was invented. You prescriptivist totalitarians.

  60. Steve Fitzpatrick

    one poster-child word for linguistic change is demagogue.

    pejoration is one of my favorite processes.

    Bloomfield is a wonderful resource and probably the classic text on the matter. Or these:

    Blank, Andreas (1999), “Why do new meanings occur? A cognitive typology of the motivations for lexical semantic change”, in Blank, Andreas; Koch, Peter (eds.), Historical Semantics and Cognition, Berlin/New York: Mouton de Gruyter, pp. 61–90.
    Blank, Andreas; Koch, Peter (1999), “Introduction: Historical Semantics and Cognition”, in Blank, Andreas; Koch, Peter (eds.), Historical Semantics and Cognition, Berlin/New York: Mouton de Gruyter, pp. 1–16.

    Silly are the goddy tawdry maudlin for they shall christgeewhiz bow down before him: bedead old men, priest and prester, babeling a pitterpatternoster: no word is still the word, but, a loafward has become lord.

    Google that; i didn't write it.

    And for grins: counterfeit, garble, manufacture.

    folks should go look up the original meanings of those.

    and don't get me started on Cleave.

  61. She offered her honor, he honored her offer and… u know the rest

    WRT language change I'm with Dr Johnson, in the preface to his dictionary:

    When we see men grow old and die at a certain time one after
    another, century after century, we laugh at the elixir that promises to
    prolong life to a thousand years; and with equal justice may the
    lexicographer be derided, who being able to produce no example of a
    nation that has preserved their words and phrases from mutability, shall
    imagine that his dictionary can embalm his language, and secure it from
    corruption, and decay, that it is in his power to change sublunary nature,
    or clear the world at once from folly, vanity, and affectation.

  62. dave:

    “No, it has not. Just because you misuse it, and everyone in your particular social circle misuses it, does not make you correct. You no doubt disagree with this – in which case I suggest you argue your case on the relevant wikipedia talk section, under the heading “Common misconceptions about ad hominem”, and see how far you get.”

    You don't understand how language changes. DICTIONARIES, even online dictionaries, are ALWAYS AND FOREVER tools of conservatism. They are always in lag. Someday, when they write the history of the change in the term ad hom, they will point to this discussion.

    linguistic anarchy now!

  63. Steven Mosher,
    Since all of the IPCC models are totally inadequate in resolution, and assume a strong positive feedback to CO2 (not demonstrated), and since clearly the physics of ocean currents, aerosols, and even aircraft contrails is not properly modeled, the assumption that the models have a chance to give long-term trends is totally unjustified at this time. Being the best you can do is not the same as being useful. Curve fitting and simple extrapolation do not contain any physics, but they are nevertheless probably the best way to extend the curve to the next half cycle, given the previously observed effects of several ocean cycles and the present low solar activity. The extension into the future obviously will not necessarily hold, but the indicators do support the short-term trend. Curve fitting of recent past history is actually all the models have done, too. The arbitrary added terms and coefficients used to fit that data to actual real pieces of physics do not make the model physics.

  64. #69 – Old Chinese is equivalent to Latin – i.e. only scholars understand it but average people may recognize some words.

  65. What we must not forget is that we need an answer.

    1. IPCC claims a warming of 0.2 deg C per decade and accelerated warming [Figure 5]

    2. Not us, but they themselves in private say “slow changes over past decade” [REF 4] and “the level has really been quite stable since 2000 or so and 2008 doesn’t look too hot” [REF 5] (so in private they don’t agree with point 1)

    3. The question each of us has to answer is whether, in the next 10 to 20 years, the globe will have further warming or cooling.

    Based on my article, as the GMTA pattern was valid for 129 years, I bet anyone that we will have global cooling in the next 10 to 20 years.

  66. Sorry Girma, but unless someone can establish a credible physical explanation for why the temperature should follow your sinusoidal pattern, it should just be viewed as curve fitting, of minimal predictive value. People have lost a lot of money in the stock markets using what is called “chartism”: finding patterns in stock market activity and betting that they will be repeated. I can’t see why this is any different. It could have a use in highlighting how weak the GCMs are at present, however. The big question to me is whether a new generation of GCMs can be effective in modelling the extremely complex components of global climate, given increasing computer power. They are handicapped by poor data about initial conditions, though, and in terms of estimating forcing, if they are using any of Mann’s paleo stuff to build in the last 1000 years’ temperatures, they are doomed from the start.
    Carrick, when you have finished your project I would love you to do a post as requested by Jeff; I enjoyed the discussion over at Lucia’s the other week and would appreciate your collating your thoughts.

  67. Leonard W:

    Where did you get the idea that feedbacks for CO2 were assumed in the models? As for low solar activity, are you talking about TSI or sunspot nonsense?

  68. I’m somewhat surprised by the reaction to this posting.

    If I understand what Girma is doing here, it is very simply an attempt to find patterns in the high-quality temperature record for the recent past.

    And he has identified two very simple trends in the available data, one linear, the other sinusoidal.

    Surely that by itself is very significant. Obviously, much work would be needed to identify the physical mechanisms in play, and the link to the PDO is one possible component.

    But surely the important part of this analysis is that there is simply no need to include CO2 levels, clouds, aerosols, etc…. and Occam’s Razor would imply that the existing extremely complex models may be looking for trends that do not exist.

    Obviously the model has no predictive power, but that’s not what it is for, as far as I can tell.

  69. Bill Illis #55,

    Thanks for the link to the exchange between Pielke, Trenberth, and Josh Willis. Very interesting. My personal take: Willis pretty much says “Kevin, you are in denial. All the research groups calculating ocean heat content reach close to the same conclusions: not much heat has been accumulated in the oceans since 2003.”

    Trenberth continues to reject all reports that the ocean is not accumulating much heat, since without accumulated ocean heat it’s awfully hard to explain how radiative forcing estimates and climate sensitivity estimates from models are consistent with measured surface temperatures. I can suggest a very simple explanation: the actual sensitivity is about 1.5 C per doubling of CO2, not 3+ C, and the modelers have used aerosols “estimates” to cancel about half of the true radiative forcing.
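
    The arithmetic of that last point is easy to check: equilibrium warming scales as S * log2(CO2/CO2_preindustrial), and with round numbers (roughly 390 ppm today against 280 ppm preindustrial) the two sensitivities give very different answers:

    ```python
    # Back-of-envelope equilibrium warming for a given sensitivity per doubling.
    # 390 and 280 ppm are round illustrative numbers.
    import math

    def equilibrium_warming(sensitivity_per_doubling, co2=390.0, co2_pre=280.0):
        return sensitivity_per_doubling * math.log2(co2 / co2_pre)

    print(equilibrium_warming(1.5))  # about 0.7 C
    print(equilibrium_warming(3.0))  # about 1.4 C
    ```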

  70. re #77,

    It should have been “… the modelers have used aerosols and heat accumulation estimates..”

  71. Gregory

    The reason I chose the cosine function is that when I detrend the observed GMTA data (remove the trend), I get the following oscillating anomaly, which can be approximated by a cosine function.

    http://www.woodfortrees.org/plot/hadcrut3vgl/from:1880/detrend:0.706/offset:0.52/compress:12

    Gregory, I am not sure we can use my model for more than 20 or 30 years, as we had only 129 years of data. My main motivation was that the IPCC’s prediction of 0.2 deg C per decade for this and the next decade had failed, and what I am trying to find out is whether we will have further warming or cooling in the next 20 years.
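
    For reference, a minimal sketch of that detrend-then-fit step, run on a synthetic placeholder series rather than the actual HadCRUT3 data from the link (the coefficients below are illustrative, not my fitted values):

    ```python
    # Detrend a temperature series, then least-squares fit a 60-year oscillation.
    # The series here is synthetic; real use would load the HadCRUT3 anomalies.
    import numpy as np

    year = np.arange(1880, 2009, dtype=float)
    rng = np.random.default_rng(1)
    gmta = (0.0059 * (year - 1880) - 0.52                       # linear trend
            + 0.3 * np.cos(2 * np.pi * (year - 1880) / 60.0)    # 60-year cycle
            + 0.1 * rng.standard_normal(len(year)))             # noise

    resid = gmta - np.polyval(np.polyfit(year, gmta, 1), year)  # detrended anomaly
    basis = np.column_stack([np.cos(2 * np.pi * year / 60.0),
                             np.sin(2 * np.pi * year / 60.0)])
    coef, *_ = np.linalg.lstsq(basis, resid, rcond=None)
    amplitude = np.hypot(coef[0], coef[1])                      # fitted cycle amplitude
    ```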

    Thank you.

    Girma

  72. Sod,

    Still waiting to see YOUR bio/background posted up.

    The fact that you post on every other thread with rude or critical comments, yet refuse to post on the background one, is, umm… telling.

    Doc

  73. Steveta_uk:

    If I understand what Girma is doing here, it is very simply an attempt to find patterns in the high-quality temperature record for the recent past.

    Well there are much better ways of analyzing the frequency content of a signal… detrended FFT is one of them.

    And Girma is right: there is spectral structure in the temperature signal. This spectral analysis of a long-duration (1000-year) temperature record is one example:

    figure

    Note the 55-year period, which is close to his 60-year period.

    This is the dominant frequency component in the spectral analysis (meaning it has the largest amplitude), so it isn’t surprising that, together with the linear trend, it explains much of the variance.

    Anyway, the issue that I have (and others probably) isn’t with this part of the analysis; it’s with how he uses it. You can’t use this to predict future changes in temperature. At best, what you can say is that the observed spectral fluctuations should carry forward into the future. This might be useful for Monte Carlo studies of how long a period you would need to observe in order to resolve a given CO2 climate sensitivity at a high confidence level.
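
    As a sketch of what I mean by a detrended FFT (run here on a synthetic stand-in for a long temperature record, not real data):

    ```python
    # Detrend, window, and FFT a 1000-year synthetic series; report the
    # dominant period. Real use would substitute an actual reconstruction.
    import numpy as np

    n = 1000
    t = np.arange(n, dtype=float)
    rng = np.random.default_rng(2)
    temp = (0.0005 * t + 0.1 * np.sin(2 * np.pi * t / 55.0)
            + 0.05 * rng.standard_normal(n))

    detrended = temp - np.polyval(np.polyfit(t, temp, 1), t)
    spectrum = np.abs(np.fft.rfft(detrended * np.hanning(n)))
    freq = np.fft.rfftfreq(n, d=1.0)                 # cycles per year

    dominant_period = 1.0 / freq[1:][np.argmax(spectrum[1:])]
    print(f"dominant period ~ {dominant_period:.0f} years")  # close to 55 here
    ```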

  74. Steve Mosher,
    #64. What disgraceful linguistic anarchy. I was always taught that ending a sentence with a preposition was something up with which you should not put.

  75. Paul:
    Prezakly! It is, after all, a combination of two elements:
    “pre” and “position”. If it had been intended for use at the end, it would have been called a “postposition”. Which is a horse race of an altogether different color!

  76. Paul_K said

    April 27, 2010 at 12:12 pm
    Steve Mosher,
    #64. What disgraceful linguistic anarchy. I was always taught that ending a sentence with a preposition was something up with which you should not put.

    There is always a group of people who try to fight against language change. In the 18th century there was a group of thinkers, whom we refer to as the Grammarians, who tried to impose a set of prescriptive rules (like no double negatives and no ending sentences with prepositions). The double-negative argument was put forward on the specious grounds that two negatives cancel each other, as if language were MATH or LOGIC. Truth be told, we often use double negatives for emphasis.
    People don't compute language like they compute logic. The preposition at the end of the sentence? As I recall, this rule was derived from Latin. That is, the Grammarians thought that Latin was the model language, and because one CAN'T end a sentence with a preposition in LATIN, one should not in English. However, sentences that end in prepositions are common in English and in other related languages, like German, where a separable prefix is often REQUIRED to go at the end of the construction:

    Something like that… it's been a while.

  77. A perfect example of ad hominem, of course by a “sceptic”:

    http://scienceblogs.com/deltoid/2010/04/leakegate_leake_based_story_re.php

    Rahmstorf has taken a look at the Pachauri novel and finds that the content has been misrepresented in the press:

    Rahmstorf comments:

    After I have read it, I find this a bizarre summary of the novel, apparently aimed to discredit Pachauri.

    There is not a single “lecture on climate change” in the whole book – the theme is only mentioned in passing in about five sentences, e.g. when the hero visits the Himalayas and refers to the shrinking Gangotri glacier.

    The novel is in fact neither “environmental” nor “raunchy”. To the contrary: the novel’s hero Sanjay Nath, whose life story is told, lives in complete celibacy for most of his life. At first as a student in India that’s against his will (his love to a student from Calcutta is unrequited). Then, as a postdoc in the US he sleeps with a woman for the first time, a stranger in a motel (on page 211 of the novel), an experience which makes him decide to live in celibacy for the next five years, which in fact he does for over twenty years. Only in his late forties does he meet the woman of his life and very quickly decides to marry her – but during the preparations of the wedding she tragically dies from cancer. The handful of love scenes in the book is in fact described just in a few sentences and wouldn’t even get a 15-year-old boy excited.

    Perhaps the Sunday Times mixed up Pachauri’s book with Ian McEwan’s new novel “Solar”? That does in fact include a full lecture on climate change (and a good one, too) as well as some far more explicit sex than Pachauri’s “Return to Almora”. (Which, in fact, to western audiences could be much more easily ridiculed not for its sex scenes but for its central theme of reincarnation.)

    reply by a sceptic:

    Rahmstorf obviously isn’t into serious literature. Does that say something about his science too?

    Just asking.

    Posted by: Dave Andrews | April 27, 2010 4:25 PM
