Bo Christiansen Variance loss

This is a repost from Eduardo Zorita’s blog, regarding a paper produced by Bo Christiansen which analyzed variance loss in climate reconstructions.  It’s a science post so keep all comments/questions on that topic or face the red heat laser of doom.


Mann 07 claimed that RegEM didn’t create variance loss and offered pseudoproxy experiments as proof.  That claim was of course shown false here recently in a few simple posts, and the ‘difference of opinion’ was traced to the artificially trendless noise added to the Mannian pseudoproxies.

This is a very ‘hot’ field in climatology because reasonable scientists have finally recognized the math issue that I found obvious literally within moments of reading my first hockey stick paper at CA.  I’m not that smart, BTW, so why is it that Mann can find every excuse to miss the point?

Anyway, I read the pre-print of this paper, but if someone can send a non-paywalled copy of the final version, I would appreciate it.  There is a cool equation in the pre-print which would make a fun post by itself.

This paper again demonstrates what I consider the primary reason for the ‘unprecedentedness’ and repeatability of so many hockey stick proxy papers.   Remember, the primary defense of the hockey sticks is that so many studies have reproduced the result.  The reason is twofold: at CA Steve has focused on the proxies themselves; here, we have focused on the math.

BTW, I doubt any of these scientists are as skeptical of other AGW claims as some of us are, so don’t attach the usual tAV fare to them.  That doesn’t mean they don’t do good, honest and open science – just don’t lump their fine work in with the opinions of those of us who are outsiders.

————

Guest post by Bo Christiansen: On temperature reconstructions, past climate variability, hockey sticks and hockey teams

Posted by eduardo
Bo Christiansen from the Danish Meteorological Institute works actively on the development of statistical methods for climate reconstructions, a field of intense debate in the past few years. Hopefully we will enter a phase in which scientific debates remain .. well, in the scientific realm. Enjoy his post.

Anthropogenic emissions of greenhouse gases – in particular CO2 and methane – change the radiative properties of the atmosphere and thereby impose a tendency of heating at the surface of the Earth.  In the past the Earth’s temperature has varied both due to external forcings, such as volcanic eruptions and changes in the sun, and due to internal variability in the climate system.  Much effort has in recent years been made to understand and project man-made climate change.  In this context the past climate is an important resource for climate science, as it provides us with valuable information about how the climate responds to forcings. It also provides a validation target for climate models, although paleoclimate modelling is still in its infancy.  It should be obvious that we need to understand past climate variability before we can confidently predict the future.

Fig 1. Pseudo-proxy experiments with seven different reconstruction methods. The black curve is the NH mean temperature, the target which we hope the reconstructions will capture. But this is not the case: all reconstructions underestimate the pre-industrial temperature level as well as the amplitude of the low-frequency variability. Note that the reconstructions are very good in the last 100 years, which have been used for calibration. The three panels differ in the strength of the variability of the target. From Christiansen et al. 2009.
Unfortunately, we do not have systematic instrumental measurements of the surface temperature much further back than the mid-19th century. Further back in time we must rely on proxy data. The climate proxies include tree rings, corals, lake and marine sediment cores, terrestrial bore-hole temperatures, and documentary archives. Common to all these sources is that they include a climate signal, but that this signal is polluted by noise (basically all non-climatic influences, such as fires, diseases etc.). From these different noisy proxies, information such as the global mean surface temperature must be extracted. A famous and pioneering example is the work by Mann et al. 1998, in which the mean NH temperature is relatively constant with a weak decreasing trend from 1400-1900, followed by a sharp rise in industrial times – the so-called ‘hockey stick’. There has been much debate about this reconstruction, and its robustness has been questioned (see e.g.). However, some other reconstructions have shown a similar shape, and this has encouraged some to talk about the ‘hockey team’ (e.g., here). This partial agreement between different reconstructions has also led to statements such as ‘It is very likely that average Northern Hemisphere temperatures during the second half of the 20th century were higher than for any other 50-year period in the last 500 years’ by the IPCC. That different reconstructions show a ‘hockey stick’ would increase its credibility – unless the different reconstructions all shared the same problems. We shall see below that this is unfortunately the case.

All proxies are infected with noise. To extract the climate signal – here the NH mean temperature – from a large set of noisy proxies, different mathematical methods have been used. They are all, however, based on variants of linear regression. The model is trained, or calibrated, using the last period where we have access to both proxies and instrumental data.
This calibration period is typically the last 100 years.  When the model has been trained, it is used to estimate the NH mean temperature in the past (the reconstruction period), where only the proxies are known.  To test such methods it is useful to apply them to long simulations from climate models. As in the real-world situation, we split the total period into a calibration period and a reconstruction period. But here we know the NH mean temperature in the reconstruction period as well, so it can be compared with the reconstruction.  The proxies are generated by adding noise to the local temperatures from the climate model. The model-based scheme described above is known as the ‘pseudo-proxy’ approach and can be used to evaluate a large number of aspects of the reconstruction methods: how the different methods compare, how sensitive they are to the number of proxies, etc.

Inspired by previous pseudo-proxy studies, we decided to systematically study the skill of seven different reconstruction methods. We included both methods that directly reconstruct the NH mean temperature and methods that first reconstruct the geographically distributed temperatures. The method used by Mann et al. 1998 was included, as well as two versions of the RegEM method later used by this group. Perhaps surprisingly, the main conclusion was that all the reconstruction methods severely underestimate the amplitude of low-frequency variability and trends (Fig. 1). Many of the methods could reproduce the NH temperature in the calibration period in great detail but still failed to get the low-frequency variability in the reconstruction period right. We also found that all reconstruction methods have a large element of stochasticity; for different realizations of the noise or of the underlying temperature field the reconstructions are different. We believe this might partly explain why some previous pseudo-proxy studies have reached different conclusions.
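The core effect is easy to reproduce numerically. The following is a minimal toy version of the pseudo-proxy idea – not the setup of Christiansen et al. 2009, and all parameters (AR(1) persistence, noise level, series length) are arbitrary choices for illustration: a persistent ‘temperature’ series, one noisy pseudo-proxy, and temperature regressed on the proxy in a 100-year calibration window.

```python
import numpy as np

def ar1(n, phi, sd, rng):
    """Red-noise series with persistence phi: a crude stand-in for NH temperature."""
    x = np.empty(n)
    x[0] = 0.0
    for i in range(1, n):
        x[i] = phi * x[i - 1] + sd * rng.standard_normal()
    return x

rng = np.random.default_rng(42)
n, ncal = 1000, 100                      # 1000 "years"; calibrate on the last 100
cal, rec = slice(n - ncal, n), slice(0, n - ncal)

ratios = []
for _ in range(20):                      # several noise realizations, pseudo-proxy style
    target = ar1(n, 0.9, 0.2, rng)       # the "true" NH mean we hope to recover
    proxy = target + 0.6 * rng.standard_normal(n)  # climate signal polluted by noise

    # direct regression: temperature regressed ON the proxy, fitted in calibration only
    b, a = np.polyfit(proxy[cal], target[cal], 1)
    recon = a + b * proxy

    # amplitude of the reconstruction relative to the target, outside calibration
    ratios.append(np.std(recon[rec]) / np.std(target[rec]))

print(np.mean(ratios))  # systematically below 1: variance loss outside calibration
```

The regression slope shrinks toward zero as the proxy gets noisier (roughly by the calibration-period r²), so the reconstruction comes out systematically too flat no matter how well it fits the calibration window.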
It is important to note the two different kinds of errors, which are examples of what is known in statistics as ensemble bias and ensemble variance. While the variance may be reduced by taking the average over many reconstructions, the same is not true for the bias. Thus, all the reconstruction methods in our study gave biased estimates of the low-frequency variability. We now see the fallacy of the ‘hockey team’ reasoning mentioned above: if all reconstruction methods underestimate the low-frequency variability, then considering an ensemble of reconstructions will not be helpful.
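The distinction can be demonstrated with a toy pseudo-proxy calculation (a hypothetical sketch with made-up parameters, not the paper’s experiment): fifty reconstructions of one fixed target, each from a fresh noisy proxy. Averaging shrinks the member-to-member scatter, but the ensemble mean stays too flat, because every member shares the same amplitude bias:

```python
import numpy as np

rng = np.random.default_rng(1)
n, ncal = 1000, 100
cal, rec = slice(n - ncal, n), slice(0, n - ncal)

# one fixed target whose pre-calibration "history" we try to recover
t = np.arange(n)
target = 0.6 * np.sin(2 * np.pi * t / 400) + 0.05 * rng.standard_normal(n)

recons = []
for _ in range(50):                                # an "ensemble" of reconstructions
    proxy = target + 0.7 * rng.standard_normal(n)  # same target, fresh proxy noise
    b, a = np.polyfit(proxy[cal], target[cal], 1)  # direct regression, as before
    recons.append(a + b * proxy)
recons = np.array(recons)
ens_mean = recons.mean(axis=0)

# averaging reduces the ensemble variance (member-to-member scatter) ...
print(recons[:, rec].std(axis=0).mean())
# ... but not the ensemble bias: the mean reconstruction is still far too flat
print(np.std(ens_mean[rec]) / np.std(target[rec]))
```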
The question that arises now is whether the systematic underestimation of low-frequency variability can be avoided. Based on an idea by Anders Moberg and on theoretical considerations, I formulated a new reconstruction method, LOC, which is based on simple regression between the proxies and the local temperatures to which each proxy is expected to respond. To avoid the loss of low-frequency variance it is important to use the proxy as the dependent variable and the temperature as the independent variable. When the local temperatures have been reconstructed, the NH mean is found by averaging. Pseudo-proxy studies (Fig. 2) confirm that the low-frequency variability is not underestimated with this method. However, the new reconstruction method will overestimate the amplitude of high-frequency variability. This is the price we must pay; we cannot totally remove the influence of the noise, but we can shift it from low to high frequencies. The influence of the noise on the high-frequency variability can be reduced by averaging over many independent proxies or by smoothing in time.
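The direction of the regression is the crux, and it can be seen in a toy calculation (my own illustrative numbers; real LOC works proxy by proxy on local temperatures before averaging). Below, the ‘LOC-style’ reconstruction regresses the proxy on temperature in the calibration window and then inverts the fitted line; after decadal-style smoothing its amplitude tracks the target, while the direct regression stays too flat:

```python
import numpy as np

def lowpass(x, w=51):
    """Running-mean smoother, a crude stand-in for decadal smoothing."""
    return np.convolve(x, np.ones(w) / w, mode="same")

rng = np.random.default_rng(7)
n, ncal = 1000, 100
cal, rec = slice(n - ncal, n), slice(0, n - ncal)

t = np.arange(n)
target = 0.5 * np.sin(2 * np.pi * t / 300) + 0.1 * rng.standard_normal(n)
proxy = target + 0.3 * rng.standard_normal(n)     # modest proxy noise

# direct regression: temperature ON proxy -> shrinks the amplitude
bd, ad = np.polyfit(proxy[cal], target[cal], 1)
recon_direct = ad + bd * proxy

# LOC-style: proxy regressed ON temperature, then the fit is inverted
bl, al = np.polyfit(target[cal], proxy[cal], 1)
recon_loc = (proxy - al) / bl

lo = lambda x: np.std(lowpass(x)[rec])            # low-frequency amplitude
print(lo(recon_direct) / lo(target))              # typically well below 1
print(lo(recon_loc) / lo(target))                 # close to 1
# the price: the unsmoothed LOC reconstruction carries extra high-frequency noise
print(np.std((recon_loc - lowpass(recon_loc))[rec]),
      np.std((target - lowpass(target))[rec]))
```

Inverting the proxy-on-temperature fit divides the proxy noise by the slope instead of multiplying the signal by a shrunken one, so the noise ends up at high frequencies where averaging over proxies or smoothing in time can beat it down.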
Fig. 2 Pseudo-proxy experiment showing the ability of the new method (LOC, blue curve) to reconstruct the low-frequency variability of the target (black curve). Here the target has been designed to include a past warm period. This past warm period is well reconstructed by the new method but not by the two versions of the RegEM method. From Christiansen 2010.
I have applied the new reconstruction method, LOC, to a set of 14 decadally smoothed proxies which are relatively homogeneously distributed geographically over the extra-tropical NH. This compilation of proxies was used in the reconstruction by Hegerl et al. 2007.  The proxies cover the period 1505-1960, the calibration period is 1880-1960, and observed temperatures are from HadCRUT2v.  The result is shown in Fig. 3 together with eight previous reconstructions.  The new reconstruction has a much larger variability than the previous reconstructions and reports much colder past temperatures. Whereas previous reconstructions hardly reach temperatures below -0.6 K, the LOC reconstruction has a minimum of around -1.5 K.  Regarding the shape of the low-frequency variability, the new reconstruction agrees with the majority of the previous reconstructions in the relatively cold temperatures in the 17th century and in the middle of the 19th century, as well as in the relatively warm temperatures at the end of the 18th century. I consider these real-world results mainly as an illustration of the potential of the new method, as reconstructions based on decadally resolved proxies are not particularly robust due to the small number of degrees of freedom. Work is in progress to apply the new method to an annually resolved and more comprehensive proxy compilation.
Fig. 3 The new real-world reconstruction (thick black curve) shown together with some previous reconstructions. All reconstructions are decadally smoothed and centered to zero mean in the 1880-1960 period. From Christiansen 2010.

Where does all this lead us? It is very likely that the NH mean temperature has shown much larger past variability than captured by previous reconstructions. We cannot from these reconstructions conclude that the last 50-year period has been unique in the context of the last 500-1000 years. A larger variability in the past suggests a larger sensitivity of the climate system. The climate sensitivity is a measure of how much the surface temperature changes given a specified forcing. A larger climate sensitivity could mean that the estimates of the future climate changes due to increased levels of greenhouse gases are underestimated.

149 thoughts on “Bo Christiansen Variance loss”

  1. A larger climate variability in the past, as is hypothesised above, indicates to me that we do not understand the past as well as we thought we did, or at least as well as the hockey team thought it did. It does not by itself imply a greater climate sensitivity, but rather that the forcings are not well understood. To me the implication is that natural forcings are larger than previously thought by the hockey team, and that they account for a greater proportion of 20th century climate change. It also leads me to suspect that anthropogenic forcing could be proportionally smaller than previously thought by the hockey team. So there is every chance that more of the 20th century increase in global temp is natural than is acknowledged by the hockey team. All this is possible with little change to theorised climate sensitivity.

  2. #1 It’s an interesting point, sometimes made, that those who argue for a higher historic variance are arguing for a higher sensitivity. In my view, higher variance means higher fluctuation (forced or unforced) in average temp. Climate models don’t show that, of course, so the government meme is that skeptics are arguing for higher sensitivity; it also provides cover for reasonable scientists to make a comment in publication. You can sometimes see this point made in papers which are daring enough to be somewhat critical of the consensus.

  3. The core question is then “sensitivity to WHAT?” Physics says CO2 opacity is achieved with about 30m of atmosphere, so changing its concentration is irrelevant.

  4. “A larger variability in the past suggests a larger sensitivity of the climate system. The climate sensitivity is a measure of how much the surface temperature changes given a specified forcing. A larger climate sensitivity could mean that the estimates of the future climate changes due to increased levels of greenhouse gases are underestimated.”

    Quite frankly this is nonsense! And it annoys me intensely that climate scientists make such a statement.

    If the temperature of the planet was greater in the recent multi-millennial past – and, based on the vast volumes of documentary and other historical evidence, it clearly was during the Holocene climatic optimum, Minoan warm period, Roman warm period and Medieval warm period – then it means that in all honesty we DO NOT KNOW why we currently have a modern warm period.

    Given this past significant, clearly wholly natural climatic variability, the balance of probability is that our current warm period is due to the same causes that led to such significant variability in the past and NOT due to man’s emissions of GHGs (primarily CO2). Whatever caused our planet to warm significantly in the past did not cause a climatic ‘tipping point’ or ‘runaway’ warming, so why, given this fact, should we be concerned about the possibility of ‘dangerous climate change’ due to (clearly non-existent, based on this evidence) positive feedbacks in our climate system? Answer? We shouldn’t be, as the current modern warm period is much more likely not due to man’s emissions of CO2 but rather due to as yet to be fully understood natural causes, as was clearly the case in the past. Our current warm period is much more likely due to natural climate variability which we have yet to properly research and so better understand, because of the current obsession on the part of some well-funded (so-called) climate scientists with blaming it on CO2.

  5. Very interesting article. I would love to see this method applied to a set of proxies covering the MWP. As already commented, the greater variability does not necessarily imply larger sensitivity. My favorite theory is that a possibly large share of the variability isn’t caused by external forcings at all, but rather internal, especially the natural variability in the ocean currents and its interaction with cloud cover.

  6. Intriguing last paragraph. Sure, larger variance might mean greater sensitivity, but would also mean that the climate mean-reverts from greater variance just as happily as from smaller.

    I rather agree with Jeff that the para is Harry Potter’s Invisibility Cloak, allowing them to get away with science critical of the consensus without having a Fatwa issued against them.

  7. I think some of the comments are partially right, but not completely. Strictly speaking, larger variations would indicate larger sensitivity if the climate variations occurred in phase (or after a short lag) with the variations in the external forcing. For the Late Maunder Minimum, this seems to be the case. For the Medieval Warm Period it is not that clear.
    However, an interesting point that is usually overlooked is that the magnitude of climate sensitivity and the magnitude of natural variations may be related. I say ‘may be’ because it is not at all proven, but it is plausible: consider the example of a ‘soft system’ in which, for instance, clouds or ocean currents are free to vary and may even reinforce each other – because no other processes can ‘rein them in’ back to normal. Such a system will also be more sensitive to external perturbations.
    There exists a theorem in thermodynamics, the so-called fluctuation-dissipation theorem, which would support this view for homogeneous and linear systems. The climate is much more complex, but the basic idea is roughly the same.

    To the origin of the present warm period (‘…due to as yet to be fully understood natural causes…’). This explanation may be alluring, but it is of course not sufficient. The earth system has been gaining energy in the last, say, 50 years at least. Assuming for the moment this is totally natural, one has even then to explain by which mechanisms it has occurred. Was cloud cover less than before? Was the sun stronger? Was there more water in the atmosphere than before? If so, why? Etc. etc. So, if for the moment one forgets about CO2 and focuses just on the other purely scientific issues, it turns out that there are no answers for this warming yet. I mean quantitative answers like ‘the planet received short wave radiation in excess of radiative cooling by xxx w/m2, because this type of cloud cover was reduced by xxx% per year in those and those areas’. It may be that such an explanation arises in the future, but it is not here yet, and I think this is the major drawback of alternative theories to explain the present warming.

  8. Eduardo

    Thank you for joining in on the discussion on this thread. Your presence and opinions are much appreciated.

    “To the origin of the present warm period (‘…due to as yet to be fully understood natural causes…’). This explanation may be alluring, but it is of course not sufficient. The earth system has been gaining energy in the last, say, 50 years at least. Assuming for the moment this is totally natural, one has even then to explain by which mechanisms it has occurred. Was cloud cover less than before? Was the sun stronger? Was there more water in the atmosphere than before? If so, why? Etc. etc. So, if for the moment one forgets about CO2 and focuses just on the other purely scientific issues, it turns out that there are no answers for this warming yet. I mean quantitative answers like ‘the planet received short wave radiation in excess of radiative cooling by xxx w/m2, because this type of cloud cover was reduced by xxx% per year in those and those areas’. It may be that such an explanation arises in the future, but it is not here yet, and I think this is the major drawback of alternative theories to explain the present warming.”

    “The earth system has been gaining energy in the last, say, 50 years at least”

    Why only consider just the last 50 years, Eduardo? Why not the last 150 years, 500 years, 1000 years, 10,000 years? I’m sorry, but I’m afraid that this is the problem. We cannot explain, and will never stand any chance of being able to reliably make predictions/projections of, our future climate until we fully understand and can explain our past climate. I’m prepared to live with not being able to fully understand why we enter and recover from glacial periods, and why the earth experienced the Younger Dryas for example, but as a minimum we must be able to understand and fully account for the changes in earth’s climate over our Holocene period BEFORE we start to spend BILLIONS of dollars attempting to mitigate and adapt to what may be completely the wrong cause of the late 20th century warming period, namely our emissions of GHGs.

    “It may be that such explanation arises in the future but it is not here yet, and I think this is the major drawback of alternative theories to explain the present warming”

    I totally agree with you on this one, BUT why should we have to rely on computer models with/without GHG (primarily CO2) forcings as the main evidence for GHGs as the primary cause of late 20th century warming? There is nothing unusual nor unprecedented about the late 20th century warming period? The rate of warming is not significantly different to that experienced from 1910 to 1940. The peak temperatures we experienced during that warm period (seen mainly in the US, but only because of lack of coverage elsewhere throughout the world) were greater than those we experienced during the late 20th century warming period of 1970 to 2000, even with the 1998 El Nino. Why do you think that GHGs (primarily CO2) are a better explanation of the late 20th century warming than other alternative theories?

  9. #8, Thanks for the comment, I appreciate the nuance of the unknowns and lack of modeled alternative explanations you discuss.

    Most here believe in CO2-based warming and we get endless discussions of how much, feedback from moisture, clouds, etc., so the main issue is certainty of conclusion. Also, there’s that awful conservative streak which doesn’t like the proposed solutions. I hope that the regulars here will keep the post on topic with the paper, and not digress into less interesting discussions of models, money and the politics of climate.

    Bo Christiansen’s method attempts to quantify the variance loss and correct for it, and is partially successful. If I understand the method from the pre-print, it should be more successful when the proxy noise variance is concentrated at higher frequencies (low AR component). I’m not sure I’ve got this point right and need to read the final version of the paper.

  10. Jeff, my company makes CO2 lasers up to 8 kW and we would like to provide a quote for “the red heat laser of doom” if possible. Did I miss the RFQ? :)

  11. “A larger variability in the past suggests a larger sensitivity of the climate system.”

    Larger than what? This kind of statement is problematic IMAO, because it is A. not quantitative, either in saying how much larger or what the baseline is, and B. abuses ceteris paribus to an unreasonable degree.

    A larger variability would imply a larger sensitivity; if you knew what sensitivity was implied by the old estimates of variability, that would mean something, since the relative term “larger” would at least be in reference to something. But we don’t, and that is why we cannot assess this notion quantitatively: the forcings are unknown. However, even though uncertainty is clearly being shown to be present in the reconstructed temperatures, the statement above clearly assumes that there is no possibility that the forcing variability was likewise systematically underestimated. That does not seem reasonable to me.

  12. It is hard to understand a comment that “The earth system has been gaining energy in the last, say 50 years at least” when there is so much residual doubt about global temperature reconstructions, even in that last 50 years. Remember as well, that temperature changes measured in particular locations like an onion skin of the atmosphere, or at the air/solid interface of Earth, do not of themselves indicate a gain or loss of energy. If they are real, they might simply indicate a change in the distribution of global energy. The change can be vertical (we know too little about deeper ocean temperatures) or lateral (we know too little about the polar regions). Alternatively, the balance of IR incoming and outgoing near the tropopause has too short a history of measurement.

    It is easy to construct an instrumental “adjusted” temperature time series that satisfies the most common objections to the present main series. Simply add 0.2 degrees C to all temperatures before 1950 (since many were artificially reduced by about that figure); then, since 1950, reduce the trend to the present incrementally to hit year 2010 at an anomaly value of 0.3 deg C (since this offsets the worst of the much-discussed effects of TOBS, missing data, extrapolations over 1200 km and so on). It is possible that this “skeptics preferred” graph is closer to reality – good arguments exist.

    If this graph, adjusted as many “deniers” would like it, is then used to recalibrate proxies, a vastly different scene could be expected to emerge when the main accepted proxies are recalculated.

    On the matter of whether sensitivity or variance explains the greater scatter as we go back in time, surely variance is dominant because the number of observations decreases and their X-axis calibration against time becomes more uncertain. But in much of this error work, there still seems to be confusion between “precision” and “accuracy”, with less attention paid to the latter, sometimes named “bias”.

    Finally, this deep analysis is probably contained essentially within the envelope of noise. Is it worth the effort? (I don’t mean the effort to artificially reduce the noise, I mean the effort to separate a valid signal from the noise.)

  13. Ken,

    ‘Why only consider just the last 50 years Eduardo? Why not the last 150 years, 500 years, 1000 years, 10,000 years’

    I totally agree. I was just focusing on the last 50 years because it is somehow the contended period. But you are right: there is so far no quantitative explanation of the temperature differences between glacials and interglacials, for instance. Likewise the Medieval Warm Period. Not all climate science is just the present climate and the next 100 years. Actually, the most interesting climate science is not this period, in my view.

    ‘There is nothing unusual nor unprecedented about the late 20th century warming period?’

    Let us agree for the moment that humans have not had any role in the last 50 years. What I was trying to explain is that, even in this case, we should be able to really explain the warming (or the cooling, or whatever you think happened), even if it was not unusual. To say that it is natural or unprecedented is not enough; we should be able to explain mechanistically what is happening. That’s my point: to invoke natural oscillations is a sort of hand-waving. In the end, the explanation has to be in terms of energy-in, energy-out, and what happened in between.

    ‘Why do you think that GHGs (primarily CO2) are a better explanation of the late 20th century warming than other alternative theories’.

    The problem is that I don’t see these alternative theories standing now, in the hard, quantitative sense that I explained above. What I am arguing is that we have to require the same standards of those alternative theories as we are requiring of AGW. If we are presently unconvinced by AGW, we should also be unconvinced by all other theories.

  14. #14: “If we are presently unconvinced by AGW, we should also be unconvinced by all other theories.”

    Here’s why I (a climate layman, but trained in mathematics, logic and statistics) think mainstream AGW theory is unconvincing: the fact that the Arctic warmed just as fast, and almost to the same level as now (in some of the best documented sites, e.g. on Greenland, even above the current level), 70-80 years ago makes AGW a very weak explanation for the current Arctic warming, which is marketed as the evidence of AGW. In addition, with the hockey stick pretty much broken, we are left with the accounts of my Norse ancestors that Greenland was in fact warmer a thousand years ago – and that makes the current Arctic warming look even more likely to happen without humans interfering with the CO2 content of the atmosphere.

    In fact I think this pretty much falsifies the standard AGW-plus-strong-positive-feedbacks theory. This still leaves the part of AGW theory that actually satisfies your requirements for a stringent, mechanistically explaining theory – the warming due to CO2 itself. But an AGW theory without positive feedbacks is not very scary; in fact it then remains an open question whether increased (doubled, even quadrupled) CO2 in the atmosphere could be mostly beneficial (for plant growth, for postponing the next little ice age or even glacial).

    I can only speak for myself, but I think many other posters to tAV think in a similar way: Yes, there is some warming due to CO2 emissions, but NO, there is no compelling evidence that this warming is harmful. Quite the contrary: There is plenty of historical evidence that cooling a degree C or even only a half below the current temperature level is really harmful. As much as I love the sight of the glaciers of my home country, I can’t help thinking that they are really looming natural disasters…

  15. I agree that saying “natural variance” is a hand-wave, but I bet models can be tuned to demonstrate that the necessary energy transfer can occur between ocean and air – whether they are realistic or not is another question. When we start to consider UHI, which I believe is a real and substantial effect, it may be an uncomfortably easy explanation for some. None of this negates the radiative effects of CO2, but if UHI is even 20% of the temp signal, it opens a lot of room for consideration.

    #14, Have you ever experimented with ‘calibrating’ to a lower trend signal and looking at the variance loss from that? It does seem to have an effect on the historic signal loss.

  16. For me the interesting thing about greater long term climate variation (that is, over 100+ years) is that it pretty much forces you to draw one of three conclusions:

    1) The climate models have the sensitivity about right, and there are much larger scale changes in historical forcing than is currently believed, or

    2) The expected size of changes in historical forcing is about right, and the climate models have the sensitivity far too low, or

    3) The climate models do not accurately capture the behavior of the climate system at relatively long times scales (and maybe not at short times scales either!).

    Since (I believe) no climate model shows the kind of historical variation (MWP roughly equal to today, LIA period ~1.5C colder) that is consistent with a range of historical records, and since very high climate sensitivity would seem to be precluded by the measured temperature response to increasing GHG forcing over the last 100 years, either there are larger historical variations in forcing than currently understood, or the climate models simply do not accurately capture the behavior of the system.

    Considering how poorly climate models capture the (measured) rapid temperature rise from 1920 to 1945, it seems to me the most probable explanation is that the models are just a very long way from an accurate representation of the Earth’s climate.

  17. Eduardo

    “If we are presently unconvinced by AGW, we should also be unconvinced by all other theories.”

    Same here! I am personally unconvinced, and IMO if I want to call myself a scientist (which I do) I should never be ‘convinced’ about anything. Rather, I should continually seek to refute the current consensus on any scientific matter (as I do, whether it be climate change, the origin of the universe, of life on our planet etc).

    However, as I mentioned before, it is not unreasonable for scientists to weigh up and make decisions, based on the balance of probabilities, as to whether or not the advice they are providing to policy makers is scientifically sound (robust), and whether or not, given the balance of probabilities (risks and consequences), their advice can be used to justify unprecedented expenditure on measures to mitigate such low probability hazards that have potentially high consequences. I’ve been involved as a safety and risk management consultant for many years and have applied the ALARP principle (http://en.wikipedia.org/wiki/ALARP) many times in the energy and transport sectors.

    IMO and in my experience, if we were to apply the ALARP principle to the whole current AGW issue, the outcome would be that there is no justification for spending trillions of dollars to mitigate a situation in which we currently know neither the likelihood of the hazard event (dangerous climate change) nor its likely consequences (number of deaths and serious injuries avoided).

  18. #17 Steve Fitzpatrick

    For me the interesting thing about greater long term climate variation (that is, over 100+ years) is that it pretty much forces you to draw one of three conclusions:

    1) The climate models have the sensitivity about right, and there are much larger scale changes in historical forcing than is currently believed, or

    2) The expected size of changes in historical forcing is about right, and the climate models have the sensitivity far too low, or

    3) The climate models do not accurately capture the behavior of the climate system at relatively long time scales (and maybe not at short time scales either!).

    Shouldn’t one also consider the possibility that the climate responds differently to different forcings? That the amplification mechanisms are poorly understood?

  19. However, an interesting point that is usually overlooked is that the magnitude of climate sensitivity and the magnitude of natural variations may be related.

    I think that what Zorita and Christiansen say here relates to the retort Steve M reports often receiving about the reconstructions being “wrong” and the consequences thereof. His reply is that this reasoning makes it even more critical to determine the validity of the reconstructions, and particularly whether large past variances were missed. To understand that lament, I think one has to assume the “problem” lies with the GHG warming being greater than predicted at the currently accepted sensitivity, even without any added positive feedback effects. On top of the GHG warming, you also have to impose the greater warming (or cooling) of the “natural” variations of climate.

    If under these circumstances one sees warming as a totally detrimental development for the earth as a whole, such as projected/insinuated by the IPCC and others, then one would project more alarm.

    On the other hand, if natural variations in the recent past, relative to human existence at near the civilized form of today, were actually larger, that to me would indicate that the human experience has handled/adapted to those variations. In fact, a rather lengthy naturally occurring cold period in the future added to a GHG warming tendency could, in effect, moderate the climate to something near current conditions.

    Also, cold periods in the past have been associated with detrimental outcomes for humans, and a warming tendency that avoids those cold dips could be a good thing.

    Finally, if the reconstructions have been wrong, and thus our climate models, the state of our understanding of climate comes into question, which says to me to go slow with projections of future climate onto the human condition.

  20. I don’t see that this addresses the fundamental problem of proxy selection (or weighting) based on a modern calibration period – that it selects for a combination of signal and noise matching the observed modern increase in temperatures, while (necessarily) leaving signal and noise uncorrelated in the pre-calibration past.

    Still, it’s nice to see any attempts to improve reconstruction techniques.
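    The screening problem described in #20 can be demonstrated with a toy Monte Carlo experiment. This is only a sketch under assumed, illustrative parameters (AR(1) red noise with lag-one correlation 0.7, a linearly rising “instrumental” target, and an arbitrary correlation threshold of 0.3 — none of these values come from any particular reconstruction):

```python
import numpy as np

rng = np.random.default_rng(0)
n_series, n_years, cal = 1000, 600, 100   # last 100 "years" = calibration window

# Pure AR(1) red noise "proxies" -- no temperature signal at all
phi = 0.7
shocks = rng.standard_normal((n_series, n_years))
proxies = np.zeros_like(shocks)
for t in range(1, n_years):
    proxies[:, t] = phi * proxies[:, t - 1] + shocks[:, t]

# A rising "instrumental" target over the calibration window
target = np.linspace(0.0, 1.0, cal)

# Screen: keep only the proxies that happen to correlate with the modern trend
r = np.array([np.corrcoef(p[-cal:], target)[0, 1] for p in proxies])
selected = proxies[r > 0.3]

recon = selected.mean(axis=0)
# Averaging cancels the uncorrelated noise before the window (flat shaft) but
# keeps the selected-for rise inside it (blade): a hockey stick made of noise
print(len(selected), recon[:-cal].std(), recon[-cal:].max())
```

    Signal and noise are matched inside the calibration window by construction of the screening step, and necessarily uncorrelated outside it, which is exactly the point of #20.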

  21. Dr. Z.

    “To say that it is natural or unprecedented is not enough. we should be able to explain mechanistically what is happening. Thats my point: to invoke natural oscillations is a sort of hand-waving. In the end , the explanation has to be in terms of energy-in-energy-out-and-what-happened in-between.”

    Agreed.

  22. #19,
    “Shouldn’t one also consider the possibility that the climate responds differently to different forcings? That the amplification mechanisms are poorly understood?”

    Well, sure, but I think that is covered by “3) The climate models do not accurately capture the behavior of the climate system at relatively long time scales (and maybe not at short time scales either!)”; the key being “do not accurately capture”… for whatever reason.

  23. Dr. Zorita, your 2 statements seem odd to me… maybe someone can explain it to me.

    To say that it is natural or unprecedented is not enough. we should be able to explain mechanistically what is happening. Thats my point: to invoke natural oscillations is a sort of hand-waving. In the end , the explanation has to be in terms of energy-in-energy-out-and-what-happened in-between.

    So then to invoke AGW is also a sort of hand-waving. If the current explanation that blames CO2 were incorrect regarding “energy-in-energy-out-and-what-happened in-between”, then why are so many scientists still convinced that human-produced CO2 is the sole reason (feedbacks included) for the warming? No one has mechanistically replicated human-caused CO2 warming AND natural warming in the same model. If you can’t tell me how it warmed naturally in the past, how does anyone know you can correctly say why it is warming unnaturally now?

    My point is, if it is known that natural oscillations caused the past warming and cooling, isn’t it up to science to show exactly why it was warmer or cooler in the past naturally, and then show how that is not the case now, BEFORE blaming man for current warming? Isn’t that why the hockey stick was so important, so that such a discussion and hard scientific work could be skipped?

    The problem is that I dont see these alternative theories standing now, in the hard, quantitative sense that I explained above. What I am arguing is that we have to require the same standards for those alternative theories as we are requiring for AGW. If we are presently unconvinced by AGW, we should also be unconvinced by all other theories.

    It seems interesting to me that if you question the assumption under which CO2 was hunted down as the cause of the unprecedented warming, why is it that other theories must disprove CO2 to be valid in today’s climate when no one has shown why it was different in the past? CO2 is known NOT to have caused the past climate oscillations that were greater than what we experience now. How is it that CO2 needs to be disproved against natural variation, instead of first showing that all the past causes of warming are not operating now and THEN showing that CO2 is the cause? The case for CO2 now has a step missing from its proof: the DIS-proof of natural warming in today’s climate. I don’t see how anyone can believe in AGW when this proof is missing.

    You say you don’t see alternate theories standing in the hard, quantitative sense that you explained above. You are saying that the “alternate theories” (I would assume even those that have yet to be published) would not stand up to scrutiny, even though we know there MUST be something else in nature other than CO2 that has caused past climate oscillations. CO2 has not itself passed that quantitative test. If you want to say CO2 is the cause of present warming, AGW scientists must show how today is different from the past. They have not done so. Nature has proven that it can vary the climate in ways greater than what we experience today. It is up to AGW to prove otherwise.

    If I know something has happened 10 times in the past for natural reasons, I am not going to immediately believe anyone who assumes that man caused the latest example until they prove that it was not caused by one of the 10 previous reasons. There is no “IF” about my not being convinced; no contrary proof has been offered. If it were the first time, sure, any marginal proof would be good. But we know now that it has happened in the past, so before blame can be placed on man, we must be shown proof of why the 11th time was different from the 10 others. This has not been done.

  24. GS;
    your distinction between precision and accuracy is key. In any quantitative procedure, you only get the number of significant digits allowed by the LEAST precise input.
    E.g.: if I have 2 equal isothermal volumes of identical material, and measure one with imprecise instruments to have a temperature of 19°C, and the other with very accurate instruments to have a temperature of 11.321548°C, and then try to predict the net temperature resulting from mixing them, the best I can do is project 15°C for the new measurement. 2 significant digits is all she wrote for that operation.

  25. P.S. For the above, if you’re bothered by expansion and other subtleties of gas temperature averaging, substitute “mass” for temperature and “gm.” for °C.
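    The rounding rule in #24 can be sketched in a few lines (the helper name and the two-digit example are just illustrative):

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to `sig` significant digits."""
    if x == 0:
        return 0.0
    return round(x, sig - int(floor(log10(abs(x)))) - 1)

# Mix two equal volumes: one measured coarsely (19 C, 2 sig figs),
# one measured precisely (11.321548 C, 8 sig figs).
t_coarse, t_fine = 19.0, 11.321548
mixed = (t_coarse + t_fine) / 2     # 15.160774 on paper...
reported = round_sig(mixed, 2)      # ...but only 2 digits are defensible
print(reported)  # 15.0
```

    The arithmetic mean carries eight digits internally, but the least precise measurement caps what can honestly be reported.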

  26. Boris,

    I do hope you joined in this discussion to make a contribution and not just to troll or try to distract those contributing to it.

    Yes, the work of Bo Christiansen and its implications is very interesting. In its simplest terms it shows that the proxy reconstructions reduce (obfuscate?) the amount of variability in the data, and as a consequence may (most likely do, IMO) give a false impression of past multi-centennial timescale climate variability. This therefore directly impacts their contribution to the question of whether or not global temperatures (particularly pre-industrial) have been significantly greater (e.g. MWP) than and less than (e.g. LIA) they are today, i.e. whether our planet has experienced (non-GHG caused) significant climate change in the recent (millennial to multi-millennial timescale) past. What is your opinion on that?

    Stilgar – #24 is a great summary of the current situation.

  27. @ 13
    Geoff,

    The energy imbalance is not recorded so much in the surface temperature as in the ocean water temperature and/or sea-level. Now, we can discuss the reliability of sea-level measurements, but I think it is established that sea-level is rising. Satellites and tide-gauges both indicate it. I don’t have any other explanation for sea-level rise than that water temperatures are increasing and/or land ice is melting. Both processes require a net energy intake by the Earth. Now we could argue that sea-level has been rising since before the increase of CO2, but I am not discussing this. My point is that a theory of climate, any theory, should be able to explain for instance why sea-level is rising. An argument like ‘it has been rising for a long time’ is not enough. If one thinks that CO2 is not responsible, one has to find another factor.

    ‘Finally, this deep analysis is probably contained essentially within the envelope of noise. Is it worth the effort? (I don’t mean the effort to artificially reduce the noise, I mean the effort to separate a valid signal from the noise.)’.
    Sorry, I miss your point here. Are you arguing that we cannot know past climate, or that the variations are very large or very small?

    I think that it is worth the effort because any kind of independent information will have added value, even if it is noisy. In the end it may turn out to be not that useful, but who knows. The goal of Christiansen’s studies and others’ is to try to estimate the error variances, and the biases as well, with controlled data sets.

  28. I have a fair amount of skepticism that the proxies have any detectable temperature signal at all. However, if working methods were developed for extracting ‘favorite’ proxies without distorting the historic record AND those methods produced very similar results (not just similar looking) in the historic record (from independent datasets), we might actually learn what went on in history.

    If the effort is not made to verify that independent proxy datasets produce the same historic results – not just hockey sticks – we will never know if they have any relationship to temperature whatsoever. Today, it’s just a combination of red noise asserted to be temperature, with a hand wave that is no better than saying all temperature change is natural. I do have hope that reason will prevail down the road though, and someone will figure out that the red noise is a greater influence on mannian hockey sticks than the temperature itself.

  29. @ 24

    Stilgar,

    I think this is drifting away from the original post, but nevertheless a couple of short comments.

    ‘You are saying that the “alternate theories” (I would assume even those that have yet to be published) would not stand to scrutiny even though we know there MUST be something else in nature other than CO2 that has caused past climate occilations.’

    No, I didn’t say this. I cannot say if a future theory will or will not stand scrutiny. I haven’t even said here that CO2 is the correct theory. I just said that it seems to me that the standards by which we *all* judge a theory are not homogeneous. People that believe in AGW tend not to pay attention to the data that don’t fit (say, increasing sea-ice cover in the Southern Hemisphere), but likewise people that don’t believe in AGW tend to accept other hypotheses very uncritically, sometimes as if they were magic.

    ‘My point is, if it is known that natural oscillations caused the past warming and cooling, isn’t it up to science to show exactly why it was warmer or cooler in the past natually, then showing how that is not the case now’

    Yes, this is actually what climate science tries to do. One has however to consider some caveats. The most important is that science never proves a theory. All theories can turn out to be wrong in the end, and so all are provisional. This happened even to Newton’s theory of gravitation. So to show ‘exactly’ is an ideal that is never reached, not only in climate science. The situation now is rather that among the known processes that appear to influence past climate (GHG, orbital variations, solar variations, etc.) only GHG appear now to be able to explain the recent warming. I have tried to word this sentence carefully. Should another theory appear that explains the observations better or explains more observations, we should adopt that theory. My point is that this theory should be at least as quantitative and as explanatory as GHG, and be scrutinized by the same critical standards.

    In the meantime… from here onwards, science stops and policy starts. How large is the certainty, the risks and the costs… this is another matter.

  30. #32: The situation now is rather that among the known processes that appear to influence past climate (GHG, orbital variations, solar variations, etc) only GHG appear now to be able to explain the current recent warming.

    But again: it doesn’t explain the 1930s–40s warming, especially in the Arctic. It doesn’t matter that we have no better hypothesis for the current warming when that hypothesis can be so easily falsified just by considering a few decades more than the most recent 60 years of data.

  31. 31 Jeff. Agreed. My thoughts are drifting towards doubts about any proxies being useful. Part of it stems from an uneasy feeling that the errors of measurement at all stages are much, much greater than commonly expressed. It’s like the model ensemble case, where the error is assessed across a number of submitted comparisons. I’ve long argued that the errors should be calculated by including all of the model runs from all modellers in the round robin, apart from those runs that were rejected for obvious reasons like transcription errors. So, for dendroclimatology, I strongly disagree with the notion that there are special trees that do contain useful proxy data. I’m not alone, I’m sure. So why not lump all tree measurements into a big blob and then calculate the variance in temp and time estimates, to see if the method has legs?

    I’d be delighted to find a reliable proxy because it would reduce the amount of argument and needless work we have at present.

    30 Eduardo. It follows that I’m saying that even the “best” proxies we have are too prone to high measurement variance. We should be looking for new proxy ideas more than trying to refine old data. If we have run out of new proxy ideas, then someone has to make the decision to stop proxy work. I was brought up on finding new mineral deposits. If, after a certain amount of work, it became apparent that an anomaly was not going to become an economic ore deposit, we moved on as quickly as possible. There seems to be little concept in climate science of pre-planned review points, pass or fail, when dead ideas can be buried.

    You get your best return from putting your scarce resources towards the most promising prospects that you have.(Bar Lady Luck sticking her nose in now and then).

  32. Three times in the last century and a half the rate of temperature rise has been the same as it was in the last quarter of last century, and only the last of these was associated with a rise in CO2. This, to me, is good, but certainly not perfect, evidence that CO2 is not the cause of the recent temperature rise. We can also correlate those three temperature rises with oceanic oscillations and other natural cycles.

    How do I know this? Why Phil Jones told me so.
    ==============

  33. 33,34,35

    I think all those are healthy critical questions. Separating the scientific and policy questions allows a better discussion. I don’t have complete answers to all, but I do have some comments.

    ‘we have no better hypothesis for the current warming when that hypothesis can be so easily falsified just by considering a few decades more than the most recent 60 years of data.’

    The origin of the warming around 1940 is indeed not clear, but by itself it doesn’t disprove GHG as the origin of the recent warming. Both might perfectly well have different originating factors. All will be clearer when both warming periods can be explained. The 1940 warming has a different spatial structure, more concentrated at high latitudes as Espen wrote, whereas the recent warming, if we believe the several data sets, has a more global character. So, they could well have different origins. Many periods in the past have been warmer than today, the Mid-Holocene Optimum quite likely, but this doesn’t disprove either that GHG are contributing to the warming today. For the Holocene optimum, about 6000 years ago, it seems more clear that the orbital configuration was the major player.

    ‘My thoughts are drifting towards doubts about any proxies being useful.’
    There are proxies that contain a clear signal and others that are more contaminated by other influences. Think for instance of the tree line – altitude or latitude. There are today fossil tree trunks that indicate where the tree line was in the past. This seems a quite clear signal, although perhaps the time resolution is not that good. Historical evidence indicating periods of freezing lakes, or the absence thereof, is also quite clear. So there are indeed good proxies.

    ‘If we have run out of new proxy ideas, then someone has to make the decision to stop proxy work.’
    Sure, there are bad proxies, no doubt. But to dismiss a whole branch of research just like that seems to me premature. There are new proxies that may yield interesting information; I am thinking of isotopic composition, for instance. Research in proxies is not at all that expensive, compared to satellite missions or computing centers. Climate research as a whole also gets a small amount of funding compared to other sciences; look for instance at the funds allocation in the next European Framework programme http://cordis.europa.eu/fp7/budget_en.html

  34. Eduardo,

    Thanks for your contributions; yours is a voice of reason. The work of Christiansen et al addresses one of the weaknesses of the methodologies used by Mann et al (and many others), but does not address what seems to me to be the most fundamental weakness of these methods.

    It seems to me that the whole process of proxy selection based on concordance with the instrument temperature record is fraught with problems, as Jeff Id, Steve McIntyre and others have pointed out many times. Searching for correlation between the brief instrument temperature record and a multitude of noisy (both red and white) proxy series, and then selecting/weighting/calibrating of proxy series based on correlation with the instrument record, is very likely to generate terribly distorted results.

    The selection and calibration of proxy series ought to be based on physical/chemical/mechanistic arguments, not based on concordance with the instrument temperature record. The argument ought to be something like “the ratio of O16 to O18 in sea shells varies with temperature at a specified rate due to a specific physical process, which has been verified in the laboratory”, or “the ratio of deuterium to hydrogen in rainfall is known to vary by a certain amount per degree change in temperature due to differences in vapor pressure of DHO versus H2O”. If tree rings vary in width based on average temperature, then it should be possible to quantify this for each species of tree used, by collecting data from multiple sites over a period of ~10 years, measuring how much the ring width varies under known (measured) temperatures, and at the same time testing for sensitivity to a host of other potentially confounding factors, such as moisture, CO2 level, average light intensity, etc. I find it truly unbelievable that climate science can spend billions of dollars (not to mention Euros, Pounds, and Yen) on GCM simulations, but fails to demand (and generously fund) this kind of robust proxy selection and calibration.

    A reconstruction calculated from proxies which are selected and “calibrated” based on sound physical rationale and data is inherently more robust, and can’t be corrupted by random concordance with the brief instrument temperature record. Indeed, concordance of such a reconstruction with the entire instrument record then becomes a robust validation, rather than a source of error. If a set of proxies were rationally selected and calibrated, and then simply averaged to reduce noise and generate a reconstruction, and if that reconstruction were concordant with the instrument record, then this would be a reconstruction I would be very strongly inclined to believe.

    As things are presently done in paleoclimatology, there is no reason at all to believe the reconstructions, because the entire methodology seems near certain to produce spurious results.
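    A physically calibrated proxy, as described in #34, reduces reconstruction to unit conversion plus averaging. A minimal sketch, with a purely hypothetical linear sensitivity and intercept (the numbers are illustrative, not published laboratory values):

```python
import numpy as np

# Hypothetical lab calibration: proxy changes by `slope` units per deg C
# around a baseline `intercept` (both numbers illustrative, not real values)
slope, intercept = -0.22, 3.0

def proxy_to_temp(proxy_values):
    """Invert the (assumed linear) lab calibration to recover temperature."""
    return (np.asarray(proxy_values) - intercept) / slope

# Several independently calibrated proxy sites are then simply averaged;
# no tuning against the instrumental record is involved
site_a = proxy_to_temp([0.80, 0.36, -0.30])
site_b = proxy_to_temp([1.02, 0.14, -0.52])
recon = (site_a + site_b) / 2
```

    Because the calibration comes from the laboratory rather than from correlation with the instrument record, agreement with that record afterwards is a genuine validation, as argued in #34.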

  35. Kim

    #35 “Three times in the last century and a half the rate of temperature rise has been the same as it was in the last quarter of last century, and only the last of these was associated with a rise in CO2. This, to me, is good, but certainly not perfect, evidence that CO2 is not the cause of the recent temperature rise. We can also correlate those three temperature rises with oceanic oscillations and other natural cycles.

    How do I know this? Why Phil Jones told me so.”

    Indeed the ‘good Dr Phil’ told this to Roger Harrabin during a Q&A email exchange earlier this year following Climategate (http://news.bbc.co.uk/1/hi/8511670.stm)

    The very same ‘good Dr Phil’ back in March 2009 also advised the lead authors and reviewers of IPCC AR4 (including Susan Solomon and Kevin Trenberth) that it would be a very bad idea to try to include a reference to a paper published by Chylek and Lohmann which attempts to find a GHG warming signal (with due allowance made for the effect of the NAO) in Greenland. The ‘good Dr Phil’ proceeded to rubbish this claim, and in fact sent the ‘team’ a series of charts showing that it was somewhat warmer in SW Greenland during the 30s/40s (particularly during the winter months) than it is today. So according to the ‘good Dr Phil’ we shouldn’t be worrying about sea level rises due to melting Greenland ice sheets, as they didn’t seem to cause us much of a problem back then. In one of his emails he even attached aerial photos of a Greenland glacier that was retreating during the 20s/30s but has since stopped retreating. So much for ‘unprecedented’ (not according to Phil Jones) Arctic warming during the late 20th century, eh Kim – even the ‘good Dr Phil’ knows that the 2007 sea ice extent low was not unusual when one actually takes the trouble to go and look for evidence of Arctic warming prior to 1979!

    Now back to the pesky 20th century warming and cooling periods. Eduardo perhaps you could have a quick look at the following link.

    http://diggingintheclay.wordpress.com/2010/01/18/mapping-global-warming/

    and in particular at Figs 7, 8, 9 and 10. Note the legend for these maps given in Fig 1. All 4 maps show the warming/cooling trends in degC/century for individual WMO stations (with duplicate data for the same station appropriately combined into a single series for each WMO station ID and ‘imod’ combination). In particular, please contrast Fig. 8 (1910 to 1940 warming period) with Fig 10 (1970 to 2010 warming period), and be sure to make due allowance for the relative lack of global coverage in the 1910 to 1940 period relative to the 1970 to 2010 period. Can you see any significant differences, particularly in the Northern Hemisphere, between these two (IMO remarkably similar) warming periods? Note these are ‘raw’ data warming/cooling trend maps for the GISS land surface temperature dataset (basically GHCN with USHCN v2 and SCAR) and there has been no ‘anomalisation’ or ‘gridding’ of this data (sorry Zeke H, Nick S, Ron B, Mosh and co.), just straightforward linear trend fitting to the individual station ‘raw’ temperature data. For the 1910 to 1940 map notice the ‘dark red’ dots for SW Greenland, Iceland, Northern Scandinavia and Russia (the ‘good Dr Phil’ wasn’t telling porky pies to his IPCC compatriots).

    Now scroll further down that article on ‘Digging in the Clay’ (http://diggingintheclay.wordpress.com/) and look at Figs 10 and 11 (which should be labelled Figs 11 and 12).

    To quote from that thread/article

    “Now Figure 10 is clearly dramatic! Just look at all those dark red dots (> 5 deg.C/century warming trend) in the Northern US and all of the Canadian stations, and similarly most of the former USSR i.e. Russia, Kazakstan etc, Mongolia and North Eastern China stations! Despite this clearly alarming warming trend, there is at the same time an observed cooling trend in much of central China and, puzzlingly, in much of the Balkans, Greece and Turkey? Looking at the Southern Hemisphere for this DJF seasonal period i.e. summer period in the SH, Western Australia appears to refuse to be warmed!

    It is then even more puzzling to contrast Figure 10 with Figure 11 which shows the JJA seasonal period trends. Much of the alarming warming trend evident in the DJF trends for the Northern US and Canadian stations has vanished and in fact has been replaced for some of the Central US stations with a cooling trend. Similarly many of stations in central China as in Western Australia refuse to show any warming trend! It also looks as though if you want to get a good tan, then ‘the Med’ whether it be the South of France, Italy, Greece or Turkey is a good place to head for as temperatures are clearly rocketing up there in the summer. Don’t forget to pack your sun protection cream though!

    Seriously though, these seasonal trend maps would appear to indicate that much of the claimed global warming is hardly global at all. In fact it looks to be more accurately Northern Hemisphere warming, and for that matter primarily Northern Hemisphere WINTER warming! Clearly CO2 is choosey! It’s happy to only take full effect during the winter time in the Northern Hemisphere, and even when it does it is also happy at the same time to allow some exceptions. It looks very much like Western Australians need to apply to Kevin Rudd for a carbon tax rebate.”

    Perhaps some Australians (not just Western Australians) saw these maps and spread the word back to their friends at home, as subsequently the Australian ETS scheme was ‘postponed’ and my poor namesake was told to take an early bath. Sorry Kev!

    So Eduardo, based on this evidence, can you please explain why we should consider the late 20th century warming trend (from 1970 to 2010) to be in any way exceptional? Why should we have to invoke GHGs to explain this warming period? I won’t even ‘go there’ as regards UHI and the effects it may or may not have on the late 20th century warming trend, as that would be a completely different thread.
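    For reference, the “straightforward linear trend fitting” described in the comment above amounts to an OLS slope rescaled to degC/century. A sketch with synthetic station data (the station values and noise level are made up for illustration):

```python
import numpy as np

def trend_deg_per_century(years, annual_means):
    """OLS slope of annual mean temperature, expressed in deg C per century."""
    slope, _intercept = np.polyfit(years, annual_means, 1)
    return slope * 100.0

# Synthetic station: 0.02 deg C/yr underlying trend plus weather noise
rng = np.random.default_rng(42)
years = np.arange(1910, 1941)
temps = 5.0 + 0.02 * (years - 1910) + 0.2 * rng.standard_normal(years.size)
print(trend_deg_per_century(years, temps))  # roughly 2 degC/century
```

    Each dot on the linked maps is, in effect, one such fitted slope per station series, with no gridding or anomaly step in between.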

  36. A preprint of the new paper (submitted to J. Clim., second revision) is here

    The Christiansen et al. 2009 paper is here

    or here

    Rutherford et al. wrote a comment which is here

    and our reply is here

    I agree with the comments pointing out that the last paragraph in my posting is incomplete: There is also the possibility that the forcings have been underestimated. This is well described in the comments by Zorita.

    In relation to the question whether the recent warming is natural or man-made, it is important to realize that the radiative effect of CO2 is well understood. In fact the warming was PREDICTED by Arrhenius more than 100 years ago.
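    That well-understood radiative effect is often summarised by the standard simplified forcing expression ΔF = 5.35 ln(C/C0) W/m², due to Myhre et al. (1998); a quick sketch (the pre-industrial baseline of 278 ppm is the commonly used value):

```python
from math import log

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998): 5.35 ln(C/C0) W/m^2."""
    return 5.35 * log(c_ppm / c0_ppm)

# A doubling of CO2 gives ~3.7 W/m^2, whatever the starting concentration
print(co2_forcing(2 * 278.0))
```

    The logarithmic form is why each successive doubling adds the same forcing; what remains disputed is not this number but the feedbacks that convert forcing into temperature.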

    In relation to the question whether the recent warming is natural or man-made it is important to realize that the radiative effect of CO2 is well understood

    A common theme here.

  38. Unfortunately the links didn’t work. I did read the pre-print version before and very much appreciate the work you have put in to this subject.

  39. Bo

    “In relation to the question whether the recent warming is natural or man-made it is important to realize that the radiative effect of CO2 is well understood. In fact the warming has been PREDICTED by Arrhenius more than 100 years ago.”

    Please be aware that many of us ‘climate skeptics’ on this blog and others like CA and WUWT etc. are well aware of this fact. It often gets very tiresome when the statement you’ve made is constantly re-iterated by warmists as a way to lump us in with people who don’t believe the laws of radiative physics. People like myself are not ‘denying’ that CO2 warms the planet; we are just very skeptical as to whether or not the IPCC’s pronouncements of ‘very likely’ are justifiable. We are prepared to agree to ‘makes a contribution to’, but very much doubt that the late 20th century warming can only be explained by invoking man-made GHG emissions.

  40. Sorry if I have been crude, but I have not followed this blog very closely. I find that KevinUK’s position as described in #42 makes sense.

    One consequence of Arrhenius’ work is that the present period is different from previous periods. Even if there have been previous periods in the recent past, with conditions more or less comparable to the present, that were as warm as or warmer than today, this is not a good argument against the warming of the recent century being driven by CO2.

    In a way this is related to how you choose the null hypothesis for testing whether the recent warming is unusual; Arrhenius’ work suggests that we should be satisfied with less than the usual significance level.

  41. #36

    Thank you dr. Zorita for replying to my message. You write: The 1940 warming has a different spatial structure, more concentrated on high-latitudes as Espen wrote, whereas the recent warming if we believe the several data sets, has a more global character.

    I agree that the periods appear to be different – exactly how different depends on how much you trust the data quality. Also, the current period started at a higher level (ocean heat content was probably higher?). But the main problem with your argument here is that what you say implies that the “Arctic amplification” appears to have been weaker now than it was back then. Yet the current, weaker Arctic amplification is touted as confirmation of AGW theory!

    (please note that I also don’t deny the basic physics of CO2 and IR)

  42. Bo,

    “One consequence of Arrhenius work is that the present period is different from previous periods. Even if there have been previous periods (in the recent past with conditions more or less comparable to the present) with temperatures as warm as or warmer than the present period this is not a good argument for the warming of the recent century not being driven by CO2.”

    This whole AGW issue isn’t just about the level of temperature; it’s just as importantly about the rate of warming. At the risk of diverting onto policy, the rate of warming is most important because we are currently being told to ‘act now’ in order to at least mitigate, if not avoid, dangerous/catastrophic climate change (by constraining future GHG emissions so that projected future temperature increases are kept to < 2 degC). One of the key pieces of evidence that underpins this imperative to 'act now' is the GCM projections of future climate change with and without GHGs included within the model projections.

    According to the IPCC there is no other way to explain the late 20th century warming period without invoking GHG (primarily CO2) forcing. This is nonsense, as the modellers have yet to even attempt to look at alternative parameterisations (e.g. atmosphere-to-ocean interaction models, cloud models etc.) within their GCMs which, if tuned as carefully as the CO2 forcing is, would IMO just as easily show an alternative explanation for the late 20th century warming as well as the earlier 1910 to 1940s warming and the 1940 to 1970 cooling periods.

    You say "One consequence of Arrhenius work is that the present period is different from previous periods". I'm intrigued: what aspects of Svante Arrhenius's work cause you to think that the present period is different from previous periods?

  43. #43

    I don’t think you have been crude at all. On the contrary one needs to engage and discuss.

    One consequence of Arrhenius work is that the present period is different from previous periods.

    How can this be an established fact if there is uncertainty with LF climate variance?

    Even if there have been previous periods (in the recent past with conditions more or less comparable to the present) with temperatures as warm as or warmer than the present period this is not a good argument for the warming of the recent century not being driven by CO2.

    If you are talking about sensitivity, I don’t see how warmer historical climates are supportive of sensitivity to CO2 and CAGW theory. I think it would be more accurate to say that one can’t rule out CAGW. Where is the scientific foundation showing that climate response must be the same for (e.g.) solar forcing as it is for CO2?

  44. #45 and #46

    The current period is different because we have a good idea about the forcing. If we did not have that, we would need the current period to be warmer than, say, 95 % of the previous periods (of the same length) before we started to worry. But now we have the additional information about the forcing – that changes the statistics.
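    Bo’s point can be made concrete with a toy significance test: whether the current period counts as ‘unusual’ depends on the threshold chosen, and independent forcing information is an argument for relaxing it. All numbers below are purely hypothetical, chosen only to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean-temperature anomalies for 200 reconstructed "previous
# periods" of equal length (made-up numbers, purely for illustration).
past_periods = rng.normal(loc=0.0, scale=0.25, size=200)
current_period = 0.45  # hypothetical anomaly of the current period

# Fraction of past periods the current one exceeds.
percentile = (past_periods < current_period).mean()

# Without independent knowledge of the forcing we might demand the strict
# 95% criterion; with it, a weaker threshold becomes defensible, because
# the prior physical information does part of the statistical work.
print(f"warmer than {percentile:.0%} of past periods")
print("exceeds 95% threshold:", percentile > 0.95)
print("exceeds relaxed 80% threshold:", percentile > 0.80)
```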

  45. #47

    Whether or not the knowledge wrt forcings is on solid ground, the knowledge of climate response to different forcings can only extend so far with only the instrumental record to test hypotheses against.

  46. @Kevin

    ‘So Eduardo, based on this evidence, can you please explain why we should consider the late 20th century warming trend (from 1970 to 2010) to be in anyway exceptional?’

    We do not need to consider the trend to be exceptional. It does not need to be exceptional. I understand why you are asking this – the word ‘unprecedented’ and so on… But please consider this example: let us imagine that solar activity was very high in the MWP or that the increase in solar activity was also very high before the MWP maximum. The trend could well have been larger than in the 20th century. Even the absolute temperature might have been higher than today. This would not indicate that the AGW today is wrong – they are different conditions, each one with one prevalent forcing. I am afraid that the discussion about unprecedentedness has been quite misleading. It is important when discussing climate impacts, but not at all when discussing the origin of climate variations.

    ‘and for that matter primarily Northern Hemisphere WINTER warming! Clearly CO2 is choosey!’

    Indeed climate models are not perfect, but one aspect that all models agree on is that warming should be stronger in wintertime, and in the northern hemisphere. Unfortunately, or fortunately, solar forcing would produce the same pattern of temperature change at the surface. They only deviate higher aloft in the atmosphere. So your observation seems to support climate models 🙂

    @ Espen
    ‘problem with your argument here is that what you say implies that the “Arctic amplification” appears to have been weaker now than it was back then. But the current weaker Arctic amplification is touted as confirmation of AGW theory!’

    Good question. To my knowledge, the 1930s-40s were really warm in Greenland, but I can imagine that the Arctic Siberian data at that time are not very reliable. So it could be that you are right. Another point is that the observed warming over Antarctica has been weak and other factors have to be called upon, ozone in this case.
    So there are open questions – no doubt.

    But let me also ask questions. What caused, in your view, the warming in the 1930s? Kevin seems to suggest it was changes in cloudiness. What caused those changes in cloudiness? Why did cloudiness change in the 30s and not later or before? Why did those changes last for 20 years? Why was the warming more accentuated in Greenland? Was the Gulf Stream stronger, and if so why? Is the present warming caused by the same mechanism, and if not why? Try to provide a response that is consistent globally and try to be as strict as you are with the AGW theory. You will see that very soon one starts hand-waving and invoking spooky actions and even ‘teleconnections’ 🙂

  47. Eduardo;
    The point about “choosiness” is not trivial. If CO2 is to be taken as the default continuing “driver”, it must be demonstrated to be so — in EVERY warming cycle, it must be detectable, and quantifiable, or else its predictive value is nil. It just won’t do to say, “Well, it’s the dominant force THIS time (because we can’t prove anything else is.)”

    It is NOT the default explanation.

    It is my recollection of the papers produced by Zorita and Christiansen that they have been primarily interested in the general validity of the temperature reconstruction methodologies and less in some of the points made by McIntyre and McKitrick – for example, the lack of a rigorous a priori selection process for proxies and/or PCs (cherry picking).

    I am not sure who originated the use of pseudo-proxies (it may well have been Zorita) to test the reconstruction methodologies, but there is no doubt that such testing by various scientists has led to very different results. Therefore that testing cannot be all that straightforward. As a matter of fact, the pseudo-proxies used in testing can vary also.

    Most of the critical analyses of the Mann HS paper and its progeny would allow for a bigger variation in past temperatures. In fact, Mann and some of his collaborators have appeared to be very sensitive about admitting to the possibility of larger variations than those denoted in his iconic HS paper. Recent Mann papers have claimed more valid reconstruction methodologies than he and his coauthors used in the past, but then with the disclaimer that it does not matter and the HS remains valid and lives on even if the team has moved on.

    Now what is rather ironic about this reluctance to admit a larger past temperature variation is that it runs counter to a more alarmist claim: a larger past variation in temperatures could be used to argue for a greater sensitivity to GHGs which, superimposed on a greater natural variation in future times – and under the assumption that any temperature change from the current state is detrimental to humankind – could be considered more catastrophic than even the more alarmist members of the consensus currently predict.

    I can certainly follow the rather simplistic scenario that is contained in this more alarming view, but the question becomes: is it really that simple? Much of the alarm is tied, in my mind, to the view that man has adapted to the current temperatures/climate and that any change is detrimental. It is rather like the ultra conservative view that the constructive/destructive actions of a capitalist system, while a necessary and vital part of its workings, are always bad (and thus the need to bail out banks and car companies).

  49. Cite Arrhenius at your peril. As many have pointed out, modern measurement and extrapolations are quite different:

    http://wattsupwiththat.com/2009/04/13/6995/#comment-114317

    Because Arrhenius was wrong in his calculation of α; 5.35 W/m^2 is an exaggerated sensitivity. Some authors have revisited Arrhenius’ formula for calculating ∆T and have found that α is not constant, but varies depending on the partial pressure and Cp of CO2. If the incoming load of IR remains constant then, according to Fourier’s algorithm, doubling CO2 would cause cooling, not warming. For a doubling of CO2 to cause warming, the source of heat (the Sun) must experience a constant increase in the intensity of discharged radiation.
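    For reference, the α being disputed here enters through the standard simplified logarithmic forcing expression ΔF = α ln(C/C0). A quick sketch (the 280 ppm baseline is the usual pre-industrial figure, used here purely for illustration):

```python
import math

ALPHA = 5.35  # W/m^2, the disputed coefficient in dF = ALPHA * ln(C/C0)

def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Simplified radiative forcing in W/m^2 relative to baseline c0_ppm."""
    return ALPHA * math.log(c_ppm / c0_ppm)

# Doubling the baseline concentration gives the familiar ~3.7 W/m^2 figure.
print(f"{co2_forcing(560.0):.2f} W/m^2")
```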

  50. 30. Eduardo Zorita said – “The energy imbalance is rather not recorded in the surface temperature, but in the ocean water temperature and/or sea-level. Now, we can discuss the reliability of sea-level measurements, but I think it is established that sea-level is rising. Satellites and tide-gauges both indicate it. I dont have any other explanation for sea-level rise that water temperatures are increasing and/or land ice is melting.”

    If you subscribe to the possibility that the devil lies in the detail, then the subject of sea level rise tied to global ocean warming has to account for all aspects of the ocean. It is not sufficient to say that the top X00 metres has warmed as shown by measurement. It is also necessary to show whether the bottom X00 m (and, indeed, all in between) has or has not warmed/cooled as well. Thermal expansion is a product of the whole-ocean temperature change. Now, the detail that is largely unexplored is the effect of undersea heating at places like those with sea-floor spreading, deep volcanoes, the underwater parts of the Ring of Fire, etc.

    I am not doubting the hypothesis that there has been global warming and that a consequence has been sea level rise. However, a comprehensive proof of that hypothesis has to include detailed whole ocean temperature measurements, which at this time are far from adequate in coverage.

    On the balance of reasonable probability your comment is fair; but it is not correct beyond reasonable doubt. This is a problem so common in climate science that it becomes almost an impossibility to rebut the many, many arguments that have room for doubt. That accumulated doubt can be regrouped to show that the world is cooling, if that was the aim of a determined person with excess leisure time.

  51. 38. Kevin UK said “Perhaps some Australians (not just western Australians) must have seen these maps and spread the word back to their friends back home, as subsequently the Australian ETS scheme was ‘postponed’ and my poor namesake was told to take an early bath”

    Credit where credit is due, perhaps? It was Australian scientists like Warwick Hughes and John L Daly (yes, a Brit by birth but an Aussie at death) who were among the first to ring the alarm bells. This was long before we saw the maps to which you refer.

  52. 49. Eduardo said – “What caused, in your view, the warming in the 1930s?”

    A question back to you, if I may. What, in your view, caused the warming of 1998?

  53. #49 Eduardo: What caused, in your view, the warming in the 1930s?

    I personally think that that huge heat sink known as the ocean is complex enough to cause variability at least at the levels we’ve seen in the 20th century, and probably at even much larger scales (amplitudes) (since the circulation time of the thermohaline circulation is up to 1600 years!), especially when you factor in the complexity of the relationship between cloud cover and SST. I’m afraid we will never arrive at simple “Newtonian” explanations here – the whole system is far too chaotic in its nature.

  54. #57;
    Prezakly. G&T, e.g., made the same point vociferously about the non-linear fine-grain thermal interactions in the atmosphere itself. It’s one of those “hard” problems that do not yield to any conceivable brute-force assaults, much less wishful simplifications.

  55. Geoff

    #54 “Credit where credit is due, perhaps? It was Australian scientists like Warwick Hughes and John L Daly (yes, a Brit by birth but an Aussie at death) who were among the first to ring the alarm bells. This was long before we saw the maps to which you refer.”

    Hopefully you did spot that I had my tongue well and truly in my cheek when I wrote that part of the article/thread on DITC? We do indeed owe a deep debt of gratitude to people like Warwick Hughes and in particular John Daly for opening our eyes as to what has been going on in regards to the whole AGW issue. When I first got involved in researching the whole AGW issue, ‘Still Waiting for Greenhouse’ (http://www.john-daly.com/) was one of the first web sites I visited (it seems that we are all still waiting for greenhouse?). I still visit it now and it comes as no surprise to me to find that many of the issues (e.g. poor quality temperature data in Australia, adjustments of that data to create warming trends etc.) that John Daly alerted us all to many years ago have since been confirmed by sterling auditors like Ken Stewart (and your good self). It always helps when giants have preceded you, as you then get to ‘stand on their shoulders’ (http://en.wikipedia.org/wiki/Standing_on_the_shoulders_of_giants). John Daly IMO was most definitely an intellectual giant.

  56. Eduardo

    #49 “We do not need to consider the trend to be exceptional. It does not need to be exceptional. I understand why you are asking this – the word ‘unprecedented and so on…But please consider this example: let us imagine that solar activity was very high in the MWP or that the increase in solar activity was also very high before the MWP maximum. The trend could have perfectly been larger than in the 20th century. Even the absolute temperature might have been larger than today. This would not indicate that the AGW today is wrong – they are different conditions, each one with one prevalent forcing. I am afraid that the discussion about unprecedentedness has been quite misleading. It is important when discussing climate impacts, but not at all when discussing the origin of climate variations.”

    “We do not need to consider the trend to be exceptional. It does not need to be exceptional”. Those (e.g. the IPCC, the UK Government etc) who would seek to get us all to ‘act now’ would seem to disagree with you on this point.

    “let us imagine”. No Eduardo, I’m afraid there is far too much imagining going on at the moment. We need to deal with cold harsh reality (not virtual reality), and the fact is there is nothing in the least bit concerning about a warming rate of 0.6 to 0.7 deg.C over the last century or so, particularly when there is plenty of evidence of repeated mild warming and cooling variations of this magnitude, including at much higher levels of GHG concentrations in our atmosphere in the past. Those (like Ben Santer and Kevin Trenberth) who seek to ‘up’ this number to a much more ‘scary’ number are applying scientifically unsupportable tactics such as the use of ensembles of GCM projections with built-in unproven assumptions of net positive feedbacks within the climate system that are not evident within the historical climate variations of the past.

    “This would not indicate that the AGW today is wrong – they are different conditions, each one with one prevalent forcing.”. I agree with you that the conditions are different today than they were 50 years ago, 100 years ago, or 10,000 years ago. So what! Where is the evidence from past climate that shows we are facing the possibility of altering our climate in a dangerous way should we choose not to drastically ‘de-carbonise’ our economies? Answer? There isn’t much, if any, clear evidence that a) our GHG emissions are having a significant effect on our climate, or b) that even if we were having a discernible effect on our global/regional climate, any future warming will be significant and will have more detrimental than beneficial effects (detriment being the automatic assumption of those who wish us to ‘act now’) – i.e. that there will be net adverse effects from such a warming.

  57. 59. Kevin UK. Thanks for the attribution to others. Of relevance to the present references here to sea level, here is a story from John Daly.

    http://www.john-daly.com/altimetry/topex.htm

    Not all of the questions it raised have been answered. The residual doubt problem again. Note that some of his observations relate to bias, e.g. “The problem of sea state is that both the wave crests and the wave troughs are both returning echoes to the satellite. With wave heights of several metres being typical, the T/P system has to somehow resolve this mish-mash of thousands of echoes from within the footprint, some from wave tops, some from wave troughs, into some sensible average. The statistical correction is further complicated by the fact that wave troughs give better focusing to the beam than do the wave tops which tend to scatter the signal. This would give an impression of lower sea level if there is a strong swell in the sea. That problem is addressed via a statistical model.”

  58. Eduardo,

    Thanks for stopping by and engaging in the discussion; there seems to be a bit of hand waving on our side in response to your points, as you have predicted. This question below stopped me in my tracks and will likely take a lot of consideration. BTW: there are a lot of readers here who are just remaining silent to this conversation and watching the progress.

    But let me also ask questions. What caused, in your view, the warming in the 1930s? Kevin seems to suggest it was changes in cloudiness. What caused those changes in cloudiness? Why did cloudiness change in the 30s and not later or before? Why did those changes last for 20 years? Why was the warming more accentuated in Greenland? Was the Gulf Stream stronger, and if so why? Is the present warming caused by the same mechanism, and if not why? Try to provide a response that is consistent globally and try to be as strict as you are with the AGW theory. You will see that very soon one starts hand-waving and invoking spooky actions and even ‘teleconnections’

    I’m not knowledgeable enough on the 1930s climate to even begin to answer, but I can wave hands with the best of them. My skepticism of the models is with more subtle problems, and I do recognize their absolutely critical value to the prediction of climate. It’s the magnitudes I have a problem with. I’ve done some of my own work on UHI based on Anthony Watts’ project; I cannot blog on it – by promise. There is a substantially larger UHI effect than climatology realizes, in my opinion.

    To put it in perspective, the UAH trend is 0.14C/Decade – RSS is higher due to a 1992 step difference from satellite transitions. UAH is the correct one from my own work which was confirmed by Dr. Christy’s work on radiosondes. They are only subtly different anyway.

    This trend is for the lower troposphere as you know, but what is less discussed is that the lower troposphere trend is, according to the models, supposed to be higher than surface station trends. Yet surface trends are higher. I believe that much of the difference between ground and satellite is due to UHI.

    Despite Santer’s recent paper, model trends have been running on average statistically above surface trends. If UHI is even 15% of trend, models need a rework. Since SST is 70% of trend though, even a large shift in ground measurements only shifts the total trend a bit. The discrepancy is difficult to resolve.

    This though is the conversation which is endless in blogland, it’s less interesting to me than paleo math because there are no clear answers.

    My 1930s handwave is therefore this: the energy content of the ocean is transferred to the atmosphere more easily than models predict. This happens through distribution of the warm water across a larger surface area by winds and currents, El Niño style. There is a net heat loss from the ocean which is not measured during this period because of the lack of widespread deep-water temperature measurements. This means natural fluctuation is greater than is modeled, based on the distribution of temperature in ocean models.

    That leaves us with sea level rise – which has tapered off slightly in recent years. We didn’t have CO2 global warming 50 years ago, yet we still had rising seas (faster than today if I remember right), and melting glaciers.

    So considering all of these details, we throw in a bit of solar based warming. It’s not an unreasonable choice to consider that we have underestimated the radiative increases from the sun in its more active state. No, it’s not enough to do the whole of the warming but maybe it’s a bit more than we think. The data is systematically being chosen and interpreted to minimize warming by the sun – they may be 100% right but that is what is happening in solar climate science.

    I can go on for another thousand words but the point is – if you consider some adjustments to:

    temp trends
    solar forcing
    ocean surface temperature variability
    cloud feedback and variability
    gradual recovery from an unusually low temperature period – the little ice age (which may be underestimated)
    undetermined factors

    Then the historic variance loss which Dr. Christiansen and yourself have demonstrated, becomes pretty critical to the story. I always like it when climate scientists claim that the hockey sticks have little meaning. It’s flatly untrue – no pun intended.
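    The variance loss itself is easy to reproduce with a toy pseudoproxy experiment: calibrating temperature against a noisy proxy by direct regression attenuates the reconstructed variance by roughly the proxy’s r². A minimal sketch, with invented numbers and a signal-to-noise ratio of one:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 1000
true_temp = rng.normal(0.0, 1.0, n)          # "true" temperature signal
proxy = true_temp + rng.normal(0.0, 1.0, n)  # proxy = signal + white noise

# Classical direct regression: calibrate temperature on the proxy by OLS.
slope = np.cov(proxy, true_temp)[0, 1] / np.var(proxy)
reconstruction = slope * proxy

# The reconstruction recovers only about r^2 of the true variance
# (here ~0.5, since signal and noise have equal variance).
ratio = np.var(reconstruction) / np.var(true_temp)
print(f"reconstructed/true variance ratio: {ratio:.2f}")
```

    The attenuation gets worse as the proxy noise grows, which is why trendless-noise assumptions in pseudoproxy tests matter so much.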

    Imagine for a moment that 1930 SST has been underestimated due to instrument problems and UHI has distorted the land record. Just that alone is enough to cause a rethink of CO2 magnitudes and feedbacks, yet so little effort is put into data quality.

    Anyway, that’s my skeptic rant. I very much appreciate your willingness to discuss the issue, it’s the experts that make science blogging fun.

  59. jeff

    #62 “There are a lot of readers here who are just remaining silent to this conversation and watching the progress.”

    I hope so and so far no trolls.

    “Anyway, that’s my skeptic rant.”. Your comment in #62 is far from a ‘rant’ Jeff. Rather it is a common sense (not hand waving IMO) appeal to look at other factors which IMO are just as plausible in explaining the cause of the late 20th century warming period as the IPCC-proclaimed ‘very likely’ man-made GHG emissions.

    Now I’m afraid, Eduardo, I’m still missing your point. Just exactly what is unusual about the late 20th century warming period that distinguishes it from previous warming periods such as the 1910 to 1940 warming period? If your reply is the difference in GHG/CO2 atmospheric concentrations between the two periods then I’m sorry, I need a lot more than that. Where is the evidence (other than aerosol-tuned GCM hindcasts) that all other alternative explanations have been fully accounted for and so discarded as less plausible explanations than GHG/CO2 atmospheric concentration increases? As I’ve already posted, I (and many others) want to see full explanations as to what caused significant climatic variability (e.g. from warm epochs like the MWP to cold epochs like the LIA) in the recent multi-millennial past, and a full explanation as to why whatever caused these significant variations in past climate cannot also explain recent warming/cooling. In short, if we can’t explain and fully account for past climate variability then why should we have any confidence in projections of future climate?

  60. Jeff,

    I think you have framed it about right. The real issue is not whether radiative forcing from GHGs must have a warming effect; the issue is the magnitude of that expected warming. When GCMs do not reproduce the level of historic variation in temperatures (especially at longer time scales), we can reasonably conclude that there are either serious issues with our understanding of long-term variations in forcing, that the GCMs do not accurately capture the behavior of the system, or perhaps some combination of both.

    But there is an additional key argument against the validity of the GCM projections: they cover a broad range of predicted climate sensitivity, so most of them HAVE TO BE quite wrong – they certainly can’t all be right. Since all are clearly based on the same ‘basic physics’ of the climate system, we can reasonably conclude that the non-basic stuff (parameterizations, approximations, assumptions, etc.) that goes into each model is what largely controls the projected climate sensitivity, NOT the basic physics. Arguing that GCMs must be valid because they apply ‘basic physics’ correctly can be most charitably described as humorous.

    Given the large uncertainties involved, it is no wonder that people focus on past climate via reconstructions to better evaluate whether the warming experienced in the last 50 years is ‘extremely unusual’, ‘unprecedented’, ‘alarming’, etc. Most people recognize that significant and unexplained historic temperature changes (like the MWP and the LIA) automatically cast doubt on the validity of projections of future warming by GCMs. Protestations that “paleoclimate reconstructions are not important” or “irrelevant” are either misinformed or disingenuous. The IPCC is not staffed by a bunch of idiots – they put Michael Mann’s hockey stick graph front and center for good reason.

    Those who argue that GCM warming projections are meaningful in terms of public policy carry a heavy burden of proof: exactly which of the models are wrong and which are accurate? What are the specific errors in the incorrect models? What is missing from our understanding of past climate forcing which allows the ‘accurate’ GCMs to be consistent with historical data? How can the missing past forcing be quantified? There is no such burden on those who choose to discount GCM projections based on obvious internal and external discrepancies, and I think it is not rational to suggest there is.
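    The point that parameterizations rather than basic physics control the spread can be illustrated with a zero-dimensional energy-balance sketch: identical forcing and Planck response, differing only in an assumed net feedback sum, span essentially the whole canonical sensitivity range. All values below are illustrative and not taken from any actual model.

```python
# Equilibrium warming for doubled CO2 in a zero-dimensional energy balance:
# dT = F2X / (PLANCK - feedbacks). Same "basic physics" (F2X, PLANCK) for
# every imaginary model; only the net feedback sum differs.
F2X = 3.7     # W/m^2, forcing from doubled CO2
PLANCK = 3.2  # W/m^2/K, no-feedback restoring strength

feedback_sums = [0.5, 1.0, 1.5, 2.0, 2.4]  # W/m^2/K, hypothetical

for fb in feedback_sums:
    ecs = F2X / (PLANCK - fb)
    print(f"feedbacks {fb:.1f} W/m^2/K -> sensitivity {ecs:.1f} K per doubling")
```

    Five “models” agreeing on the basic physics still span roughly 1.4 K to 4.6 K per doubling, which is the spread being complained about above.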

    In my mind, discussions like this one bring to the fore the major problem I have with climate projections and reconstructions, i.e. uncertainty. Also brought forth is the problem with discussions like this one, where we tend to generalize and talk past one another because we do not quantify much, and lots of the apparent real or imagined differences can be a misconception about the other guy’s quantities. And what makes quantifying so difficult that it is most often not attempted? Uncertainties.

  62. JeffID:

    I’ll see your armwave and raise you a pipe-dream.

    I agree that oceans disgorge heat very efficiently. Just look at the 1997/8 and 2009/10 El Niños. These are negative feedbacks where the atmosphere functions like a cooling tower for the ocean.

    I am very skeptical that the Sun is driving the natural variation. My own pet theory is based on the fact that we have been in a cooling trend for the past 8K years, and that the RWP, MWP and *perhaps* the current warming (at least the 1930s) are the consequences of negative (meaning opposite) feedback systems related to this overall cooling regime. I’m guessing that there are long-term (250- to 1,000-year period) cycles in ocean circulation that are responsible for the periodic heating and cooling. No concomitant forcing needed.

  63. Let us see, the tendency to arm wave (conjecture) is directly related to the degree of uncertainty of the issue being hand waved for and against. It should not be a problem as long as we define it properly and we all know that these are conjectures – from the more thoughtful to the wild guesses.

    Where I have a problem with this arm-waving practice is when conjecture gets accepted as something more than that. Like when a conjecture appears in a publication, is then referenced in another paper citing that primary source, and then tertiary sources reference the primary and secondary, and before long (and with the normal hesitancy to check references a few layers back) we have something that is passing for settled science and being quoted on the blogs and even by the IPCC (in a pinch).

    A clear example of what I am complaining about was the published Mann conjecture concerning historical tropical storm/hurricane counts in the NATL basin, which attributed a credible count by ships in the past low-tech era to the claim that ships counted the storms efficiently because they lacked the technology of later times to avoid them. Over at CA, we derisively called it the “dumb ships” conjecture. However, before very long we would see other publications referencing Mann, and then more Mann and secondary sources in later publications, concluding that the past TC/hurricane counts were not undercounted.

  64. I also think that the discussion is civilized so far. I don’t believe you are paid by the oil industry or that you are stupid, so we can afford to disagree on some points and agree on others.

    Some thoughts on some of the questions…

    Kevin,

    “‘let us imagine’. No Eduardo, I’m afraid there is far too much imagining going on…”

    I was trying to illustrate a logical fallacy, not an event that did take place in reality. I do not think that the fact that temperatures were higher than today in the past invalidates the AGW theory. If in one evening I eat 6 pizzas and I get a sour stomach, I cannot dismiss my worries simply by saying that many people have had stomach problems in the past. I think it is at least reasonable to try to establish a relationship between both observations. Pizzas don’t need to be the explanation for all my stomach problems in the past, or other people’s problems for that matter.

    ‘Where is the evidence from past climate that shows we are facing the possibility of altering our climate in a dangerous way should we choose not to drastically ‘de-carbonise’ our economies? Answer?……’

    I didn’t raise those points about policy – we are trying to discuss what we can learn from past climates.

    Jeff,

    ‘My skepticism of the models is with more subtle problems and I do recognize their absolutely critical value to the prediction of climate. It’s the magnitudes I have a problem with..’

    I think this is a valid point. The warming response produced by the models has a wide spread, so indeed not all can be right. Let us assume now that all models are wrong. You accept that an increase of GHG must lead to an increase in temperature. You argue that there exists circumstantial evidence that the response is smaller than IPCC models indicate. I would even buy that, but the evidence is circumstantial. The CO2 concentration is higher now than in the last million years, so all our arguments based on past experience may now be wrong, erring on one side or the other. I don’t know what the answer is, but it strikes me that sometimes arguments are banged on the table (by both sides) with 100% certainty, when it is clear that no 100% certainty exists. What if all our hand-waving is wrong and the response turns out to be high? You agree that GHGs have an influence, and GHG concentrations will grow in the future to very high levels. I think one doesn’t need to be an eco-leftist to be objectively worried, even if all IPCC models are wrong.
    Now, the point that some claims on future climate change may not be justified doesn’t invalidate my reasoning. If by 2050 temperatures have risen considerably, I cannot wash my hands saying ‘mm, how bad that Mann and Gore fooled us in 2000 with red herrings’. The real climate doesn’t care about realclimate 🙂

    ‘This means natural fluctuation is greater than is modeled based on the distribution of temperature in ocean models…… ‘

    Ok, now we are talking. If I understand you well, you suggest that the interaction of the global ocean and atmosphere may be causing very long timescale, ENSO-like oscillations, and that climate models may not be representing this interaction adequately. High surface temperatures in the 30s would then be indicative of energy of the oceans being released to the atmosphere.
    It could be. Models could be weak in simulating long-term variability. But how would you fit the fact that sea level was also rising in the 30s? If the ocean was giving up energy, sea level should have been dropping as the surface warmed and the energy was released to space.

    ‘That leaves us with sea level rise – which has tapered off slightly in recent years. We didn’t have CO2 global warming 50 years ago, yet we still had rising seas (faster than today if I remember right), and melting glaciers. ..’

    Let us assume that the sea-level rise in the 20th century was not caused by CO2. We would nevertheless need to know its cause, even if it was secular or natural. Sea-level rise indicates an energy imbalance: the planet was taking up energy. Why did it do so if the external forcing did not change?

    ‘It’s not an unreasonable choice to consider that we have underestimated the radiative increases from the sun in its more active state’
    Not unreasonable, but you are now accepting that the climate is sensitive to external solar forcing. The forcing by CO2 has been at least as strong as the solar forcing in the 20th century. I take it you think that the sensitivity to CO2 is weaker than the sensitivity to solar forcing. OK, that might be true, but you would agree that we are now on thinner ice: that the sensitivities may be different is indeed speculative. What happens if they are similar, and the CO2 concentration rises to 600 ppm?

    ‘gradual recovery from an unusually low temperature period – little ice age (which may be underestimated); undetermined factors’

    I do not agree here – actually this is my main point. There is no recovery to normal conditions by undetermined factors, or by natural oscillations that go on on their own. If the climate warmed or cooled, it is because something brought it to warm up or cool down: either the sun, or GHG, or another external agent, or all of them together. We need to be specific here.

    Kevin,
    ‘As I’ve already posted I (and many others) want to see full explanations as to what caused significant climatic variability (e.g. from warm epochs like the MWP to cold epochs like the LIA) in the recent multi-millennial timescale’

    There are just a few simulations of the past millennium, but models really have no problem in simulating the LIA or the MWP. Bo has shown some examples in his figures of climate simulations, and you can find more <a href="http://coast.gkss.de/staff/zorita/ABSTRACTS/zorita_et_al_2004.html">here</a>
    or here
    This is because both climate anomalies were probably forced, so it is relatively easy to drive the model with past forcings and get a certain global average temperature response. Much more difficult is to simulate unforced, i.e. natural, variations like ENSO, PDO or AMO, because these require including the right mechanisms of interaction between atmosphere and ocean.
    Now it is not clear whether the MWP was forced or caused by internal interactions between atmosphere and ocean.

    Howard,
    ‘My own pet theory is based on the fact that we have been in a cooling trend for the past 8K years and the RWP, MWP and *perhaps* the current warming ‘

    The cooling trend is reasonably, although not completely, understood. It is related to changes in the orbital configuration of the Earth: basically, the closest point to the sun in the Earth’s orbit has shifted from July 8,000 years ago to January now. Climate models reproduce this cooling trend nicely.

    ‘cycles with ocean circulation that are responsible for the periodic heating and cooling. No concomitant forcing needed’

    What are these cycles? How can I be completely sure that the present warming is part of those cycles if we cannot pin down their cause? Why do you rather believe in the existence of these cycles than in the observations that CO2 and temperatures have been rising in the 20th century, and that there is a plausible mechanism for the effect of GHG?

  65. Eduardo;
    That mechanism is time-reversed; warming increases atmospheric GHG. All the contortions attempting to turn cause into effect are disingenuous, not just confused.

  66. Why disingenuous ?

    The rising concentration of CO2 in the atmosphere is accompanied by two other observations. One is that the relative concentration of the isotope 14C in the CO2 is decreasing, which indicates that the additional CO2 is fossil, i.e. it has not been in contact with the free atmosphere for a long time – more than, say, 100 thousand years. The other is that the relative concentration of the isotope 13C in the CO2 is also decreasing, which indicates that the additional CO2 likely stems from organic carbon, since carbon in plants is depleted in this isotope.
    When one calculates the expected depletion in 14C and 13C from fossil fuel burning, it matches the observations.
    So there is a strong basis to think that the increase in atmospheric CO2 is indeed from fossil fuels.

    What other explanation can be proposed for the 14C and 13C depletion? For instance, degassing of CO2 from the ocean as the water warms cannot explain the 14C depletion.
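    The direction of the 13C effect can be made concrete with a toy two-box mass balance (my own sketch, not a calculation from the thread; the concentrations and isotope ratios are round illustrative values):

```python
# Toy mixing calculation (an illustration, not a carbon-cycle model):
# adding isotopically light fossil CO2 (delta13C ~ -28 permil) to a
# pre-industrial atmosphere (~280 ppm at ~ -6.5 permil) must pull the
# atmospheric delta13C downward. Real observations show a smaller
# depletion than this toy gives, because isotopic exchange with the
# ocean and biosphere buffers the shift; only the sign and direction
# of the effect are the point here.

def delta13c_after_mixing(c0_ppm, d0, c_add_ppm, d_add):
    """delta13C of the mixture after adding c_add_ppm of CO2 with ratio d_add."""
    return (c0_ppm * d0 + c_add_ppm * d_add) / (c0_ppm + c_add_ppm)

d_new = delta13c_after_mixing(280.0, -6.5, 110.0, -28.0)
print(round(d_new, 1))  # -12.6 permil in this unbuffered toy case
```

    The point is only that mixing in fossil carbon can move delta13C in one direction; warming-driven ocean outgassing cannot produce the 14C depletion at all, since ocean carbon is not radiocarbon-dead.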

  67. Eduardo,

    Thank you once again for taking the time to contribute to this discussion.

    You still don’t appear to have answered my basic question: what is exceptional about the late 20th century warming, particularly when compared to the 1910 to 1940 warming period, that means we need to invoke GHG (primarily CO2) as the explanation for this late 20th century warming?

    Could you please provide me (and others reading this thread) with a reply to this specific question? Once you’ve answered it, perhaps we can then widen the discussion to other alternative explanations, e.g. ocean oscillations (and what causes them), etc.

    I appreciate that the different 20th century warming and cooling periods and the much more significant climatic periods like the MWP and LIA don’t have to have been caused by the same contributory factors (a different mix of factors can have caused each warming/cooling period), but I first want to establish what exactly is unusual about the late 20th century warming period that leads us to believe it is in any way exceptional.

    I personally think that it is not unusual, and I can see very little evidence that would lead to the conclusion that it is. The level of MGST reached towards the end of the late 20th century warming period is indeed higher than it was at the end of the 1910 to 1940 warming period (this is not yet the case for the continental US, where temperatures at the peak of the 1910 to 1940 warming period have yet to be exceeded in the 21st century). But that, in my opinion, is only the result of the further underlying warming trend evident in the instrumental temperature record over the last 150 years due to the recovery from the nadir of the LIA (the cause of which is yet to be fully established, but is most definitely not man-made GHG emissions).

    As I’ve previously stated, I’m prepared to accept that GHGs (particularly water vapour, to the tune of at least 90% of the GHG effect) are responsible for elevating our planet to a higher temperature than it would otherwise be at without their presence, i.e. that the GHG effect is very real and very well established. I am also prepared to accept that the concentration of CO2 in our atmosphere is at a multi-millennial high, due largely to our post-WW2 industrial and population expansion. But I fail at this stage to see why we should be concerned about that, as IMO we have yet to see any clear evidence for the IPCC-proclaimed fact that man-made GHGs are ‘very likely’ having a significant effect on our planet’s climate.
    To re-iterate: if we don’t know to any reasonable level of certainty what caused our past climate to vary in the way it has done, then I don’t see why we should be attempting to ‘decarbonise’ our economies. This could turn out to be completely the wrong thing to do. The only sensible thing to do IMO is to carry on collecting data (evidence), and to collect it in a much more accurate and precise way than we have done in the past. We should continue to look at as many possible explanations for our past climate variations as we can, and not pre-judge that our most recent warm period is unusual (there is no evidence to support this conclusion) and ‘very likely’ due to our uncontrolled GHG emissions. And finally, IMO there is most definitely no evidence (GCM projections are not, and never will be, evidence) that if we don’t ‘act now’ future climate change will be in any way dangerous or catastrophic.

  68. Eduardo:

    Thanks for your response. My belief is that understanding the past is the key to predicting the future. I can believe orbital mechanics as a forcing. You say that models using orbital forcing reproduce this trend, but AFAIK, these models do not reproduce the millennial-scale cycles of warming within the 8K yr cooling trend. In my mind, this is the hard bit that needs to be solved so that we can separate the natural climate variability from AGW.

    The issue of long term ocean cycles is a complete WAG. This is based on my experience as a geologist. There is a fundamental “rule” in structural field geology: small mimics large. This means that meter-scale structural features of fracture patterns, small folds, and the strain pattern imprinted in rock fragments will be repeated and lead you to the location of the kilometer-scale structures.

    Currently, we see decadal oscillations in ocean circulation patterns that are not completely understood. El Nino, for instance, seems to have a dramatic effect on average global temperature and weather patterns. These short-term cycles, coupled with the fact that complete turnover of deep ocean water takes thousands of years, lead me to speculate that century-scale and millennial-scale ocean circulation cycles with ENSO-like features likely exist as well. These cycles may be related to the unexplained warming cycles during the last 8K years of cooling. Like I said, this is a wild a$$ed guess.

    Back to the hockey stick: my immediate doubt about this plot was because I have never seen a flat curve of any data in my geologic experience. My WAG was that the hockey stick is impossible because this is not how the world behaves.

  69. Eduardo,

    Thanks again for taking the time here. I’ve rarely seen the Air Vent this quiet, yet views aren’t down.

    There must be a lot of thinking going on in the background.

    ‘What if all our hand-waving is wrong and the response turns out to be high? You agree that GHG have an influence, and GHG concentrations will grow in the future to very high levels. I think one doesn’t need to be an eco-leftist to be objectively worried, even if all IPCC models are wrong.’

    I agree you don’t have to be an over-the-top eco-nut to be worried, but I strongly disagree with proposed solutions such as limitation, or forcing the implementation of solar and biofuel – which don’t work with today’s tech. Solar will work someday; biofuel will never work, just by the numbers. As an engineer, I’m very much bothered that the science crowd is all for limitation policy, which I believe is flatly wrong and will increase CO2 emissions, and against implementation of nuclear, the one technology which can create enough energy for us to continue our lifestyles. But all that is a different topic.

    This part is very interesting and I agree:

    ‘But how would you fit the fact that sea level was also rising in the 30s? If the ocean was giving up energy, sea level should have been dropping as the surface warms and the energy is released to space.’

    Of course you are correct, as the plots of sea level show rising water back as far as the early 1900s. This is, I believe, inconsistent with aerosol model theory, in which the earth was not gaining energy until more recently. I’ve not had time to examine the sea level data directly, and wonder what you think of the quality of sea level measurements from this time period.

    Of course there are things other than melting ice and energy gain which could cause a sea level shift at certain sensors: ground uplift, changes in the positions of the sensors, sub-crust magma shifts causing different gravitational distributions, and the like. These are real effects which, again, I’ve had no time to study, but I’m very curious what the climatological explanation is for the pre-1950 sea level rise, which is interpreted as a century-long gain of energy by the ocean.

    BTW, I don’t believe I’ve ever participated in any discussion along these lines here, and I don’t consider myself expert enough on these topics to really know anything. My climate background is limited to hockey sticks, sea ice, gridded temps and Antarctic temps. The majority of it is just downloading, plotting and interpreting data.

    I just don’t want new people here to think I’ve claimed any undeserved expertise or have made claims that AGW is false. That’s what RC wants us to be, but in fact most here just read and complain a bit about extreme conclusions and unreasonable demands from government. As an example, the gridded temp trends produced here from GHCN were necessarily higher than those produced by the pros even when we used the exact same data and gridding. This was due to implementing an improved anomaly combination method by RomanM, a retired university statistician. But it’s a better method, so we used it. IMO, all of climatology should.
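    The general idea behind an offset-style anomaly combination can be sketched as follows (a minimal illustration of the technique, not RomanM’s actual code or exact method): instead of anomalizing each station against a fixed baseline period, solve jointly for a per-station offset and a combined signal.

```python
import numpy as np

# Sketch of offset-matched anomaly combination. Each station series is
# modelled as combined_signal(t) + station_offset + noise; we estimate
# the offsets and the combined signal by alternating averages, which is
# a simple way to approach the joint least-squares solution.

def combine_series(series):
    """series: 2-D array (stations x time), np.nan for missing values.
    Returns the combined anomaly series (defined up to a constant)."""
    mu = np.zeros(series.shape[0])                      # per-station offsets
    for _ in range(50):                                 # alternate until converged
        m = np.nanmean(series - mu[:, None], axis=0)    # combined signal
        mu = np.nanmean(series - m[None, :], axis=1)    # station offsets
        mu -= mu.mean()                                 # pin down the free constant
    return np.nanmean(series - mu[:, None], axis=0)
```

    With two toy stations on different baselines and only partial overlap, a naive column average jumps where one record starts or ends; the jointly estimated offsets remove that artifact, which is the kind of bias a fixed-baseline anomaly method can leave behind.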

    Thanks again and I’ll spend some time reading your link.

  70. #72;
    The concentrations of various isotopes give information about biological uptake patterns, and some about solar/cosmic radiation, but nothing about what is responsible for added or reduced CO2 levels, much less dwell time. Mixing is a complex pattern driven by winds, molecular weights, etc. Even if the entire carbon load of the atmosphere became 14C over time, that would be irrelevant to the issue of uptake vs. source quantities in total.

    The “disingenuousness” comes from the blatant and patent efforts to disguise past warmer periods and periods of past fast warming; to suppress information from inconvenient weather stations; to alter records and destroy original raw data; to discount clear alternatives to AGHG loading without serious consideration; to suppress publication of alternate and dissenting research; and to put forward ludicrous bafflegab explanations for such things as the missing equatorial upper-troposphere hot spot and the time-reversed temperature and CO2 fluctuations, etc., etc. To put it bluntly, you are defending liars.

  71. #76, IMO you are too strong in your language. There is definite dishonesty in climate science, in medical science, and really in all things in life; I think you’ll find it in business too. Dr. Zorita has spent a substantial portion of his career trying to fix the problems with hockey stick math and has probably paid some price for it. When we consider that the powerful and obviously influential Michael Mann is still in denial about variance loss, yet can’t seem to even type in an AR coefficient greater than 0.32, something is wrong.

    But what if the signal could be extracted reliably, as Bo Christiansen has come closer to achieving? The discussion then reverts to CA standard fare: are the proxies up to snuff?

    One step at a time, we march to the answer.

  72. Eduardo: "There are just a few simulations of the past millennium, but models really have no problem in simulating the LIA or the MWP. Bo has shown some examples in his figures of climate simulations, and you can find more <a href="http://coast.gkss.de/staff/zorita/ABSTRACTS/zorita_et_al_2004.html">here</a>
    or here
    This is because both climate anomalies were probably forced, so it is relatively easy to drive the model with past forcings and get a certain global average temperature response. Much more difficult is to simulate unforced, i.e. natural, variations like ENSO, PDO or AMO, because these require including the right mechanisms of interaction between atmosphere and ocean.
    Now it is not clear whether the MWP was forced or caused by internal interactions between atmosphere and ocean."

    I am not so sure that models have no problem simulating the MWP. The problem arises because recent reconstructions of secular solar irradiance show much less variation (almost none) than the earlier, less reliable reconstructions. The AR4 has a figure of simulations going back a thousand years that seems to show that some climate models reproduce the MWP. But those climate model simulations (e.g. Gonzalez-Rouco et al. (2003)) use a rather dramatically variable solar irradiance. The development in solar science is noted in the text of the document, but the figure showing climate simulations for the last 1200 years still includes the models that use the outdated solar forcing values.

    The temperature rise 1910-1940 is also hard to explain without the use of outdated solar forcing values.

    Jeff Id: "I’m very curious what the climatological explanation is for the pre-1950 sea level rise which is interpreted as continued century long gaining of energy by the ocean."
    Yes, very interesting question.

  73. Jeff;
    The dishonesty in other fields isn’t for all the marbles. And that’s what the CAGW Cult is demanding: a chokehold on the planet’s energy and financial budgets, in perpetuity. That makes it more than individual peculation or deception, I think. Don’t you?

    ‘What if all our hand-waving is wrong and the response turns out to be high? You agree that GHG have an influence, and GHG concentrations will grow in the future to very high levels. I think one doesn’t need to be an eco-leftist to be objectively worried, even if all IPCC models are wrong.’

    Herein lies the major sticking point in these discussions, and the point where we can no longer talk simply about science but need to consider policy and its potential effects. If one does not see the uncertainty in the science here, then of course the policy issues become less complicated – given the current majority/consensus on the terms of mitigation to be applied. I am addressing only the issue arising from certain assumptions about climate and policy that lead some to say “regardless of past temperature variations, we can be very worried and we should be worried, very worried”.

    If one sees nothing but detrimental consequences of future climate warming, knows with great certainty that it will warm by x amount, and further sees little or no unintended consequences from the mitigation actions, then the path becomes quite clear – and I think it is the one that has been chosen by a consensus of scientists and advocates.
    On the other hand, we have the complete denialist, who with great certainty can say that the effects are all natural, or nearly all, and thus that we have nothing to mitigate.

    That some can be worried, objectively or otherwise, is not itself a problem for anyone who might be worried about a false start on mitigation – worry is in good supply, sometimes for good reasons and sometimes not. It is concrete actions, and proposals for them, that cause a counter-worry.

    The implications drawn in this discussion from greater historical variance in temperatures and a greater temperature sensitivity to forcing factors I find very incomplete, and thus misleading, because of assumptions that seemed to be implied but never stated.

    What would greater temperature variation going forward – as indicated perhaps by proper temperature reconstructions and a larger GHG influence on temperature than is currently accepted – mean in terms of beneficial and detrimental effects on humankind, and what do we know about this from historical climates? That is one of the questions we tend to dance around without addressing head on. It is rather obvious that while the uncertainty in predicting future temperature and other aspects of climate is large, the uncertainty in predicting the beneficial/detrimental effects of incremental changes (warmer and cooler) in climate is necessarily much larger, notwithstanding the show of hands for confidence limits we get from the IPCC.

    The other question that others (I exclude Jeff Id here) often avoid is: what are your underlying assumptions about the likely mitigation actions, with regard to costs and unintended consequences?

    It does not necessarily take an eco-nut or -leftist to avoid these questions or to naively answer them or assume an answer exists for them.

  75. Kenneth;
    A long-winded exposition of the “precautionary principle”. But even the CAGW Cultists acknowledge that draconian economic measures will only have very small effects under their own assumptions over the next century.

    Under those conditions, acting urgently and irreversibly, “as if” there was an emergency, is flat-out insane. But the promoters thereof are not insane. They are ambitious.

  76. Next up: a program to distribute activated charcoal pills to the planet’s population to suppress methane-rich flatulence!

    Good luck with that.

  77. A few preliminary words to clarify my position. I am not trying here to convince you of any action on GHG. I am just trying to communicate some bits of the state of climate science, which comprises a few really well established facts, some others not so well established, and others that are still quite speculative. You, as members of an educated society, should weigh the evidence against the risks and decide what to do.
    Consider that a scientific theory can never be proved beyond doubt. It is always challenged by new data and new theories, in a process that never ends. The geosciences are a bit special, because it is difficult to conduct experiments, and so there will always be unsolved questions.

    Kevin,
    The temperatures in the 20th century are probably not exceptional; there were warmer periods in the past. For us the warming is exceptional because, without invoking GHG, it is almost impossible to explain the positive trend in global mean temperature and its spatial pattern: more warming at high latitudes, more warming in winter, cooling in the stratosphere, etc. The argument then goes as follows: if this fingerprint can be detected now, and if GHG continue to rise unabated in the future, this may represent a problem. But obviously, temperatures *right now* are not per se an unusual problem.
    The 30s and other periods in the past may have been warmer, but from a logical point of view this only indicates that, for instance, ecosystems and polar bears can survive in a warmer environment. The core of the argument is that in the present period, with a lot of observations, the fingerprint of CO2 can be detected with high probability. This detection and attribution is based on models – you may not like it, but this is the case. Essentially, the same models driven by all external factors excluding CO2 cannot replicate the present warming.
    Now you may argue that since models are not that good at reproducing internal variability, this attribution is not 100% sure. Could a so-far-unknown natural mode of variability have the same appearance as the CO2 fingerprint simulated by climate models? That’s a valid point, in my view. The question boils down to whether one considers this a critical flaw or just a deficiency of the models, and to what the probability is that the ‘CO2’ fingerprint could appear by chance as a natural process.

    My stance is that as long as this unknown natural mode producing the same fingerprint as CO2 is not identified, or at least somehow proposed, it is pretty reasonable to think that the effect of CO2 has been detected. To think otherwise would amount to putting forward a theory that cannot be tested – an unknown process can, in the end, explain everything.
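    At its core, this kind of detection is a regression: project the observations onto a model-simulated fingerprint and ask whether the scaling factor is distinguishable from zero. Here is a minimal sketch on synthetic data (real detection-and-attribution studies use generalized least squares with an estimate of internal variability; the fingerprint, noise level, and 2-sigma criterion below are illustrative assumptions, not anyone’s published method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a 'fingerprint' pattern g (think of a forced model
# run's warming pattern) plus internal-variability noise as 'observations'.
n = 200
g = np.linspace(0.0, 1.0, n)             # assumed fingerprint
y = 0.8 * g + rng.normal(0.0, 0.1, n)    # synthetic observations

# Ordinary least-squares scaling factor beta and its standard error
beta = g @ y / (g @ g)
resid = y - beta * g
se = np.sqrt(resid @ resid / (n - 1) / (g @ g))

# Crude 2-sigma detection test: is beta significantly above zero?
detected = beta - 2.0 * se > 0.0
print(beta, detected)
```

    The “unknown natural mode” objection translates here to the noise not actually being independent of the fingerprint – which is why the test hinges on how well internal variability is characterized.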

    ‘As I’ve previously stated I’m prepared to accept that GHGs (particularly water vapour to the tune of at least 90% of the GHG effect) are responsible for elevating our planet to a higher temperature ….we are yet to see any clear evidence for the IPCC-proclaimed fact that man-made GHGs are ‘very likely’ having a significant effect on our planet’s climate.’

    The word ‘significant’ is a bit misleading here. What the IPCC says is that the fingerprint has been detected in a statistically significant way, as I explained before – in other words, that the pattern of warming is very unlikely to be due to natural variations alone. The current level of temperatures per se is obviously not catastrophic.

    I detect some inconsistency in your reasoning, though. If you accept that CO2 may have an effect on temperature, then you would also accept that there is a level above which concentrations of GHG may become dangerous. Maybe your level would be 500, 700, or 1000 ppm, but in principle you accept that there is a limit. You may disagree with others on where this limit is, but not on the principle. Where would you put the limit? If we burn all the fossil fuels available to us while doing nothing against it, we can reach very high levels.

    Howard,

    ‘these models do not reproduce the millennial-scale cycles of warming within the 8K yr cooling trend. In my mind, this is the hard bit that needs to be solved so that we can separate the natural climate variability from AGW…’

    I would agree. The level of knowledge of natural variations is limited, and models have deficiencies. The question is whether these deficiencies really undermine the *core* predictions for the future. It is a subjective decision; I currently think they don’t.

    ‘Back to the hockey stick, my immediate doubt of this plot was because I have never seen a flat curve of any data in my geologic experience. My WAG was the hockey stick is impossible because this is not how the world behaves.’

    As Jeff has explained, I do not need to debate this.

    Jeff,


    ‘These are real effects which again I’ve had no time to study but I’m very curious what the climatological explanation is for the pre-1950 sea level rise which is interpreted as continued century long gaining of energy by the ocean.’

    The uncertainty in the pre-1950 rates of sea-level rise is of course higher than for the recent two decades. And as the IPCC explained, the explanation of this rate is not complete. In other words, the estimated rates of ocean warming, together with the estimated rates of melting and the rates of ground-water retrieval and dam construction, do not add up completely to the estimated rates of sea-level rise. But to my knowledge there is no non-climatological explanation. The heat flux from the Earth’s interior is of the order of 50–80 mW per square metre, and we require an energy imbalance of the order of 0.5 W per square metre.

    For post-1950 data, and even more so for post-1990 data, the sea-level rise can indeed be explained by all these factors.
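    The orders of magnitude here can be checked with a back-of-the-envelope thermosteric calculation (my own sketch using typical textbook values for seawater, not a calculation from the thread or the IPCC):

```python
# Thermosteric sea-level rise per year if a given heat flux goes into
# the upper ocean: dh = alpha * Q / (rho * cp), with Q the heat added
# per unit area over a year. Note the geothermal flux is also roughly
# steady in time, so it cannot drive a *change* in sea level; the point
# is only that tens of mW/m^2 is tiny next to ~0.5 W/m^2.

ALPHA = 2.0e-4            # thermal expansion coeff of seawater, 1/K (typical)
RHO_CP = 4.0e6            # volumetric heat capacity of seawater, J/(m^3 K)
SECONDS_PER_YEAR = 3.15e7

def steric_rise_mm_per_year(flux_w_m2):
    q = flux_w_m2 * SECONDS_PER_YEAR      # J/m^2 accumulated in one year
    return ALPHA * q / RHO_CP * 1000.0    # metres -> millimetres

print(steric_rise_mm_per_year(0.5))       # ~0.8 mm/yr from a 0.5 W/m^2 imbalance
print(steric_rise_mm_per_year(0.065))     # geothermal ~65 mW/m^2: ~0.1 mm/yr
```

    A ~0.5 W/m² imbalance thus yields a steric rate of the right magnitude for the observed record, while geothermal heating is an order of magnitude too small even before considering that it is constant in time.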

    Brian,
    ‘The concentrations of various isotopes gives information about biological uptake patterns, and some info about solar/cosmic radiation, but nothing about what is responsible for added or reduced CO2 levels’

    I don’t understand your reasoning. If the global airborne CO2 is continuously being depleted in 14C, it is because it is being mixed with a source of CO2 depleted in 14C. What is illogical in this reasoning?

    Is your point that an increase in global surface temperatures of 0.8 K is responsible for an increase from 280 ppm to 400 ppm, when the glacial–interglacial changes of globally 7 K were at most responsible for a change in the CO2 concentration from 180 ppm to 280 ppm?
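    The arithmetic behind this question is simple (illustrative only: it assumes a linear temperature-to-CO2 outgassing response, which the real carbon cycle – with its circulation, solubility and biological effects – does not obey):

```python
# If warming-driven outgassing scaled linearly at the glacial-interglacial
# rate, how much of the modern CO2 rise would 0.8 K of warming explain?

glacial_ppm_per_k = (280.0 - 180.0) / 7.0   # ~14 ppm per K of global change
implied_rise = 0.8 * glacial_ppm_per_k      # CO2 rise implied by 0.8 K
observed_rise = 400.0 - 280.0               # ppm actually observed

print(round(implied_rise, 1))  # 11.4 ppm, an order of magnitude short of ~120 ppm
```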

    Niels,

    ‘But these climate model simulations (eg. Gonzalez-Rouco et al (2003)) use a rather dramatically variable solar irradiance.’
    Well, not that dramatic – unless you believe Mann and co-workers. They used the values that were the consensus at the time (Lean et al.). The ‘consensus’ has since shifted towards lower amplitudes, that is true, but recently there are again groups putting forward larger amplitudes.

    Now, if the solar variations turn out to be small in the past and the temperature variations turn out to be large, we would indeed face two drastic explanations: either the climate sensitivity is very large, or internal variations are much larger than currently thought. Both would tend to invalidate climate models, but in completely opposite directions.

    ‘The temperature rise 1910-1940 is also hard to explain without the use of outdated solar forcing values.’

    Agreed, that period is difficult to explain. Others may have an answer, but I don’t.

  78. Eduardo, phrases like “not 100% sure” are weasel words. The conclusions are not even statistically 10% sure; the analysis of the data is statistically terrible. Further, the models are not just weak, they’re a joke. The basic climate model assumed a flat earth with 24/7 average sunshine and a uniform average temperature. This is not even a joke. It’s an abomination.

    As for the “very likely”, that was a qualitative 90% “probable” WAG (Wild-Assed Guess). Totally irresponsible, unscientific, meaningless.

    Get it through your head that the mandate and remit of the IPCC is to facilitate the acceptance and enforcement of AGW remediation. It is a political body that has dictated conclusions to its “scientific” report writers, often over their objections. To support it is to support egregious lies and manipulation.

  79. Brian H, you appear to be in an especially nasty frame of mind. I agree that a show of hands, even by IPCC-selected “experts”, is not the same as objective and statistically established confidence limits. I think the thread is about what the participants are conjecturing, since the detail is not there to argue for anything more specific than that – and that goes particularly for your one-liners.

    Perhaps I am misunderstanding, but I think the main consideration of this thread has been what the consequences could be if we established that historical variances in temperature (outside those influenced by the factors putting the earth into and out of the ice ages) are greater than is commonly accepted at present. Those consequences might seem to some to run counter to what is sometimes understood. I, for one, appreciate the views that Zorita and Christiansen have brought to the thread on that matter.

    I just get a little riled when I think that the worry about those consequences is made without stating the assumptions that ground that worry.

  80. “Commonly expected”? This is the justification for rushed and abominably shoddy science, and slamming the brakes on the global economy?

    I don’t think so. Already hundreds of millions may well have died from starvation due to the myopic redirection of agricultural resources to “bio-fuels”, doubling food prices in bare-survival environments, without the minutest effect on CO2 levels. Behold the future towards which we are being hustled. And yes, I do get “nasty” when I’m being hustled. Putting hustlers in charge of the planet is NOT on.

  81. Brian H, have you got any non-weasel word science to back up the claim that “hundreds of millions may have died” etc? I’d be interested to read about that.

  82. Eduardo,

    In other words, the estimated rates of ocean warming together with the estimated rates of melting and the rates of water retrieval of ground water and dam constructions do not add up completely to the estimated rates of sea-level rise.

    That’s exactly my point. We don’t know why sea level rose pre 1950, so we cannot claim causal knowledge for today. It’s that simple. The concept is that the earth wasn’t gaining energy until recently, yet we have ocean levels which show gain. It’s a good-sized Achilles heel for climate science.

    I said ocean transferred heat to the air in 1930.

    You replied “nope” it gained energy – note the sea level rise.

    I asked about data quality and the explanation for continuous rise for a hundred years.

    You replied about the pre-1950 stuff not quite adding up.

    I will then reply with – ocean data cannot support the AGW consensus claim any more than it did mine. In fact, I would say it directly refutes models in their current state. And again I wonder about data quality, which could resolve the contradiction. Certainly, we would expect water expansion from added heat, but we also measured water expansion in the pre-industrial times.

    It is a contradiction – which is why I mentioned a long term recovery from a colder than accepted little ice age.

    I am perfectly willing to accept a data quality explanation for the problem. Where I am firm in my opinion is when the data is used to refute my hypotheses yet the same data is used to support the consensus – which is also contradicted.

    In the engineering world, this kind of curve would grind us to a halt until we resolved the discrepancy or found a low cost way to test it. In the AGW world, where tests and QC are very weak, it’s improperly labeled uncertainty.

    It should be labeled unknowainty.

  83. BTW, again I don’t think anything is proven by ocean level. It’s just an interesting discussion which has logical problems to resolve. Again, I really find your work on paleo variance more interesting than the discussion of the unknowable. Unfortunately, discussion of the unknowable is common fare on climate blogs.

  84. The problem is replacing the unknowable in a “model” with a WAG plug, and then touting the resulting “scenario” as science. It is no such thing.

  85. Eduardo: maybe your level would be at 500, 700, 1000 ppm, but in principle you accept that there is a limit. You may disagree with others on where this limit is, but not in principle. Where would you put the limit?

    I’m not Kevin, but here’s one way to view it:
    I think it’s very hard to attribute more than at most 0.3 K temperature rise to CO2 (or other AGHG), because that’s roughly the difference between the current warm period and the previous (and only if you’re putting what I think is a little too much confidence in the data quality). Assuming a rise from 300 ppm in 1940 to 390 ppm today, this gives about +1.4 K at 1000 ppm. That doesn’t sound scary to me, and the added CO2 and increased agricultural areas in the north may be just what we need to produce enough food at that time – and it may even be a good insurance against the next glacial. Even 2000 ppm may still have more benefits than disadvantages, and this should give us really plenty of time to move to really sustainable energy sources (e.g. first thorium, then fusion).

    Now, one may argue that +1.4 K may be enough to melt significant parts of the ice sheets, but I think that worry is exaggerated, since this melting will be on such a large time scale (thousands of years) that adapting to rising sea levels will be a non-problem.
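    Espen’s number is easy to check numerically. The sketch below assumes, as he does, that the warming scales with the logarithm of CO2 concentration; the 0.3 K attribution and the 300 → 390 ppm figures are his premises, not established values:

```python
import math

dT_obs = 0.3                  # K attributed to CO2 between ~1940 and today (Espen's premise)
c_1940, c_now = 300.0, 390.0  # ppm (assumed values from the comment)

# Logarithmic scaling: warming taken as proportional to ln(C / C_1940)
dT_1000 = dT_obs * math.log(1000.0 / c_1940) / math.log(c_now / c_1940)
print(round(dT_1000, 1))      # about 1.4 K at 1000 ppm
```

    This reproduces the “about +1.4 K at 1000 ppm” figure under those assumptions.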

  86. Brian H

    #86

    “Get it through your head that the mandate and remit of the IPCC is to facilitate the acceptance and enforcement of AGW remediation. It is a political body that has dictated conclusions to its “scientific” report writers, often over their objections. To support it is to support egregious lies and manipulation.”

    Much as I am with you on most of that statement, please take Steve M’s advice and try not to go ‘a bridge too far’. I made the mistake of scaring away Isaac Held once on CA (I also had a go at Judith Curry but thankfully she was much more thick skinned) and very much regret that, because as a consequence we didn’t get to discuss the IPCC’s ‘strong net positive feedback from water vapour and clouds’ with a world expert (along with Brian Soden) on this subject, which is pivotal to CAGW. Eduardo appears to be much more like Judith Curry than Isaac Held and I’m very much enjoying this thread thus far, so could I please request that you ‘dial down’ (as I and others have done here) your frustration meter if possible?

    Eduardo

    #85

    “Kevin,
    The temperatures in the 20th century are probably not exceptional. There were warmer periods in the past. For us it is exceptional because without invoking GHG it is almost impossible to explain the positive trend in the global mean temperature and its spatial pattern: more warming at high latitudes, more warming in winter, cooling in the stratosphere, etc.”

    Thank you for your answer to my question. I’m very much heartened by the fact that you acknowledge that 20th century temperatures are not exceptional. Based on that answer I’ll assume that you also agree that the late 20th century temperature levels and warming rate are also not exceptional when compared to historical (albeit largely reconstructed from proxies) multi-centennial temperatures. Contrary to what Gavin Schmidt thinks, it is important not to deny past climate change as (and again sorry for once again repeating this statement) without understanding and being able to fully explain our past climate we do not stand any chance of being able to predict/project our future climate (and man’s possible effects upon it) to any reasonably acceptable level of certainty.

  87. #97, I agree, it’s nice to have some of the experts come by and answer some questions and I’d rather not make them unwelcome with vitriol. Besides, I’m a big fan of Drs. von Storch, Zorita and Christiansen’s work in the paleo areas.

  88. #99 It would be a difficult thing to request. I may do a thread on it in the coming weeks but wouldn’t expect much participation. This isn’t an easy issue for those with careers in climatology. Of all that we might hope for from blogging, openness and gradual progress towards correcting some of the known errors are the main things. That’s why I’m a fan of their and others work on the topic. There were some great comments on Ammann’s recent letter covered here as well.

    I have another post on M07 variance loss to do which I may get to tonight. After that, I’ve got some of Steig’s temp stations to play with but am not sure what will capture my attention.

    If you would like to write a level-headed post on the topic, I would be happy to post it.

  89. Bo’s LOC method looks identical to what I did in my papers:
    Loehle, C. 2007. A 2000 Year Global Temperature Reconstruction based on Non-Treering Proxy Data. Energy & Environment 18:1049-1058
    Loehle, C. and Hu McCulloch. 2008. Correction to: A 2000 Year Global Temperature Reconstruction based on Non-Treering Proxy Data. Energy & Environment 19:93-100

  90. RE: #47 Bo says this period is different because we know the forcings. This is only true for an ideal atmosphere with no clouds, no convection, and no changes in water vapor. This also assumes that the atmosphere reacts the same to all forcings. But consider that a forcing caused by altered cosmic rays, which directly affects cloud formation (and moreso over the poles) is not the same as a forcing caused by increased CO2.

  91. E.g., models that take into account the 4th-power radiative implications of the huge diurnal swings in desert regions vs. the minuscule swings in forested and frequently clouded zones.

  92. Espen,

    I think that everyone would agree that the estimation of climate sensitivity is burdened with large uncertainties. In your calculation you don’t take into account many other uncertain forcings – e.g. aerosols, solar forcing, etc. – but we all know we could be discussing those forcings until the next ice age. I would just point out that presently we observe an energy imbalance of about 0.8 watts/m2 – this figure is not controversial, Roger Pielke Sr. says so, so you probably would believe him. The difference in CO2 concentrations between now and 1940 that you have indicated amounts to a radiative forcing of about 5.4 ln(390/300) = 1.4 watt/m2. This means that more than half of this forcing is not in equilibrium with the surface temperature, or in other words, if the CO2 concentrations remain frozen in the future at 390 ppm, the surface temperature should still rise until a new equilibrium is reached. This would mean that your final calculation is an underestimation by more than a factor of 2, and the equilibrium temperature response at 1000 ppm – even with your assumptions – would be rather 3.2 K.
    Now, this is a back-of-the-envelope calculation, so we may have made some errors in our assumptions. Perhaps the correct result could be 2.8 K or 3.6 K?
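    Eduardo’s adjustment can be followed step by step. This is only an illustration of his argument: the 5.35 W/m2 coefficient is the standard simplified CO2 forcing expression (Eduardo rounds it to 5.4), and the 0.8 W/m2 imbalance is exactly the contested figure, so the result should be read as a sketch, not a sensitivity estimate:

```python
import math

A = 5.35            # W/m^2 per ln(C/C0), simplified CO2 forcing coefficient
dT_obs = 0.3        # K, Espen's assumed CO2-driven warming since 1940
imbalance = 0.8     # W/m^2, assumed present-day radiative imbalance

F_now = A * math.log(390 / 300)    # ~1.4 W/m^2 of forcing since 1940
realized = F_now - imbalance       # only ~0.6 W/m^2 is in equilibrium with dT_obs
sensitivity = dT_obs / realized    # ~0.5 K per (W/m^2)

F_1000 = A * math.log(1000 / 300)  # forcing at 1000 ppm relative to 1940
dT_eq = sensitivity * F_1000
print(round(F_now, 1), round(dT_eq, 1))   # 1.4 W/m^2 and 3.2 K
```

    With these inputs the equilibrium response at 1000 ppm comes out at ~3.2 K, more than double Espen’s 1.4 K, which is the “factor of 2” Eduardo refers to.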

  93. #106, I know you addressed Espen, but in your simple example, the assumption is that 100% of the ‘imbalance’ is CO2 driven? Do you have an opinion on how much is CO2 vs other?

    No gotcha’s intended, I just noticed the assumption and have heard/read several opinions on the matter. These are the details that keep me up at night. It’s not a yes/no question it’s a how much and what to do.

    So far, my opinion is that the warming is completely benign and tracks sat records better than surface. I’m really unimpressed with the quality of the surface data and find it disturbing that there isn’t more concern about the obvious quality problems. Again, this is way off topic from the original post but if you have the time, I’m interested.

  94. Eduardo,

    If there is a current global energy imbalance of ~0.8 watt/M^2, then surely this should be evident in rapid ocean heat accumulation. Yet ARGO data post 2004-2005 suggests very much lower heat accumulation in the ocean. On what basis do you conclude a current imbalance of ~0.8 watt/M^2 is correct?

    This is a serious question; large ocean heat accumulation, and long ocean lags are pretty much required if the climate is as sensitive as 2.8C per doubling of CO2. I have not seen any convincing data.

  95. Re: eduardo (Aug 3 18:10),

    I would just point out that presently we observe an energy imbalance of about 0.8 watts/m2 – This figure is not controversial, Roger Pielke Sr. says so , so you probably would believe him.

    Did I miss something? When did he change his mind? The last I read, he didn’t believe it was that high because the ocean heat content and sea level weren’t going up fast enough for the imbalance to be that high. There’s a very suspicious rapid change in the ocean heat content series around 2003 that isn’t matched in the sea level data. This is almost certainly due to problems splicing the old XBT data to the new ARGO data. If that ends up getting removed, then OHC isn’t going up anywhere near fast enough to account for an imbalance of 0.8 W/m2. If the heat is going into the deep ocean, then the delayed climate sensitivity is going to be a lot lower than modeled because the time constant for the exchange with the deep ocean will have to be a lot smaller than thought.

  96. Eduardo: I would just point out that presently we observe an energy imbalance of about 0.8 watts/m2 – This figure is not controversial, Roger Pielke Sr. says so , so you probably would believe him.

    Really? What do you mean by “presently”?

    I don’t understand your reference to Roger Pielke Sr., I’ve gotten the impression that he’s very open on this question:
    http://pielkeclimatesci.wordpress.com/2010/05/21/update-on-jim-hansens-forecast-of-the-global-radiative-imbalance-as-diagnosed-by-the-upper-ocean-heat-content-change/

    The ocean heat content has been remarkably flat since the argo floats went into use, but that’s a very short timespan. Maybe we can have more certain knowledge about the energy balance in 5 or 10 years…

  97. Espen, Steve

    the figure of 0.8 w/m2 of radiative imbalance is for around the year 2000. As you said, a few years is too short a period to obtain a reliable figure. Also, sea-level is rising, though in the last few years the rate has diminished a bit; the rise is mostly caused by melting – roughly 75% or so – and melting (for each W/m2) is more effective at raising sea-level than thermal expansion.
    The heat balance in the last decade is not closed, as Trenberth expressed clearly on another occasion, so these are figures that are still debatable. So to focus on 2004-2005 is a bit unfair. If the Earth had really been in equilibrium in 2005, sea-level rise should stop now. Do you think that sea-level will not rise any more?

    My point was rather the following. Imagine that you are a scientist who is consulted by a policy maker on this matter. Could you give him a range for the estimate of climate sensitivity, taking into account possible uncertainties? For instance, you stated that CO2 caused a warming of 0.3 K by comparing the temperatures in 1940 and 2000. But you, or others here, also stress the important potential role of natural variations. How do we know that the climate was in equilibrium in 1940 or in 2000? Should we assume a range for the differences, a range for the imbalance, etc., also in the direction that we personally don’t ‘like’, accepting that this too would pose inconvenient questions to the virtual politician?

    Jeff,

    no, I don’t assume that the imbalance is only caused by the increase in CO2. Many other forcings have been active in the 2nd half of the 20th century. I was trying to follow through on Espen’s comment.

  98. #112, I didn’t mean it like that, I meant to ask what is your best estimate of the true level.

    I really don’t have enough experience with this, so I ask the experts first when I can. Especially those who give honest answers as you have done here.

  99. Eduardo:

    We’re drifting even further away from the subject and into policy now, but I don’t want to miss this opportunity for a layman to have an interesting discussion with an expert in the field, so I hope Jeff won’t object to yet another round:

    The heat balance in the last decade is not closed, as Trenberth expressed clearly in other occasion, so these are figures that are still debatable. So to focus on 2004-2005 is a bit unfair.

    Steve wrote post 2004-2005 – i.e. up to today, i.e. the period for which we have the best data for OHC.

    My point was rather the following. Imagine that you are a scientist which is consulted by a policy maker on this matter. Could you give him a range for the estimation of climate sensitivity, taking into account possible uncertainties ?

    I don’t think it’s currently possible to give a range narrow enough to make policy decisions based on that alone (but see below).

    For instance, you stated that CO2 caused a warming of 0.3 K by comparing the temperatures in 1940 and 2000. But you, or others here , also stress the important potential role of natural variations. How do we know that the climate was in equilibrium in 1940 or and 2000, should we assume a range for the differences, a range for the imbalance , etc, also towards the direction that we personally dont ‘like’, but assuming that to this virtual politician would also pose inconvenient questions ?

    I don’t think the climate is ever really in equilibrium, do you? The reason I chose 1940 and 2000 is just to try to remove the presumably natural fluctuations that caused the early 20th century warming from the signal, by choosing two years where the climate system may have been (not in equilibrium but) in similar states. The resulting signal may still be 99% “natural”, if we believe those that see longer-period natural oscillations (i.e. we’re still recovering from the LIA).

    On the other hand, I’m also still open to the extreme other possibility – that all the warming since 1950-1960 (or even more than that) is due to humans. I just don’t think there is strong evidence for such a position.

    But my biggest problem with policy decisions is that I don’t think the claimed consequences of warming are sufficiently founded in science. There has been far too much focus on negative horror scenarios, but geological and historical records show us that in general, a warmer earth is far better than a colder earth. In addition, an atmosphere with a little more CO2 may allow us to grow more food with less water input. I think this is why the refutation of the hockey stick is so important (see, I managed to bring the discussion slightly on-topic again ;-)): The fact that even in very recent times (on a geological scale) in this interglacial we had periods both much colder than and at least as warm as the current one, and that historical records seem to indicate that the warm periods were the good times, should at least give us good reason to tell the policymaker that warming is way better than cooling.

    So shouldn’t the policymaker prefer modest AGW to no AGW? I think he should, because no AGW would mean that the risk of cold periods is as high as it ever was, and IMHO the consequences of a new LIA may be far more devastating than the consequences of 2-4 degrees of warming.

    So my advice to a policymaker would be: Plan for the possible adaption to both colder and warmer climate, and don’t let one get in way of the other. A particular consequence of this is to avoid the current biofuels like the plague…

  100. We should be urgently focussed on ending the CO2 famine by bringing its level back to the 0.1% level. Unfortunately, the downside risk is that the negative feedbacks in the system are so strong that it risks setting off the long overdue Global Cooling. Those darn tipping points … ! 😀

  101. Eduardo #112,

    Fair enough, there is (and pretty much has to be) variation in the measured imbalance in the short term. What I objected to was your statement of “a current imbalance of 0.8 watt”, when the best available data (ARGO, rate of sea level rise, and ocean mass measurements from GRACE) clearly indicate a very much lower current radiative imbalance. And the current heat content data is almost certainly far more reliable than the heat content estimates pre-ARGO. Your 0.8 watt/square meter is comparable to the value calculated by Hansen et al (2005) based on the 1993 to 2003 period, which happens to also be the 10 year period with the most rapid measured global temperature rise since 1979. The ten year period from 2000 to 2010 tells a very different story, and the nine year period from 2001 to 2010 tells the exact opposite story. (see http://www.woodfortrees.org/plot/wti/plot/wti/trend/plot/wti/from:2000/to:2010/trend/plot/wti/from:1993/to:2003/trend/plot/wti/from:2001/to:2010/trend for a comparison of these trends).

    What would be the true radiative imbalance in the absence of ‘natural variation’? I sure don’t know, but I am certain that nobody else knows either. Considering the long term temperature history, it sure seems likely to be a lot less than the imbalance calculated from the 1993 to 2003 period. Any claim of a large current imbalance (like 0.8 watt/M^2) has to be supported by solid data, and such data does not exist; indeed, the best available data says the current imbalance is very low. That some GCM models say there SHOULD be a current imbalance of 0.8 watt per M^2 doesn’t mean very much, since the best available data clearly does not support this value.

    Honestly Eduardo, your assertion of a high current imbalance, stated as if it were fact (in the face of clear contrary evidence), is exactly the sort of thing that gives thoughtful people good reason to doubt the predictions of climate science (and at the same time drives those same people absolutely crazy!). Broad assertions which are not supported by data are bad for climate science.

  102. Jeff,

    The best estimate of the amount of 20th century warming that can be attributed to anthropogenic GHG strongly depends on the aerosol forcing, a field in which I am not an expert. The solar forcing is also somewhat debated. So here I have to rely on interpreting the work of others. Recent estimates of the aerosol forcing, e.g. Myhre, tend to indicate a weaker effect than reported in the IPCC AR4. This would imply a lower sensitivity, since in that case aerosols did not offset much of the warming. However, I would not rely on just a single analysis. To cut the story very short, I would answer your question with roughly 70% of the observed 0.75 K warming. Again, I think it is important not to be trapped in one’s own estimations, but to allow for the possibility of being in error, as many things are uncertain.

    Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect
    Gunnar Myhre

    In the Intergovernmental Panel on Climate Change Fourth Assessment Report, the direct aerosol effect is reported to have a radiative forcing estimate of –0.5 watt per square meter (W m−2), offsetting the warming from CO2 by almost one-third. The uncertainty, however, ranges from –0.9 to –0.1 W m−2, which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing. This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m−2, with a best estimate of –0.3 W m−2. The physical explanation for the earlier discrepancy is that the relative increase in anthropogenic black carbon (absorbing aerosols) is much larger than the overall increase in the anthropogenic abundance of aerosols.

    Espen and others,

    if after examining the evidence you reach the conclusion that the GHG warming will be either beneficial or not harmful, well, that’s your opinion. You are entitled to it and I would not dispute it. My point would be that, from the scientific point of view, one should also consider what under reasonable assumptions *could* happen. We have only one go at this, so to speak, so we would like to be pretty sure. Note that my best estimate could also be that the sensitivity is low – actually that is what I would like it to be – but intuitions here are not enough. I would have to explore all reasonable possibilities, taking into account all uncertainties.

    Steve: again I would try to scour all sources and not just focus on ‘ours’. I copy here two abstracts of recent papers. As I said, thermal expansion has recently slowed, but it has not gone to zero, and sea-level is still rising, although at a slightly lower pace in the recent few years. If we are all interested we could go for a back-of-the-envelope calculation of the imbalance in the last five years or so. There are a number of assumptions involved, for instance the depth of heat penetration, but we could lay them out in the open, work out what would be a ‘consensus’ range, and all of us learn a few things on the way.

    It seems that we have drifted quite a lot from the original topic.

    Sea level budget over 2003–2008: A reevaluation from GRACE space gravimetry, satellite altimetry and Argo
    A. Cazenave, K. Dominh, S. Guinehut, E. Berthier, W. Llovel, G. Ramillien, M. Ablain, G. Larnicol

    From the IPCC 4th Assessment Report published in 2007, ocean thermal expansion contributed by ∼50% to the 3.1 mm/yr observed global mean sea level rise during the 1993–2003 decade, the remaining rate of rise being essentially explained by shrinking of land ice. Recently published results suggest that since about 2003, ocean thermal expansion change, based on the newly deployed Argo system, is showing a plateau while sea level is still rising, although at a reduced rate (∼2.5 mm/yr). Using space gravimetry observations from GRACE, we show that recent years sea level rise can be mostly explained by an increase of the mass of the oceans. Estimating GRACE-based ice sheet mass balance and using published estimates for glaciers melting, we further show that ocean mass increase since 2003 results by about half from an enhanced contribution of the polar ice sheets – compared to the previous decade – and half from mountain glaciers melting. Taking also into account the small GRACE-based contribution from continental waters (< 0.2 mm/yr), we find a total ocean mass contribution of ∼2 mm/yr over 2003–2008. Such a value represents ∼80% of the altimetry-based rate of sea level rise over that period. We next estimate the steric sea level (i.e., ocean thermal expansion plus salinity effects) contribution from: (1) the difference between altimetry-based sea level and ocean mass change and (2) Argo data. Inferred steric sea level rate from (1) (∼0.3 mm/yr over 2003–2008) agrees well with the Argo-based value also estimated here (0.37 mm/yr over 2004–2008). Furthermore, the sea level budget approach presented in this study allows us to constrain independent estimates of the Glacial Isostatic Adjustment (GIA) correction applied to GRACE-based ocean and ice sheet mass changes, as well as of glaciers melting. Values for the GIA correction and glacier contribution needed to close the sea level budget and explain GRACE-based mass estimates over the recent years agree well with totally independent determinations.

    Closing the sea level rise budget with altimetry, Argo, and GRACE
    Eric W. Leuliette and Laury Miller

    An analysis of the steric and ocean mass components of sea level shows that the sea level rise budget for the period January 2004 to December 2007 can be closed. Using corrected and verified Jason-1 and Envisat altimetry observations of total sea level, upper ocean steric sea level from the Argo array, and ocean mass variations inferred from GRACE gravity mission observations, we find that the sum of steric sea level and the ocean mass component has a trend of 1.5 ± 1.0 mm/a over the period, in agreement with the total sea level rise observed by either Jason-1 (2.4 ± 1.1 mm/a) or Envisat (2.7 ± 1.5 mm/a) within a 95% confidence interval. Citation: Leuliette, E. W., and L. Miller (2009), Closing the sea level rise budget with altimetry, Argo, and GRACE, Geophys. Res. Lett., 36, L04608, doi:10.1029/2008GL036010.

  103. Eduardo: My point would be that, from the scientific point of view, one should also consider what under reasonable assumptions *could* happen.

    Absolutely! But the forecasting of the actual impact of and vulnerability of the projected amount of warming has so far been the weakest part of the science – at least it seems so to this layman, considering the non-scientific sources used in the IPCC AR4 WGII report.

  104. Re: eduardo (Aug 6 07:41),

    Domingues et al, Improved estimates of upper-ocean warming and multi-decadal sea-level rise, Nature, 453, pp 1090-3, 19 June 2008 has OHC going up by at most 0.6 W/m2 from 1961-2003 if you include the somewhat iffy deep ocean. That’s a thermosteric expansion rate of ~1.5 mm/year. Cazenave et al has the thermosteric rate from 2003-2008 at ~0.3 mm/year. An imbalance of 0.8 W/m2 looks more like an upper limit rather than the expected value. The heat associated with melting land based ice contribution to sea level rise is much smaller.

  105. Espen #119

    “Absolutely! But the forecasting of the actual impact of and vulnerability of the projected amount of warming has so far been the weakest part of the science – at least it seems so to this layman, considering the non-scientific sources used in the IPCC AR4 WGII report.”

    I’d be careful what you wish for, i.e. so-called stronger science coming from the IPCC in the AR5 WGII in this particular area, i.e. Amazongate, Africagate etc. Next time around we’ll be having Peter Cox as an AR5 lead/contributing author/reviewer (not sure which he’ll be as I don’t have time to check just now). Now if I tell you that Peter Cox, when he was with the UK Met Office, was responsible for a paper which predicted ‘Amazon dieback’ and MGST increases in excess of 8 deg. C by the end of the 21st century, then I don’t think the science is necessarily going to be any better in AR5.

    “Amazonian dieback under climate-carbon cycle projections for the 21st century. P.M. Cox, R.A. Betts, M. Collins P. Harris, C. Huntingford and C.D. Jones “, March 2003

    In fact I fully expect it to get considerably worse when alarmists like Peter Cox are involved in the process. Now the so-called dynamic global vegetation model he created and used in that study was called ‘Triffid’. What does that tell you about him? Was he ‘extracting the Michael’ when he decided to call his model that?

    http://en.wikipedia.org/wiki/Dynamic_global_vegetation_model

    If you want to get on in the climate alarmism business then it pays to produce ever more alarming scenarios (ask Myles Allen) that involve models with strong positive feedback mechanisms. In Peter Cox’s case he is a past master at it. His reward for producing the kind of outputs that the IPCC want is that he is now a Professor of ‘Climate System Dynamics’ in the Faculty of Engineering, Computing and Mathematics at Exeter University. And he is but one example, Espen.

    http://www.secam.ex.ac.uk/profile/pmc205

  106. Eduardo #118,

    OK. Let’s do that back-of-envelope calculation. Anny Cazenave and company estimate a temperature-driven sea level rise of ~0.3 mm/year from ocean mass measurements and satellite-measured sea level rise for 2003 to 2008 (independent of any ARGO heat content estimates). How much net radiative imbalance does that suggest for the 5 year period?

    To simplify the calculation, let’s assume that all the imbalance accumulates as heat in the top 100 meters (~10 Kg/cm^2). And let’s calculate expansion based on fresh water rather than salt water (just because I happen to have very good data on fresh water density). If the average temperature of the top 100 meters is 15C, and the temperature were to rise to 16C, how much would that top 100 meters expand?

    height 10 Kg @15C 10009.01813 cm
    height 10 Kg @16C 10010.58118 cm
    expansion 1.563058981 cm/deg
    (based on density values from CRC Handbook of Chemistry and Physics)
    inferred expansion – 0.030 cm/yr (Cazenave, et al)

    The required delta T for 100 meters to expand 0.03cm/yr is 0.019193134 degree/year
    (The actual temperature rise at the ocean surface would of course be much less than this, since more than the top 100 meters is warming; according to Josh Willis at NASA JPL, most historical heat accumulation has been in the top 400 meters).

    The heat to warm 1 cm^2, 100 meters deep by 0.019193134 degree is about:
    0.191931337 Kcal/year

    And the heat to raise one square meter, 100 meters deep is about:
    1919.31337 Kcal/year
    At 1 KWH = 859 Kcal, the thermal imbalance is 2.234357823 KWH/(yr*M^2)
    The number of hours in a year is 8760
    Net Imbalance 0.255063678 watt/M^2

    This is a very long way from a “current imbalance of about 0.8 watt” per square meter.

    If we assume the independent ARGO ocean heat content data (based on four independent analyses) is more accurate, then the current imbalance is probably closer to ~0.315 watt per square meter, but this is still a very long way from 0.8 watt/M^2.

    So, I still do not understand why you so confidently assert the current radiative imbalance is about 0.8 watt/M^2.

    The ocean data clearly say otherwise. These data are in obvious conflict with the ERBE data that K. Trenberth is so worried about. You may be able to offer reasons why you think that satellite altimetry, GRACE, and ARGO data have to be wrong, and ERBE has to be right, and please feel free to do so. But there is no justification I can see for any degree of certainty about a current imbalance of 0.8 watt per square meter.

  107. #15;
    We’re not left just with the accounts of your Norse ancestors. Datable tree stumps emerging from under the edges of retreating glaciers have a certain persuasive power, too! 😉

  108. #34;
    Yep, and the GHG/CO2 glitter turns out to be iron pyrite, AKA Fool’s Gold. Time to move on! 😀

  109. Eduardo,

    It seems you have disappeared. Too bad, we were just reaching the point of discussing actual numbers for radiative imbalance.

    But this does not really surprise me… it has happened several times before in my blog exchanges with climate scientists; whenever the discussion reaches the point of numbers, they disappear. Why will climate scientists not address the real technical issues? I have no idea.

    Forgive me Eduardo, but there is nothing good I can conclude from this.

    Take care,

    Steve

  110. Steve,
    well, my last comment was on August 4th and by August 7th you are already missing my presence. It just happens that I am on holiday, eating paella and playing football instead of hockey. That's why we are the world champions. You are not right that climate scientists are lily-livered: if you wish, instead of discussing the energy imbalance we can fight it out, and the winner is declared to have the correct numbers.
    Now, more seriously... One thing to be corrected in my previous post is that it seems that Dr. Pielke does not accept an imbalance of 0.8 W/m2 for the recent years.
    As I said, my view is that the data in the last few years cannot all be correct. The thermal expansion measured by Argo doesn't really fit with the Jason sea-level data and the GRACE rate of polar melting. So we have to be cautious before accepting any of them. That's why I focused on the more long-term imbalance.
    Anyway, considering the last few years, it does seem that the thermal imbalance could be smaller in the very recent period than the figure of 0.8 W/m2.
    If this lower figure turns out to be robust and remains low in the next few years or next decade, this would indeed indicate that the sensitivity can be lower.
    New robust data lead to new estimations. I would remind you that the ARGO data have been corrected once, and this may also happen to the Jason and GRACE data as well.

    Espen,
    on the quality of IPCC WGII, I would broadly agree

  111. #128, Eduardo,
    Thank you for your reply. You are much younger than me, and I am in no position to fight it out with you… but I would be happy to challenge you to a round of golf; low gross has the right radiative imbalance number.

    There is always going to be some conflict between data sources; in this case, it seems to me that the weight of the data indicates a much smaller radiative imbalance.

    “If this lower figure turns out to be robust and remains low in the next few years or next decade, this would indeed indicate that the sensitivity can be lower.”

    Agreed. Should I still be around in 5 or 10 years, I hope we can return to this issue.

  112. Steve, well I am now in the home province of Severiano Ballesteros 🙂 so I will try to learn a bit.
    Here are my back-of-the-envelope calculations. I have used the following constants for sea water:

    Thermal expansion coefficient at 20C: 2.41 × 10^-4 K^-1
    Heat capacity at 20C: 3.95 × 10^3 J/(kg K)

    With these constants for the average water masses expanding, I get an energy imbalance of
    0.52 W/m2 per mm/year of sea-level rise. This is likely an underestimate, since the average temperature would be lower, the thermal expansion coefficient would therefore be smaller, and the imbalance required for a given expansion would be somewhat higher. I don't have the numbers at hand here.
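    The conversion factor follows from E = ρ·c·Δh/α: the heat per unit area needed to produce a steric rise Δh. A quick numerical check, assuming ρ ≈ 1025 kg/m^3 for sea water (a typical value, not stated above):

```python
# Energy imbalance implied by 1 mm/yr of purely steric sea-level rise:
# E = rho * c * dh / alpha, spread over one year, gives W/m^2 per mm/yr.
rho = 1025.0          # kg/m^3, typical sea-water density (assumed)
c = 3.95e3            # J/(kg K), heat capacity at 20 C (from the comment)
alpha = 2.41e-4       # 1/K, thermal expansion coefficient at 20 C

dh = 1e-3                                   # 1 mm steric rise, in metres
seconds_per_year = 365.25 * 24 * 3600
joules_per_m2 = rho * c * dh / alpha        # heat needed per m^2 of ocean
w_per_m2_per_mm_yr = joules_per_m2 / seconds_per_year

print(round(w_per_m2_per_mm_yr, 2))         # -> 0.53, close to the 0.52 quoted
```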

    I have looked up two sources for the steric contribution of sealevel rise in the last few years:

    Closing the sea level rise budget with altimetry, Argo, and GRACE
    Eric W. Leuliette and Laury Miller:

    Thermal expansion 0.8 +- 0.8 mm/year from Argo
    Total sea-level rise 2.7 +- 1.5 mm/year from Jason

    Sea level budget over 2003–2008: A reevaluation from GRACE space gravimetry,
    satellite altimetry and Argo
    A. Cazenave, K. Dominh, S. Guinehut, E. Berthier, W. Llovel, G. Ramillien, M. Ablain, G. Larnicol

    Thermal expansion 0.31 +- 0.15 mm/year from Argo
    Total sea-level rise 2.5 +- 0.4 mm/year from Jason

    These numbers would give us a ‘central estimate’ of:

    0.41 W/m2
    and
    0.14 W/m2

    To combine the uncertainties of both into a single estimate could be tricky, as we really don't know what assumptions went into the individual uncertainties, but let us write down the ‘fringes’ of both estimates:
    0.81 W/m2
    and
    0.07 W/m2
    And the arithmetic mean of the central estimates would be 0.28 W/m2, perhaps a bit higher due to the lower average water temperatures; let us take 0.3 W/m2.

    Now, to roughly estimate the sensitivity, we assume that the global temperature has increased by 0.8 K since the preindustrial period. Over this period the total forcings would have been (here I am using the high-ish numbers, to get a lower bound on the sensitivity):

    CO2: 1.7 W/m2
    Methane: 0.4 W/m2
    Ozone: 0.5 W/m2
    Aerosols: -0.5 W/m2 (here I use the low-ish negative forcing provided by Myhre, see my previous comment)
    Solar: 0.20 W/m2
    Total: 2.30 W/m2

    of which 2.30 - 0.30 = 2.0 W/m2 has already been ‘realized’ (total minus imbalance)
    The sensitivity to 2xCO2 would then be (0.8/2.0) x 3.8 = 1.52 K

    Now it is clear that this estimate depends strongly on the values assumed for the solar and aerosol forcings. But I think it is difficult to get much lower numbers. Higher numbers would be plausible, or at least could not be ruled out.
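    The sensitivity arithmetic above is simple enough to script. A sketch using the forcing values listed, with 3.8 W/m2 taken as the forcing per CO2 doubling as in the comment:

```python
# Equilibrium sensitivity from observed warming, forcing, and imbalance:
# S = dT / (F_total - imbalance) * F_2xCO2
forcings = {
    "CO2": 1.7, "methane": 0.4, "ozone": 0.5,
    "aerosols": -0.5, "solar": 0.20,          # W/m^2, values from the comment
}
f_total = sum(forcings.values())              # 2.30 W/m^2
imbalance = 0.30                              # W/m^2, ocean heat uptake
dT = 0.8                                      # K since preindustrial
f_2xco2 = 3.8                                 # W/m^2 per CO2 doubling

realized = f_total - imbalance                # 2.0 W/m^2 already 'realized'
sensitivity = dT / realized * f_2xco2
print(round(sensitivity, 2))                  # -> 1.52 K per doubling
```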

    On the sea-level data, I agree with Jeff that this is a really interesting topic. The Jason data indicate a much higher rise in the Southern Hemisphere, whereas sea level in the Northern Hemisphere has risen only very slightly in the last few years. And why is melting now much more important than thermal expansion?

  113. Except that it’s probably 0.6 (of which half occurred BEFORE the surge in CO2 emissions) and the 1.7 w/m2 is probably high by a factor of about 7.

    Which reduces your result to about 0.11 K.

  114. # 131 Eduardo,

    Thank you for your back-of-envelope calculations. One minor point: the ~0.3 mm per year value from Cazenave et al is the inferred thermal expansion based on the difference between satellite sea level and GRACE ocean mass. Their ARGO-based value was 0.37 mm per year of thermal expansion.

    I had already done the same estimate of climate sensitivity as you did, but with the following differences:

    1) I used -0.3 watt for the aerosol effect because Myhre said “This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m−2, with a best estimate of –0.3 W m−2.”

    2) I did not assume any solar intensity increase (too uncertain for my taste)

    3) I included an estimate for N2O increases of 0.33 watt per square meter.

    4) I included estimated forcing from halocarbons of 0.34 watt per square meter

    5) I used tropospheric ozone of 0.34

    So in total:

    CO2 – 1.7
    methane – 0.4
    ozone – 0.34
    N2O – 0.33
    halocarbons – 0.34
    aerosols – (-0.3)

    Total – 2.82 watts per square meter

    Less what is accumulated in the ocean (-0.3), leaves 2.52 watts per M^2

    Sensitivity then works out to (0.8/2.52)*3.8 = 1.21 degrees per doubling, with a little under 0.1C currently “in the pipeline”. The most doubtful number I used was for N2O (0.33) which might be as low as 0.2, depending on who you believe.

    Using your value for aerosols (-0.5) and a value of 0.2 for N2O, I get a sensitivity of (0.8/2.19)*3.8 = 1.39 per doubling, with ~0.18C currently “in the pipeline”.
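    The two variants above differ only in the forcing table, so a compact comparison is easy (the components listed sum to within ~0.01 W/m2 of the totals quoted, and the rounded sensitivities come out the same):

```python
# Sensitivity under the two forcing sets discussed in this comment.
def sensitivity(forcings, imbalance=0.3, dT=0.8, f_2xco2=3.8):
    """S = dT / (sum of forcings - ocean imbalance) * forcing per doubling."""
    realized = sum(forcings.values()) - imbalance
    return dT / realized * f_2xco2

case1 = {"CO2": 1.7, "methane": 0.4, "ozone": 0.34,
         "N2O": 0.33, "halocarbons": 0.34, "aerosols": -0.3}
case2 = dict(case1, N2O=0.2, aerosols=-0.5)   # Eduardo's aerosols, low N2O

print(round(sensitivity(case1), 2))   # -> 1.21 K per doubling
print(round(sensitivity(case2), 2))   # -> 1.39 K per doubling
```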

    I agree that the biggest uncertainty is heat accumulation in the deep ocean, since the coefficient of expansion is much lower at 0C-5C. But there does not seem to be a lot of credible data to suggest there is substantial heat accumulation in the very deep ocean. I have for a long time believed that the true sensitivity is likely to be relatively low. I am open to new information that suggests otherwise, but so far I have not seen it. The IPCC best estimate of 2.85C per doubling just does not appear to me to be reasonable. For this value to be correct, > 1.8 watts per square meter have to currently be accounted for by aerosol effects and ocean accumulation, and as far as I am aware, there is no credible data which supports this.

    As to “why is melting now much more important than thermal expansion?”, the rate of melting ought to be proportional (more or less) to the rise in temperature above the temperature where there would be neither net melting nor ice accumulation. If we guess that temperature is ~1 C (global average) colder than today, then most of the increase in melt contribution would have to have taken place in the last 40 years or so. GHGs increase warming much more at high latitudes than at low, so GHG-driven warming should have a greater influence on recent melt rate than warming due to natural variability. The characteristic response profile of sea level to rising temperatures ought to be a relatively fast thermal expansion plus a gradually increasing melt contribution:

    rate of rise ~ k1 * dT/dt + k2 * (T – Teq)
    {T is temperature, t is time, Teq is the no-melt temperature}

    Since total dT/dt for the 2003 to 2008 period was near zero, melt pretty much had to dominate thermal expansion, just as Cazenave et al found. The above does not account for lag in heat absorption by the ocean, which is probably why there was still some thermal expansion between 2003 and 2008, in spite of no net surface temperature change.
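    The two-term response profile above can be sketched numerically. The values of k1, k2, and Teq below are purely illustrative, chosen only to show the shape of the two terms, not fitted to any data:

```python
# Sea-level rise rate as fast thermal expansion plus slow melt:
# rate ~ k1 * dT/dt + k2 * (T - Teq)
def rise_rate(dT_dt, T, Teq=-1.0, k1=1.5, k2=2.0):
    """mm/yr; k1 in mm per K, k2 in mm/yr per K -- illustrative values only."""
    thermal = k1 * dT_dt          # responds to the *rate* of warming
    melt = k2 * (T - Teq)         # responds to warmth above the no-melt point
    return thermal + melt

# 2003-2008: surface temperature roughly flat (dT/dt ~ 0), so the melt term
# dominates, consistent with what Cazenave et al. found.
print(rise_rate(dT_dt=0.0, T=0.0))   # -> 2.0 (all melt, no expansion term)
```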

  115. I’m curious about how there can be “some thermal expansion between 2003 and 2008, in spite of no net surface temperature change.” The lag is not in thermal expansion, which is an instantaneous physics process. It can only be in transport of heat from one locale to another. If higher surface temps are not available to drive deeper heating, what lag are you referring to?

  116. #134,
    If the ocean is gradually accumulating heat, there would be expansion, in spite of a constant surface temperature. This can happen because the temperature profile of the ocean is warmer near the surface and colder (MUCH colder) at increasing depths. The lag is not in the expansion (which is an instantaneous effect of temperature, of course), it is in the approach to equilibrium of water that is not at the surface. You can still have net heat flux into or out of the ocean surface, even at constant surface temperature, during the transient period following a change in surface temperature. In other words, the underlying layers could still be warming, even while the surface temperature is fixed.

    For example, suppose we were to fix the surface temperature of the ocean at some slightly higher level than today (say +0.1C) by adding whatever heat is needed to maintain that new (slightly higher) surface temperature. Initially we would have to add heat at a high rate to warm the near-surface layers, which mix rapidly. After that, there would be a gradual fall off (most likely an exponential decay function) in the amount of heat needed to maintain the same surface temperature, as ever deeper and slower reacting layers were influenced by the slightly warmer surface.

    So even though the average ocean surface temperature did not change significantly between 2003 and 2008, there had been a fairly rapid (for the ocean) increase in average surface temperature between ~1980 and ~2000, and heat would still be expected to be flowing into the ocean (at a gradually declining rate) as a result of that earlier warming.
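    The idea of a declining heat flux at constant surface temperature can be illustrated with a crude layered-ocean sketch. The layer count, mixing coefficient, and surface step below are all made up for illustration; the point is only the shape of the decline:

```python
# Toy diffusive ocean column: pin the surface 0.1 K warmer and watch the
# heat flux into the column decline even though the surface stays fixed.
N, KAPPA = 50, 0.2                  # layers and mixing coefficient (illustrative)
T = [0.0] * N                       # temperature anomaly per layer
T[0] = 0.1                          # surface pinned at +0.1 K

fluxes = []
for step in range(500):
    flux_in = KAPPA * (T[0] - T[1])            # heat entering from the surface
    fluxes.append(flux_in)
    newT = T[:]
    for i in range(1, N - 1):                  # diffuse heat downward
        newT[i] = T[i] + KAPPA * (T[i-1] - 2*T[i] + T[i+1])
    T = newT                                   # surface and bottom stay fixed

print(fluxes[0] > fluxes[100] > fluxes[400])   # -> True: flux declines over time
```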

  117. #135;
    OK, sort of. The net heat flow into the lower depths would have to be made up in real time at the surface, either from the atmosphere or excess of solar influx over all cooling effects (evaporation, blackbody radiation, cool precipitation, etc.). That is, if the surface temp is constant, its heat losses to the depths must be made up 1:1 in real time. Which means that holding it constant in effect makes it transparent, and it can be disregarded for the purposes of analysing the amount and source of heat entering the lower layers.

    No?

  118. Added to #136;
    There is no reason to think the heat moving to the depths would be steadily declining, much less exponentially, since the temperature spread is always large. I think you’d have to look at things like the thermocline and so on to get much more than a WAG about the details of the heat transfer.

  119. #135;
    Thinking more about your description, it seems to amount to saying a (one-time) thermal wave is moving down through the water column after the surface temp has stabilized. I guess something like that is possible, but that it would take years to do so seems unreasonable.

  120. #136, 137, 138,
    Brian,

    Well, clearly you are beginning to understand the factors that are involved.

    The one-time “thermal wave” you talk about is something that is happening all the time, albeit usually at a smaller scale.

    That it takes years to propagate is not unreasonable; it is perfectly natural, considering the scale in distance and time. Consider how temperature changes propagate down into the depths of the earth’s surface, as measured by boreholes. Yes, temperature changes are terribly dampened by the enormous thermal masses involved. Yet temperature changes MUST be preserved (at least in theory) by the gradual propagation of heat through the solid surface of the Earth.

    The ocean is much the same, but a bit more (shall we say) ‘fluid’ in its reactions to changes in surface temperature.

    Welcome to the reasoned analysis of ‘global warming’. Nothing is simple, but rational analysis should always rule, as it always has in science, in every field.

  121. Hm, a bit condescending, are we? ;P

    Anyhow, this is trying to drift off the point, which is that any heat that appears in the record at time T must have been somewhere at T-1, T-2, etc. So either a “surge” passed through the transparent-steady-temp surface water, or there was a corresponding rise in that layer’s temperature to initiate the sequence. There ain’t no such thing as free heat, so I’m trying to pin down where you posit that it came from. However long the lag before it shows up in the (necessarily selective) deep sea measurements.

  122. P.S.
    Speaking of lags, the “rational analysis should always rule” has often been subject to longish detours. The “should” part is not automatic; I think what this whole climate-pseudo-science battle is about is trying to enforce the rule of rational analysis before quarter-baked theories and conclusions are used to justify massive disruptions and derailing of human livelihoods, economies, and “industrial society”.

    This time it’s pretty much for all the marbles.

  123. #140,
    Not intended to be condescending. Reality rules, we simply describe,… well, or not so well.

    Forgive me. I have spent a lifetime working as a scientist; if I seem short in my replies, it is because I have suffered many fools, and my remaining time is limited.

    Steve

  124. De Witt Payne said
    “then the delayed climate sensitivity is going to be a lot lower than modeled because the time constant for the exchange with the deep ocean will have to be a lot smaller than thought.”

    Am I reading you wrongly? Suppose the deep ocean floor is undergoing a higher than usual heating action. This will cause thermal expansion that will be seen at the surface with no time lag. Is this what you meant, expressed another way?

    With thermal expansion of the oceans, assuming a constant ‘pot’ to contain them, sea-level rise will only happen if the oceans as a whole become hotter. (There are slight exceptions to this with salt water near zero Celsius.) Because we have very little data on deep temperature changes, we have no justification in saying that the top x00 metres are heating and therefore the sea level will rise. We have to know the whole ocean heat pattern.
