the Air Vent

Because the world needs another opinion

Uncertainty be damned. When in doubt – GO BIG

Posted by Jeff Id on May 5, 2010

I’m glad that climate science doesn’t let things like physics, common sense or reason get in the way of its models.  They have taken the ridiculous uncertainty of modeling future climate and expanded the range all the way to 25 F.  Imagine Canada with Florida temperatures, ice hockey gone with the ice caps, evaporated in the sweltering environment.  Proof that ever-bigger budgets produce ever-bigger computers which require ever-bigger results.

Don’t let climategate stop you boys, it’s doom I say dooom.

Global Warming: Future Temperatures Could Exceed Livable Limits, Researchers Find

ScienceDaily (May 4, 2010) — Reasonable worst-case scenarios for global warming could lead to deadly temperatures for humans in coming centuries, according to research findings from Purdue University and the University of New South Wales, Australia.

Researchers for the first time have calculated the highest tolerable “wet-bulb” temperature and found that this temperature could be exceeded for the first time in human history in future climate scenarios if greenhouse gas emissions continue at their current rate.

Wet-bulb temperature is equivalent to what is felt when wet skin is exposed to moving air. It includes temperature and atmospheric humidity and is measured by covering a standard thermometer bulb with a wetted cloth and fully ventilating it.
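For readers who want numbers: wet-bulb temperature can be estimated from ordinary (dry-bulb) temperature and relative humidity. A minimal sketch using Stull’s (2011) empirical fit — my illustration, not part of the article:

```python
import math

def wet_bulb(t_c, rh):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    t_c (deg C) and relative humidity rh (%), via Stull's (2011) empirical
    fit (valid roughly for RH 5-99% and T -20 to 50 deg C)."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t_c + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

print(round(wet_bulb(20.0, 50.0), 1))  # prints 13.7
```

Note how far wet-bulb sits below dry-bulb at ordinary humidity: 20 °C air at 50% RH has a wet-bulb of only about 13.7 °C.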


While the Intergovernmental Panel on Climate Change’s central estimates of business-as-usual warming by 2100 are seven degrees Fahrenheit, eventual warming of 25 degrees is feasible, Huber said.

“We found that a warming of 12 degrees Fahrenheit would cause some areas of the world to surpass the wet-bulb temperature limit, and a 21-degree warming would put half of the world’s population in an uninhabitable environment,” Huber said. “When it comes to evaluating the risk of carbon emissions, such worst-case scenarios need to be taken into account. It’s the difference between a game of roulette and playing Russian roulette with a pistol. Sometimes the stakes are too high, even if there is only a small chance of losing.”

Yup, roulette, we need to shoot ourselves immediately so we can avoid the potential for…. What was that problem again?


60 Responses to “Uncertainty be damned. When in doubt – GO BIG”

  1. Frank K. said

    Yet another plea by the climate industry for money and attention.

    Please note that part of Dr. Huber’s NSF research $$$ awards $$$ contain the following note:

    “This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5).”

  2. timetochooseagain said

    Seriously, it’s called air conditioning.

    And what about all the times when a deadly cold temperature won’t happen, eh?

  3. Pat Frank said

    As usual, no physical uncertainty limits. Modern climate science as revealed truth.

  4. j ferguson said

    For the writers and publishers of such studies, there seems to be no risk whatever – no chance of ridicule, no chance of losing their grants, no loss of circulation, no risk of loss of respect or credibility.

    Are we, collectively, really that stupid?

  5. Gary said

    #4 – Two keen-eyed writers from different centuries (Twain and Vonnegut) thought so.

  6. MD Jackson said

    The only “Wet-Bulb” is the intellect that came up with this nonsense.

  7. Steve Fitzpatrick said

    Stupid, just incredibly stupid. An all-out attempt to instill fear; no science, just politics.

    And we taxpayers are supporting this garbage? They should all be fired.

  8. T. Paul said

    It’s not the heat, it’s the humidity.

    Seriously. They’re discussing WET BULB temperatures.

    Of course, that’s meaningless unless compared to dry bulb temperatures. But, hey, facts, schmacts, who needs ’em?

  9. Crusty the Clown said

    In the space of one hundred and seventy-six years the Lower Mississippi has shortened itself two hundred and forty-two miles. That is an average of a trifle over one mile and a third per year. Therefore, any calm person, who is not blind or idiotic, can see that in the Old Oolitic Silurian Period, just a million years ago next November, the Lower Mississippi River was upwards of one million three hundred thousand miles long, and stuck out over the Gulf of Mexico like a fishing-rod. And by the same token any person can see that seven hundred and forty-two years from now the Lower Mississippi will be only a mile and three-quarters long, and Cairo and New Orleans will have joined their streets together, and be plodding comfortably along under a single mayor and a mutual board of aldermen. There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.

    – Life on the Mississippi by Mark Twain

  10. co2fan said

    Proceedings of the National Academy of Sciences.

    Hmmm. Peer reviewed. Probably by other “Boilermakers”.*

    *Ref: Purdue University nick name

  11. Harry said

    If this research does show anything, it must be the following:
    1. Using models you can get any situation you could dream of. The same model could be used to show that CO2 would fall as snow, as happens on Mars in winter.
    2. This kind of crap is now even accepted for publication in PNAS. It is the most striking evidence of how corrupt the peer-review process has become.

    It is far beyond my comprehension how this paper could have been approved for publication by PNAS. Did they want to ruin their standing and become a member of the Science and Nature league? Or is it more of a political statement about how far they are willing to go to promote AGW?

    I will reconsider, next time I want to publish, whether PNAS is of a sufficiently high standard. This is incredible; it comes close to scientific Harakiri.

  12. j ferguson said

    High school friend got ME at Purdue in 1963. Went to Cal Tech for a Doctorate. 3 years into his dissertation (or whatever they call it) one of the papers on which his work was premised was found – not by him – to contain errors which invalidated its conclusions. He would need to start over.

    His committee felt that the error in the paper, although not at all obvious, was within my friend’s competence to discover. They thought he should have.

    So he was sent on his way with an MS with best wishes.

    He then went to Stanford, got an MBA and sought his fortune in the big city with great success.

    If PhD committees continue to expect their candidates to at least audit the stuff on which they base their dissertations, maybe some of this junk we’re looking at will finally get the reviews it deserves.

  13. michel said

    With this, it’s beyond doubt; the only question is the cause.

    Americans, perhaps all Westerners, are losing the ability to think logically and in a connected fashion. I don’t know whether it’s something in the water, the air, or perhaps their huge consumption of legal and illegal psychoactive drugs, including SSRIs. But you can see from this publication that it has happened, is happening, and can only get worse. The giveaway is that the authors evidently think the following is a valid argument:

    “When it comes to evaluating the risk of carbon emissions, such worst-case scenarios need to be taken into account. It’s the difference between a game of roulette and playing Russian roulette with a pistol. Sometimes the stakes are too high, even if there is only a small chance of losing.”

    Let’s deconstruct this. We are once more joining Pascal for a small sporting wager. Let’s put it in terms that Pascal would have used. Worst-case scenarios need to be taken into account. It’s the difference between a game of roulette and playing Russian roulette with a pistol. Sometimes the stakes are too high, even if there is only a small chance of losing. If what is at stake is our immortal souls, and if the threat is eternal damnation, then clearly, even if there is only a small chance of it, we should take the action necessary to avoid it. Eternal damnation is really rather unpleasant.

    You see the problem? Not yet, probably.

    Well, let us suppose that there is only a small chance of a dreadful ice age. And that we could forestall that by increasing CO2 emissions. Well surely, given that there is a difference between Russian roulette and conventional roulette, we should increase them.

    Or perhaps we should all stand on our heads for 10 minutes every morning? Yes, I know there is only a small chance that it would help, but if civilization is at stake we have to think of the children. Or perhaps again we should nuke someone, anyone, because if there is the smallest chance that we could save civilization that way, why pass it up?

    Wait a second. I am not sure we can all stand on our heads, convert to Christianity, and nuke a few people, and emit more CO2, all at the same time. Oh dear, do you mean we have to choose which one to do? I am thinking as hard as I can about the children, but it doesn’t seem to be helping me decide which to do.

    I have some serious advice for all who find themselves in this dire intellectual dilemma. Stop taking Prozac and similar drugs. Drink a lot of water, with adequate levels of electrolytes dissolved in it, otherwise your hallucinations will worsen. Read some basic material on logic, and do some exercises. Come back in three months if you still think the argument in the quotation is valid. The only remedy in that case will be incarceration, before you damage yourself or others.

  14. j ferguson said

    That should be BSME above.

    I have another (true) story about disposition of lousy research from my 1964 stint in the Behavioral Research business if anyone is interested.

  15. Dagfinn said

    This looks like an admission that 12 degrees F is not scary enough, so they have to resort to an “eventual” even greater temperature rise. By when? 2200? And what estimate of fossil fuel reserves would be needed to achieve the required continued rise in CO2 levels?

  16. Harry said

    I have to apologise.

    Being allowed to commit Harakiri is considered to be an honour.

    This looks more like an act of desperation, nothing to do with honour.

  17. stumpy said

    Doesn’t this just clearly demonstrate how poor the climate models are? Surely, with all the increase in heat and humidity, we would see more cloud cover, evaporation, rainfall and storms, all of which would quickly offset the heat and humidity. The earth deals with it on a day-to-day basis; why not on a longer scale in the models?

    To quote the IPCC: in past warmer periods, the northern and southern extremities warmed the most, whilst the equatorial areas cooled slightly. Thus the coldest areas enjoy a more beneficial climate, whilst the hottest areas enjoy more cloud cover / rainfall / thunderstorms and a slightly cooler climate. That’s from the Paleo section of AR4. Do their models replicate this, which we know happened? Nope…

    They have no skill. I wish people would stop using them to do obscure “what if” scenarios and then presenting the results as genuine science with some level of confidence. It’s a waste of taxpayer dollars that could be better spent on some useful research!!! People are currently dying all over the world for a whole number of REAL reasons; why not try to solve our current problems before inventing new ones?

  18. Kon Dealer said

    First I had to check the calendar. Nope, it’s not April 1.

    I still laughed so much I split my colostomy bag.

    I’m going to sue them for the cleaning bill.

  19. Ron Broberg said

    I’m missing the technical part of “the technical public.”

    Anyone have a technical critique?
    I’d love to read it.

  20. TGSG said

    Has to be a spoof? Right?

  21. RomanM said

    Re: Ron Broberg (May 5 19:26),

    The paper hasn’t been published yet, but the hype seems to have started anyway. Two quotes from the sciencedaily article:

    The study did not provide new evaluations of the likelihood of future climate scenarios, but explored the impacts of warming.

    “We found that a warming of 12 degrees Fahrenheit would cause some areas of the world to surpass the wet-bulb temperature limit, and a 21-degree warming would put half of the world’s population in an uninhabitable environment,” Huber said. “When it comes to evaluating the risk of carbon emissions, such worst-case scenarios need to be taken into account. It’s the difference between a game of roulette and playing Russian roulette with a pistol. Sometimes the stakes are too high, even if there is only a small chance of losing.”

    Real technical.

    Translation: “We didn’t worry about whether this is realistic, but, man, we found it really scary!”

    Sure sounds like real climate science, doesn’t it? This is a prime example of exactly the type of insane exaggeration that creates skeptics.

  22. Brian H said

    Gee, and a 100°F increase would be even worse, I betcha! We’re all gonna die diddly-eye-die, oh noosse!

  23. Ron Broberg said

    The paper hasn’t been published yet, but the hype seems to have started anyway. Two quotes from the sciencedaily article:

    Sort of published … if you want to pay the 2 bits for the early edition.

    Adaptability Limit to Climate Change due to Heat Stress
    Steven C. Sherwood and Matthew Huber
    http://www.pnas.org/content/early/2010/04/26/0913352107.abstract

    Despite the uncertainty in future climate-change impacts, it is often assumed that humans would be able to adapt to any possible warming. Here we argue that heat stress imposes a robust upper limit to such adaptation. Peak heat stress, quantified by the wet-bulb temperature TW, is surprisingly similar across diverse climates today. TW never exceeds 31 °C. Any exceedence of 35 °C for extended periods should induce hyperthermia in humans and other mammals, as dissipation of metabolic heat becomes impossible. While this never happens now, it would begin to occur with global-mean warming of about 7 °C, calling the habitability of some regions into question. With 11–12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed. Eventual warmings of 12 °C are possible from fossil fuel burning. One implication is that recent estimates of the costs of unmitigated climate change are too low unless the range of possible warming can somehow be narrowed. Heat stress also may help explain trends in the mammalian fossil record.

    Such a scenario sort of reminds me a bit of the PETM – not a close analogy but a loose example – where a fast, steep rise may have been a stressor which spurred mammalian evolution.

    Climate directly influences Eocene mammal faunal dynamics in North America
    Michael O. Woodburne, Gregg F. Gunnell and Richard K. Stucky

    The modern effect of climate on plants and animals is well documented. Some have cautioned against assigning climate a direct role in Cenozoic land mammal faunal changes. We illustrate 3 episodes of significant mammalian reorganization in the Eocene of North America that are considered direct responses to dramatic climatic events. The first episode occurred during the Paleocene–Eocene Thermal Maximum (PETM), beginning the Eocene (55.8 Ma), and earliest Wasatchian North American Land Mammal Age (NALMA). The PETM documents a short (<170 k.y.) global temperature increase of ~5 °C and a substantial increase in first appearances of mammals traced to climate-induced immigration. A 4-m.y. period of climatic and evolutionary stasis then ensued. The second climate episode, the late early Eocene Climatic Optimum (EECO, 53–50 Ma), is marked by a temperature increase to the highest prolonged Cenozoic ocean temperature and a similarly distinctive continental interior mean annual temperature (MAT) of 23 °C. This MAT increase [and of mean annual precipitation (MAP) to 150 cm/y] promoted a major increase in floral diversity and habitat complexity under temporally unique, moist, paratropical conditions. Subsequent climatic deterioration in a third interval, from 50 to 47 Ma, resulted in major faunal diversity loss at both continental and local scales. In this Bridgerian Crash, relative abundance shifted from very diverse, evenly represented, communities to those dominated by the condylarth Hyopsodus. Rather than being “optimum,” the EECO began the greatest episode of faunal turnover of the first 15 m.y. of the Cenozoic.

    So does it sound extreme? Sure.
    Am I going to dismiss it out of hand before I read it? Nope.

  24. Alan McIntire said

    12 F is 6 2/3 C. A doubling of CO2 would theoretically raise temps by about 1.2 C. 6.67/1.2 = 5.558 doublings. So far, we’ve managed .305/.693 = 0.44 doublings. 5.558/.44 is over 12. 2^12 is 4096, so we’ll manage that 6.67 C increase when we’ve increased the total CO2 we’ve put into the atmosphere by a factor of 4096 – and that’s assuming there are no negative feedbacks like CO2 being washed out of the atmosphere by increased rainfall.
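The arithmetic in the comment above can be checked in a few lines (a sketch under the commenter’s own assumptions: 1.2 C per doubling is his assumed no-feedback sensitivity, and 0.305 is taken, as in the comment, to be the natural log of the concentration ratio so far):

```python
import math

sensitivity = 1.2                        # deg C per CO2 doubling (commenter's assumption)
target_c = 12.0 * 5.0 / 9.0              # 12 F expressed in deg C, about 6.67
doublings_needed = target_c / sensitivity          # about 5.56
doublings_so_far = 0.305 / math.log(2)             # about 0.44
print(doublings_needed, doublings_so_far,
      doublings_needed / doublings_so_far)         # ratio is a bit over 12
```

The first two figures match the comment; whether the final 2^12 step is the right way to turn a ratio of doubling counts into an emissions factor is a separate question.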

  25. Ron Broberg said

    A doubling of CO2 would theoretically raise temps by about 1.2 C.

    You seem pretty certain of that response.
    No chance at all (0%) that a doubling would lead to 1.5-4.5C?
    No chance at all (0%) that it is less than 1C?

    Ultimately, that seems to be where this paper is positioning itself … the potential costs of the upper limits of the improbable (but not impossible) response. That’s part of the risk assessment – kind of like costing the possibility of an oil well blowout. Even unlikely events play a part in the cost analysis.

  26. RomanM said

    Re: Ron Broberg (May 5 21:21),

    (Bold mine)

    Such a scenario sort of reminds me a bit of the PETM – not a close analogy but a loose example – where fast steep rise may have been stressor which spurred mammalian evolution.

    The PETM documents a short (<170 k.y.) global temperature increase of ~5 °C and a substantial increase in first appearances of mammals traced to climate-induced immigration.

    Yes, a 5 C (9 F) rise over close to 170,000 years does bear a striking resemblance to an off-the-top-of-the-head tossed-out figure of 21 F … if one has lost one’s grip on reality.

  27. Ron Broberg said

    Yes, a 5 C (9 F) rise over close to 170,000 years does bear a striking resemblance to an off-the-top-of-the-head tossed-out figure of 21 F … if one has lost one’s grip on reality.

    sort of reminds me a bit of the PETM – not a close analogy but a loose example

    Is this typical around here? Just reword stuff so you can then display your sharp incisors of scorn against a strawman?

    Did you just make that part about the 21F being off-the-top-of-the-head as well? Or do you have some basis in reality for that claim?

  28. Tim said

    #27 Ron Broberg

    There are many ‘plausible’ disaster scenarios that could wipe out humans. A giant asteroid. A superbug. An alien invasion. I don’t see a 12 degC rise in 100 years as any more probable than any of those. At some point you just have to shrug your shoulders and move on with your life.

    The problem with these kinds of scenarios is that they are being used to manipulate people into supporting policies that will do nothing other than make a few people very rich while impoverishing the rest.

  29. Robert Austin said

    I guess we don’t have to worry about the next ice age if man has this much climate-change clout.

  30. gallopingcamel said

    Robert Austin (#29),
    Thinking about the REALLY big picture one has to love “Skepticalscience”:
    http://www.skepticalscience.com/upcoming-ice-age-postponed-indefinitely.html

    Take a look at Archer, 2005. If you were to take this stuff seriously it would be your patriotic duty to maximise your carbon footprint. Imagine governments encouraging their serfs to buy bigger SUVs with lower gas mileage to postpone the next Ice Age indefinitely.

    Now that is an initiative I could commit to with enthusiasm!

  31. gallopingcamel said

    Ron Broberg, (#25 & #27),
    The “experts” have published papers suggesting numbers ranging from 0.5 to 5.4 degrees Celsius/doubling of CO2. This is similar to the speculations on the mass of the Higgs boson. There are many well reasoned theories but thus far nobody can say which one is correct.

    Over time, experimental science will more accurately quantify the radiative forcing due to CO2 in the atmosphere. Even if it turns out that the correct number is 1.2 degrees C +/- 0.1 per doubling of CO2, that will not be enough to support realistic predictions of global temperatures.

    There still remains the question of feedback. There are many interacting factors that are not well understood, such as rising temperatures causing more evaporation at low latitudes, producing water vapour and clouds. Do these factors reduce or increase the warming due to radiative forcing? If so, by what amount?

  32. Espen said

    This article seems to be in conflict with articles comparing a possible warmer future with conditions during the Pliocene (see e.g. http://rsta.royalsocietypublishing.org/content/367/1886/189.abstract). If conditions during the mid-Pliocene are a reliable guideline, we can look forward to fewer deserts and, in general, a much friendlier world. There will be almost no warming close to the equator; almost all of it will happen in current temperate and arctic climates. We have nothing to fear if it gets warmer; the scary future is the white and grey one a few degrees colder, not the possibly friendly green one a couple of degrees warmer.

  33. Patagon said

    Ron Broberg said

    May 5, 2010 at 7:26 pm
    I’m missing the technical part of “the technical public.”

    Anyone have a technical critique?
    I’d love to read it.

    This is a very good example of PlayStation science, or how to produce papers by playing with electronic gadgets without even looking at their meaning.

    A prerequisite for excessive warming in higher CO2 concentrations is that relative humidity (RH) remains constant, otherwise CO2 alone cannot produce enough radiative forcing. Constant RH implies higher water vapor in the atmosphere.
    Thus, let’s check some of the “non-livable” regions of the future, like Timbuktu in Mali, in the Saharan desert, Aswan in Egypt, or Central Australia. In these regions wet bulb maximum temperature would reach 40°C. Current mean relative humidity is 25-35% (these are mean values; the actual value is lower when they reach the maximum recorded temperatures). The important thing here is that the lower the RH, the higher the actual air temperature must be to reach a given value of wet bulb temperature. Wet bulb T is defined as “the temperature an air parcel would have if cooled adiabatically to saturation at constant pressure by evaporation of water into it, all latent heat being supplied by the parcel.” If we have to cool the actual air parcel and it has low RH, we must cool it a lot to reach saturation. That means that for a very high wet bulb temperature with low RH, the actual air T must be extremely high.

    Well, a simple calculation shows that to reach a 40°C wet bulb temperature in Timbuktu or Aswan with 35% RH, the air temperature would have to be about 58°C, that is, 18°C higher than the actual maximum values. Another interesting thing is that in that situation there would be condensation every night, following the current daily temperature fluctuation. The amount of water from that condensation would transform the Sahara into a very nice garden. Remember that the same RH at higher temperatures means much more water vapour content in the air.
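The ~58 °C figure can be roughly reproduced by inverting the standard psychrometric relation between wet-bulb and dry-bulb temperature. This is my sketch, using Bolton’s saturation-vapour-pressure fit and a textbook psychrometer coefficient, not the commenter’s (or the paper’s) actual method:

```python
import math

def svp(t_c):
    """Saturation vapour pressure (hPa), Bolton (1980) fit."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def dry_bulb_for(tw_c, rh, p_hpa=1013.25):
    """Dry-bulb temperature (deg C) needed to give wet-bulb tw_c (deg C)
    at relative humidity rh (%), from the psychrometric relation
    e = es(Tw) - A * p * (T - Tw), solved for T by bisection."""
    A = 6.62e-4  # psychrometer coefficient (1/deg C), textbook value
    def excess(t):
        e = svp(tw_c) - A * p_hpa * (t - tw_c)  # actual vapour pressure
        return e - (rh / 100.0) * svp(t)        # zero when RH is matched
    lo, hi = tw_c, tw_c + 60.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(dry_bulb_for(40.0, 35.0), 1))  # roughly 57-58 deg C
```

With a 40 °C wet-bulb target at 35% RH, the required dry-bulb air temperature comes out in the high 50s °C, in line with the comment’s estimate.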

    Not to mention the problem of finding the source of water for the increased humidity in the Sahara. Or the fact that the poles will warm more than the tropics.

    Can’t wait to see the actual paper and methodology…..

    This reminds me of a very good book by Vit Klemes; it was about hydrology, but it is equally applicable to climatology. The title of the book: “Common sense and other heresies”

    The paper is out already: http://www.pnas.org/content/early/2010/04/26/0913352107 Can anyone point to a pdf copy?

  34. RomanM said

    Re: Ron Broberg (May 5 22:24),

    Frankly, I couldn’t see what relevance your reference to the PETM had to the over-the-top scare tactics in the Purdue press-release version of the paper, so I took a look at the actual paper to see exactly what the authors said. Your characterization of a change of 5 C over 170,000 years as a “fast steep rise” does not compare in any way with what the authors of the paper posit as a possibility:

    If warmings of 10 °C were really to occur in next three centuries, the area of land likely rendered uninhabitable by heat stress would dwarf that affected by rising sea level.

    However, you were right that I misspoke when I stated that the 21 F figure was made up “off-the-top-of-the-head”. In fact it comes from the paper as:

    We conclude that a global-mean warming of roughly 7 °C would create small zones where metabolic heat dissipation would for the first time become impossible, calling into question their suitability for human habitation. A warming of 11–12 °C would expand these zones to encompass most of today’s human population.

    The statement is based on, what else, computer simulations using their collection of cobbled-up worst case scenarios from the first paragraph of the paper:

    Recent studies have highlighted the possibility of large global warmings in the absence of strong mitigation measures, for example the possibility of over 7 °C of warming this century alone (1). Warming will not stop in 2100 if emissions continue. Each doubling of carbon dioxide is expected to produce 1.9–4.5 °C of warming at equilibrium, but this is poorly constrained on the high side (2, 3) and according to one new estimate has a 5% chance of exceeding 7.1 °C per doubling (4). Because combustion of all available fossil fuels could produce 2.75 doublings of CO2 by 2300 (5), even a 4.5 °C sensitivity could eventually produce 12 °C of warming. Degassing of various natural stores of methane and/or CO2 in a warmer climate (6, 7, 8) could increase warming further. Thus while central estimates of business-as-usual warming by 2100 are 3–4 °C, eventual warmings of 10 °C are quite feasible and even 20 °C is theoretically possible (9).

    In particular, the portion “Each doubling of carbon dioxide is expected to produce 1.9–4.5 °C of warming at equilibrium, but this is poorly constrained on the high side (2, 3) and according to one new estimate has a 5% chance of exceeding 7.1 °C per doubling (4)” caught my eye since climate science probabilities are invariably generated in interesting ways.

    Reference 4 is the paper Greenhouse-gas emission targets for limiting global warming to 2 °C by Meinshausen et al. from Nature in 2009. Although the paper does use Bayesian statistics to generate some probability statements, I could not find anything in the paper which would support the contention made by Sherwood and Huber. However, I did find the following:

    We chose a Bayesian approach, but also obtain ‘frequentist’ confidence intervals for climate sensitivity (68% interval, 2.3–4.5 C; 90%, 2.1–7.1 C), which is in approximate agreement with the recent AR4 estimates.

    From a “technical” viewpoint, Ron, if a 90% confidence interval for an unknown parameter is the set of values [a, b], does it then follow that there is a 5% probability that the parameter itself exceeds b?

  35. Frank K. said

    What bugs me most about this is not the research itself (they can make any conclusions they want based on their “models”), but that we’re spending federal money WE DON’T HAVE on this nonsense…

    Where is the “stimulus” here (other than in the researchers egos…)?

  36. Carrick said

    Ron Broberg:

    Did you just make that part about the 21F being off-the-top-of-the-head as well? Or do you have some basis in reality for that claim?

    You go first.

    Tell us a) how 7°C/doubling is anything more than bad science fiction or b) how a 2.75 CO2 doubling factor could actually occur.

    What they are even suggesting is outside the remote range of possibility. That’s without addressing whether their claim about a lack of adaptability is true or not. I suspect that is completely false too.

  37. DeWitt Payne said

    The PETM lasted ~170,000 years. That means the temperature rose and fell back to near the previous level during that time. There is good reason to believe that the rise was quite rapid and that the release rate of methane or CO2 was in the same ballpark as now in both rate and quantity. However, the PETM occurred near the peak of the Eocene climate optimum and the global average temperature before and after the PETM was something on the order of 10 C warmer than now. So yes, it’s indeed comparable.

  38. Ron Broberg said

    @Patagon: Thank you. Best post in this thread, IMO.

    RomanM: From a “technical” viewpoint, Ron, if a 90% confidence interval for an unknown parameter is the set of values [a, b], does it then follow that there is a 5% probability that the parameter itself exceeds b?

    If you’re ‘testing’ my statistical acumen – then let me be blunt: I don’t know diddly squat. But I’m willing to learn. That’s why I asked for a technical critique – and there is a hell of a lot more _information_ in this thread following that request than preceding it. So thank you.

    But let me tender a guess, so that you can savage it and I might learn something in the process. The answer is no. There is a 10% probability that it lies outside the range [a,b] – but there is not enough information in your question to assume that the outside values are equally distributed on either side of the 90% CI interval.

    Carrick: Tell us a) how 7°C/doubling is anything more than bad science fiction or b) how a 2.75 CO2 doubling factor could actually occur.

    b) depends largely on burn rates of fossil fuels – a number I don’t have off the top of my head. But this is a calculation I can do – at least at the back-of-the-envelope level. I’ll get back to you on that one if someone doesn’t beat me to it. How did reference [5] reach that figure?

    And y’all, I appreciate the shift to discussing stuff in the paper. That’s the discussion in which I’ll learn stuff.

  39. RomanM said

    Re: Ron Broberg (May 6 11:16),

    The answer is no. There is a 10% probability that it lies outside the range [a,b] – but there is not enough information in your question to assume that the outside values are equally distributed on either side of the 90% CI interval.

    I am not going to savage you. The answer is indeed no. However, it is not because of the reason you suggest.

    A little statistics background: A confidence interval is, as you already understand, a guess for a plausible range of values for an unknown population parameter. It consists of a calculated pair of values which represent a lower and an upper bound for the parameter. If these values are calculated from sample data which contain elements with random components, they are then themselves random variables and it is reasonable that before the sampling takes place one can make probability statements about what might occur.

    The confidence level of the interval is the probability that the method used to calculate the interval will produce a result that contains the parameter in question. If you were to calculate 90% CIs for a parameter from a series of independent samples from a population, you would expect about 90% of them to contain the parameter and about 10% of them to exclude it. However, when looking at a single calculated interval, you have no way of knowing whether that interval actually includes or excludes the parameter. Saying that the interval has a confidence level of 90% is a statement not about the particular interval, but about the procedure used to generate that interval.

    A common misinterpretation of a CI is that it is a probability statement about the parameter. Nothing could be further from the truth. A good example of why this is not the case can be found on a web page on CIs authored by noted statistician Jerry Dallal (starting at “This Will Hurt Your Head!!!” – it won’t!😉 ).

    If, as it strongly appears, the authors have based their statement on the confidence interval, then the statement is blatantly unjustified and numerically meaningless. It is simply wrong. Furthermore, since this important fact is usually explained in an entry level statistics course, it would reflect very poorly on the authors’ understanding of statistics and cast a shadow on what is done in the rest of the paper.

  40. Thanks for the explanation Roman.

    Here is my back-of-the-envelope calculations as promised:

    Carbon dioxide (CO2) forms approximately 0.04% of the nominal 5,000,000 gigatonnes of gas and aerosols that comprise the Earth’s atmosphere. (wiki)

    CO2 = 0.0004 * 5,000,000 = 2000 gigatons

    Population ~ 6.7 billion
    Per capita CO2 emissions ~ 1 ton Carbon / person
    so …
    C_o ~ 6.7 giga tons Carbon
    GR ~ population growth rate of 1%
    N = number of years

    I believe a formula for accumulative growth is as follows:
    Total accumulation = C_o*GR*((GR^N-1)/(GR-1))

    100 years is ~ 50% increase over current CO2
    200 years is ~ 2x current CO2 or one doubling CO2
    300 years is ~ 12x current CO2 or ~3.5 doublings CO2

    But let’s assume 50% absorption of CO2 into oceans/biosphere.
    In that case,
    100 years is ~ 25% increase over current CO2
    200 years is a doubling
    300 years is 1.75 doublings

    Assumptions:
    a constant emission of 1 metric ton C per person
    a 1% growth rate of population
    (and in the last case)
    a constant 50% fractional absorption of CO2

    Ignored:
    Resource constraints
    Technology/Economic factors that might change the per capita C emission

    Does that seem in the ball-park?
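    The closed-form accumulation formula above can be sanity-checked against a year-by-year sum – a back-of-the-envelope sketch in Python (the inputs C_o = 6.7 and GR = 1.01 are the ones assumed above):

```python
def accumulated(c0, gr, n):
    """Closed-form total of c0*gr + c0*gr^2 + ... + c0*gr^n."""
    return c0 * gr * ((gr ** n - 1) / (gr - 1))

def accumulated_loop(c0, gr, n):
    """Same total, summed year by year."""
    total, annual = 0.0, c0
    for _ in range(n):
        annual *= gr          # emissions grow with population
        total += annual
    return total

c0, gr = 6.7, 1.01            # gigatons C per year, 1% growth
for n in (100, 200, 300):
    closed, looped = accumulated(c0, gr, n), accumulated_loop(c0, gr, n)
    assert abs(closed - looped) < 1e-6 * closed
    print(f"{n} years: ~{closed:.0f} Gt C emitted")
```

    With those inputs the closed form and the loop agree exactly, so any remaining disagreement in the estimates is in the inputs, not the algebra.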

  41. tarpon said

    The worse it gets for the hoaxers, the more they have to ratchet it up, don’t they? How high can the nonsense go? Want to start a betting pool? Heck, just today the Senate started hearings on the hoax. Just a mere coincidence, I am sure.

    Sea level has been 100 meters lower than today, and 100 meters higher than today. The lowest level comes when glaciation occurs, the highest when all the ice melts. Well before there were any SUVs, I might add.

  42. Oops, I made the single most common mistake:

    I calculated atmospheric CO2 and compared it to emitted C!

    Using the above figures, total atmospheric C is about 2000 * 12/44 = 545 gigatons C. But this is low according to other sources which place it closer to 750 gigatons C. Using 750 gigatons as the initial total …

    In the 50% absorption scenario:

    100 years -> 400 gigaton increase (~75% increase C)
    200 years -> 2300 gigaton increase (~1.5 doublings C)
    300 years -> 4666 gigaton increase (~2.5 doublings C)
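    The C-versus-CO2 conversion that caused the mix-up is just the molar-mass ratio (12 for carbon, 44 for CO2) – a two-line sketch in Python, using the 2000-gigaton atmospheric CO2 estimate from above:

```python
C, O = 12.0, 16.0            # approximate atomic masses
CO2 = C + 2 * O              # 44
gt_co2 = 2000.0              # atmospheric CO2 estimate from above, gigatons
gt_c = gt_co2 * C / CO2      # carbon content of that CO2
print(f"{gt_co2:.0f} Gt CO2 contains {gt_c:.0f} Gt C")  # about 545 Gt C
```

    Emitted carbon must be multiplied by 44/12 before comparing it with atmospheric CO2 (or the CO2 divided by 44/12 first, as done here).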

  43. I give up. So much for punching numbers in a calculator …


    > # initial carbon in atmos
    > C1
    > # current emissions (assuming 50% absorption)
    > dc
    > # growth rate (assumes 1 ton C per capita and 1% pop growth)
    > GR
    > # 100 years
    > N # additional CO2 added to atmos
    > dc*GR*(((GR^N)-1)/(GR-1))
    [1] 568.2144
    > # % change
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 1.757619
    >
    > # 200 years
    > N dc*GR*(((GR^N)-1)/(GR-1))
    [1] 2105.129
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 3.806838
    >
    > # 300 years
    > N dc*GR*(((GR^N)-1)/(GR-1))
    [1] 6262.196
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 9.349594
    >

  44. Now I am just experimenting… the previous post, using ‘code’ html tags, mangled the R code. I’m trying again …

    > # initial carbon in atmos
    > C1 
    > # current emissions (assuming 50% absorption)
    > dc 
    > # growth rate (assumes 1 ton C per capita and 1% pop growth)
    > GR 
    > # 100 years
    > N  # additional CO2 added to atmos
    > dc*GR*(((GR^N)-1)/(GR-1))
    [1] 568.2144
    > # % change
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 1.757619
    > 
    > # 200 years
    > N  dc*GR*(((GR^N)-1)/(GR-1))
    [1] 2105.129
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 3.806838
    > 
    > # 300 years
    > N  dc*GR*(((GR^N)-1)/(GR-1))
    [1] 6262.196
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 9.349594
    > 
    

    Testing … testing …

  45. testing … testing … it’s mangling the R assignment (thinks it’s the beginning of an html comment?)

    I give up. So much for punching numbers in a calculator …


    > # initial carbon in atmos
    > C1 = 750
    >
    > # current emissions (assuming 50% absorption)
    > dc = 3.3
    >
    > # growth rate (assumes 1 ton C per capita and 1% pop growth)
    > GR = 1.01
    >
    > # 100 years
    > N = 100
    > # additional CO2 added to atmos
    > dc*GR*(((GR^N)-1)/(GR-1))
    [1] 568.2144
    > # % change
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 1.757619
    >
    > # 200 years
    > N = 200
    > dc*GR*(((GR^N)-1)/(GR-1))
    [1] 2105.129
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 3.806838
    >
    > # 300 years
    > N = 300
    > dc*GR*(((GR^N)-1)/(GR-1))
    [1] 6262.196
    > (C1+dc*GR*(((GR^N)-1)/(GR-1)))/C1
    [1] 9.349594
    >

  46. gallopingcamel said

    Ron Broberg,
    I think your calculation is in the right ball park. My understanding is that if mankind were to burn all known reserves of fossil fuels, it would add less than 5,000 giga-tonnes of “new” carbon to the atmosphere. The big question is how long the carbon would stay in the atmosphere.

    Archer assumes a very long residence time to arrive at his breathtaking conclusions:
    http://geosci-webdev.uchicago.edu/~archer/reprints/archer.2005.trigger.pdf

  47. Layman Lurker said

    #39 Roman M

    Roman, please consider submitting a comment to PNAS.

  48. steven Mosher said

    ron

    I hesitate to post on population since the other steven mosher does that.

    http://www.math.duke.edu/education/ccp/materials/diffeq/logistic/logi1.html

    more important for you however is this

    http://www.grida.no/publications/other/ipcc_sr/?src=/climate/ipcc/emission/014.htm

    basically, estimates all over the spectrum…

  49. Layman Lurker said

    Roman, interesting that Briggs posted a couple of days ago with his version of your CI primer:

    http://wmbriggs.com/blog/?p=2336

  50. Suibhne said

    When I saw “uncertainty” in the title my mind drifted to handing in reports in university first year Physics labs.
    Did my final calculation contain too many significant figures?

    I have also recently been looking at the KT “la la land” diagram.

    The surface radiation of 390W/m2 seems to have been calculated with a figure for emissivity e=1, rounded up to the value for a perfect black body.
    All reasonable people, however, would say the figure should be 0.xy, possibly with some discussion about the values for x and y.

    However, the value would only be good to two significant figures.

    The figures in the graph above, though based on two-significant-figure inputs, are mostly accorded three-figure accuracy.
    I can see my old lab supervisor raise his eyebrows and reach for the “red pen” if I had handed in such a report.

  51. Geoff Sherrington said

    From history, we read

    “Systems ecology has concentrated on constructing computer models of ecosystems…. The most famous model of this class was the world model developed by Donella and Dennis Meadows and their colleagues at MIT and published in “Limits to Growth” …… The computer models of the systems ecologists tend to be high in precision and low in generality and to lose reality as they become more general …. One of the major drawbacks of systems ecology is the relative paucity of data available with which to develop and test models.” From Ehrlich P, Ehrlich A and Holdren, John, “Ecoscience”, p 170, 1977.

    Following page, ‘… J S Haldane claimed we cannot apply mathematical reasoning to vital processes, since a mathematical treatment assumes a separability of events in space “Which does not exist for life as such. We are dealing with an indivisible whole when we are dealing with life” ’. Nagel, Ernest, “The Structure of Science: Problems in the Logic of Scientific Explanation”, New York, Hackett and Harcourt, pp 445-6, 1961, in which is an analysis of the holistic fantasy so loved by Lovelock and his Gaia.

    Nagel continues “Like everyone else who contributes to the advancement of knowledge, organismic biologists must be abstractive and analytical in their research procedures. They must study the operations of various prescinded parts of living organisms under selected and often artificially instituted conditions – on pain of mistaking unenlightening statements liberally studded with locutions like “wholeness”, “unifiedness”, and “indivisible unity” for expressions of general knowledge.”

    (From Efron, Edith, “The Apocalyptics”, p. 53, 1984. Her topic was the artificiality and political manipulation of the frightening, imminent epidemic of cancers forecast in humans from man-made chemicals in the 1970s. Edith was right, as time has shown).

    If Nagel got it right 50 years ago and Haldane pronounced on it, then there is a chance that it is correct. The failure of much modern climate science to tease out, isolate, and quantify confounding processes is one of the large obstacles to recognition of climate science as a mature science – especially when it references Gaia.

    And yes, some of us know still of Holdren, John.

    Do try to read “The Apocalyptics”. It’s only 590 pages of meticulous research whose references are quotes from the sources verbatim. (In other words, the culprits are convicted by their utterances. There are no references from industry, such as the tobacco or pharmaceutical companies).

  52. RomanM said

    Re: Layman Lurker (May 7 02:18),

    Don’t get me started on the straw man mis-representation and denigration of “frequentist” methodology by Bayesian advocates.

  53. hunter said

    I urge you to read “The Doomsday Syndrome” by John Maddox.
    John Maddox was the editor of “Nature” magazine and had no patience for alarmist pap.
    The book is currently out of print, but is easily available online.
    AGW is a social madness of increasingly dangerous proportions.
    But other areas of our society are infected by the kind of magical thinking that led to AGW.
    Our economy comes to mind.

  54. Suibhne said

    Just been edited out of the Skeptical Science Thread

    Where is global warming going?

    The edit made a bit of a mess, because other posters had referenced it before it was removed.
    Once the hole was there, the other posters’ comments made no sense at all.
    I’ve been back at the site and all’s well, because they tidied up the other posters’ posts by editing them to remove any reference to the missing one.
    I now know how Trotsky felt after Stalin blanked him out of the photograph of Lenin’s burial ceremony.

  55. gallopingcamel said

    Suibhne (#54),
    The same thing happened to me yesterday at “Skepticalscience”. The post was about the last “3 million years”, so I commented that over most of that period rising CO2 followed rising temperatures by several hundred years, proving that CO2 concentration is not the primary driver of climate change over at least the last 750,000 years. Cause precedes effect in the real world.

    My post was up long enough to attract some comments but by this morning my comment and the responses had disappeared.

    Up to now I have been very complimentary about Skepticalscience, but it seems likely that they will deteriorate into just another “Climate Progress” or “Realclimate”.

    The good news is that the skeptics will soon fall away and the CAGW folks will be left inside an echo chamber.

    I hope that Jeff will not expunge “Sod” and the other CAGW visitors to this fine site.

  56. Carrick said

    I quit reading “Skeptical Science” when it became apparent it wasn’t (skeptical).

  57. Jeff Id said

    #55 I’ve got no energy for it and not the slightest intent. I’m way too lazy to moderate.

    The worst a disagreement gets here is an argument. Moderator of doooom!

    Dishonesty bugs me, the rest is no problem.

  58. Suibhne said

    For “go big”, I wonder which set of these results finally made it into the IPCC orthodox bible.
    Page 4239 (of the link below) shows readings taken of “backradiation” in the Antarctic.

    Two instruments were used

    1. Radiance Interferometer(RI)
    2. Pyrgeometer(P)

    Take the July figure for instance
    RI gives 48 W/m2; P gives 90 W/m2.

    These are not unusual examples they are pretty typical for the table.

    If the research team had only one instrument there would be less confusion, but would the result be any more accurate?
    Other reports suggest that the Pyrgeometer is particularly prone to false readings.
    The instruction manuals give equations to tell you the “right” answer.
    There is a very convenient “offset” knob.
    The less charitable amongst us would think that this control could then be used to make sure that the results validated the equations.
    However, shouldn’t the point of any experiment be to test the theory, and to falsify it if that’s what the results show?

    http://www.webpages.uidaho.edu/~vonw/pubs/TownEtAl_2005.pdf

  59. gallopingcamel said

    suibhne (#58),
    I tried to read that paper you quoted but had to quit when my eyes started revolving in opposite directions. When one gets into the details, climate science is way beyond my pay grade even though my work on high energy particle accelerators is arguably more challenging than rocket science.

    If you are into these things there is a paper that goes into the absorption spectra for the atmosphere on Venus. This was something I found while looking for papers relating to the recent furore arising from some guest posts by Steve Goddard on WUWT.
    http://www.funkyscience.net/documents/Vertical_90.pdf

    Some other papers:
    http://www.funkyscience.net/science.html

    While atmospheric gases and clouds intercept much of the long wavelength emission from Venus’ hot surface before it can escape to outer space, how much of this warming can be called “Greenhouse Effect” and how much “Adiabatic Warming”?

    This may be a meaningless question as it seems to me that there would be no greenhouse effect if temperature did not fall with increasing altitude in the dense part of a planet’s atmosphere.

  60. Suibhne said

    gallopingcamel

    “climate science is way beyond my pay grade”

    I think that you are giving climate science as presently understood too much respect.
    I’m in education (Physics) and I find climate science papers and conclusions founded on exaggerated claims.

    Hence the attraction of the headline of this thread.
    Round up, where possible, temperatures or anything else that suits your conclusion.

    Take for example the Earth Surface Radiation in diagram below.

    I have read recently that this figure has now been revised up to 396.1 W/m2

    Two things concern me about this figure;

    1. The emissivity e is now greater than 1 for an average temperature of 15 degrees C, which is physically impossible.

    2. The value is given to four significant figures, whereas the emissivity figures I have seen (which the calculation uses) have only two significant figures.
    This would not be acceptable in a first-year university lab report.
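    The emissivity point can be checked directly against the Stefan-Boltzmann law, E = e*sigma*T^4 – a sketch in Python that assumes, as the comment does, that the 396.1 W/m2 figure is meant to correspond to a mean surface temperature of 15 degrees C:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T = 288.15               # 15 degrees C in kelvin

blackbody = SIGMA * T ** 4        # emission with e = 1
e_required = 396.1 / blackbody    # emissivity implied by 396.1 W/m2
print(f"blackbody at 15 C: {blackbody:.1f} W/m2")
print(f"implied emissivity: {e_required:.3f}")
```

    At 15 degrees C a perfect black body radiates roughly 391 W/m2, so under that assumption 396.1 W/m2 does imply an emissivity slightly above 1 – which is point 1 above.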

    On the other paper

    http://www.webpages.uidaho.edu/~vonw/pubs/TownEtAl_2005.pdf

    1. Different instruments give different values for the same quantity – very odd.

    2. Very low values of “backradiation” are given: 41 W/m2 for August. The uncertainty of these measurements is 8 W/m2, so the figure could be as low as 33 W/m2.

    3. Also of interest: even for low-humidity, clear-sky conditions, the radiative flux from H2O vapour was twice that from CO2. There are some people who want to exaggerate the effect of CO2, and they will not find anything in this result to support them.
