Bending the Thermometers

Reader Boballab found this an entire day before it was supposed to be released. It’s a paper written by Joseph D’Aleo and Anthony Watts for the Science and Public Policy Institute.

Many of us are familiar with the accusations made by D’Aleo and EM Smith regarding the systematic elimination of colder temperature stations. They claimed that there is specific intent to distort the temperature record and make today’s temperatures appear warmer than they truly are. That’s not hard to believe considering what we’ve recently learned from Climategate and the IPCC; however, this paper goes a step further. It’s a comprehensive report on the numerous flaws in the dataset, including station siting (surfacestations.org), data elimination and a variety of other issues.

The report is 111 pages long and, to put it simply, it contains the strongest-worded accusations against climate science I’ve read from qualified skeptics. Here’s a quote:

That is to say, leading meteorological institutions in the USA and around the world have so systematically tampered with instrumental temperature data that it cannot be safely said that there has been any significant net “global warming” in the 20th century.

The report opens with a single-page summary for policymakers. I would encourage any (both) policymakers who stop by to consider that these claims are being made by qualified people with the background and ability to understand the data and its implications.

1. Instrumental temperature data for the pre-satellite era (1850-1980) have been so widely, systematically, and unidirectionally tampered with that it cannot be credibly asserted there has been any significant “global warming” in the 20th century.
2. All terrestrial surface-temperature databases exhibit very serious problems that render them useless for determining accurate long-term temperature trends.
3. All of the problems have skewed the data so as greatly to overstate observed warming both regionally and globally.
4. Global terrestrial temperature data are gravely compromised because more than three-quarters of the 6,000 stations that once existed are no longer reporting.
5. There has been a severe bias towards removing higher-altitude, higher-latitude, and rural stations, leading to a further serious overstatement of warming.
6. Contamination by urbanization, changes in land use, improper siting, and inadequately-calibrated instrument upgrades further overstates warming.
7. Numerous peer-reviewed papers in recent years have shown the overstatement of observed longer term warming is 30-50% from heat-island contamination alone.
8. Cherry-picking of observing sites combined with interpolation to vacant data grids may make heat-island bias greater than 50% of 20th-century warming.
9. In the oceans, data are missing and uncertainties are substantial. Comprehensive coverage has only been available since 2003, and shows no warming.
10. Satellite temperature monitoring has provided an alternative to terrestrial stations in compiling the global lower-troposphere temperature record. Their findings are increasingly diverging from the station-based constructions in a manner consistent with evidence of a warm bias in the surface temperature record.
11. NOAA and NASA, along with CRU, were the driving forces behind the systematic hyping of 20th-century “global warming”.
12. Changes have been made to alter the historical record to mask cyclical changes that could be readily explained by natural factors like multidecadal ocean and solar changes.
13. Global terrestrial data bases are seriously flawed and can no longer be trusted to assess climate trends or VALIDATE model forecasts.
14. An inclusive external assessment is essential of the surface temperature record of CRU, GISS and NCDC “chaired and paneled by mutually agreed to climate scientists who do not have a vested interest in the outcome of the evaluations.”
15. Reliance on the global data by both the UNIPCC and the US GCRP/CCSP also requires a full investigation and audit.

Thanks again to the omnipresent boballab whose quick emails allowed Anthony and crew to release the report in an orderly fashion.

Check out the full damning report: surface_temp[1]

The original post at the SPPI is here.

Honestly, CO2 does catch/retard/slow down heat. It’s just the truth. More CO2, more warming; however, there is absolutely no guarantee that the warming is enough to be measurable. Our understanding of how the climate reacts isn’t good enough, and those who say it is are not being honest with you.
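To put rough numbers on that point, here is a minimal sketch using the commonly cited simplified forcing expression dF ≈ 5.35·ln(C/C0) W/m² (Myhre et al. 1998). The sensitivity values below are illustrative assumptions, not measurements; the sensitivity is exactly the part that is poorly constrained.

import math

def co2_warming(c_now_ppm, c_ref_ppm=280.0, sensitivity=0.8):
    """Equilibrium warming (C) for a CO2 change, given an assumed sensitivity in C per W/m^2."""
    forcing = 5.35 * math.log(c_now_ppm / c_ref_ppm)   # W/m^2, simplified expression
    return sensitivity * forcing

# Bracketing the answer with a low and a high assumed sensitivity:
for lam in (0.3, 0.8):
    print("385 ppm, sensitivity %.1f C per W/m^2: %.2f C" % (lam, co2_warming(385.0, sensitivity=lam)))

The spread between those two numbers for the same CO2 change is the point: the forcing arithmetic is straightforward, the response is not.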

I’ll stop there because these bastard advocaticians want me to be a denier. They spend every day of their lives messing with reality to an extent that leaves reasonable people no choice. We want data, we want code, we want truth; the problem is we appear to be asking a bunch of paid liars to give it to us.

165 thoughts on “Bending the Thermometers”

  1. Well, this might help explain why we haven’t seen much new stuff from El Jefe for a little while.

    Wow, attribution of intent!

    OK time to read it all.

    You just reinforced my ‘frequent=recheck-Airvent’ habit.
    RR

  2. Shouldn’t “the systematic elimination of warmer temperature stations” actually be “the systematic elimination of colder temperature stations”?

    REPLY: Thanks.

  3. “That is to say, leading meteorological institutions in the USA and around the world have so systematically tampered with instrumental temperature data that it cannot be safely said that there has been any significant net “global warming” in the 20th century.”

    I have to disagree, this is too strong by far. I think there is evidence that the data are not particularly reliable, but I would hardly say that there is no basis for safely concluding that there was warming in the twentieth century. I do think that the magnitude of the warming is likely overstated, though.

  4. Summary of summary:
    1. Data is garbage
    1a. Data is fraudulently-tampered with garbage.
    1b. If not tampered with, the instruments are garbage.
    1c. If neither a nor b, it is erroneous garbage.
    2. If data is not garbage, there is no provable warming.
    2b. If there is warming, it is most likely natural.
    2c. If it is not natural, the models are garbage. Go back to 1.

  5. Re 2;

    More tomorrow? Yikes…
    You’re a real piece of work.
    I keep promising myself I’ll quit with the ‘recheck noconsensus’
    I only type N and poof, there it is, another time I have to decide, check it yet again, or has it only been 5 minutes since the last recheck…

    Amazing 2-hop logical leap on that amazon thing;
    From ‘logging can impact forests by up to 40%’ in Nature;
    to ‘40% of amazon forests are very sensitive’ in WWF,
    to ‘Up to 40% of the Amazonian forests could react drastically to even a slight reduction in precipitation’ In IPCC 2007.

    Hidden in Plain Sight indeed!
    Thanks for all of your efforts.
    I heard RFP deliver the Cargo Cult Science talk live in June 1974.
    I have the impetus but not the toolset to do as Fearless Leader directed us…
    He’d be a very outspoken supporter of this effort!
    RR

  6. TTCA:

    I have to disagree, this is too strong by far. I think there is evidence that the data are not particularly reliable, but I would hardly say that there is no basis for safely concluding that there was warming in the twentieth century. I do think that the magnitude of the warming is likely overstated, though.

    Yes, I agree. There’s lots of good discussions and summaries of data, but the rhetoric is over the top in places.

  7. Whether the temperature measurements are corrupted in any way or not is irrelevant. The reported slight rise in global mean temperatures over the past century is well within normal operational parameters of natural variability. Some of it may have been caused by man, but no one has shown by how much, if any. If it is ever proven beyond any doubt that the temperature measurements were corrupted, then all the better for destroying the man-made global warming catastrophe myth.

  8. DG:

    What about the argument the surface data is in good agreement with satellite?

    It isn’t really (UAH isn’t at least, that’s the one I trust more… RSS is really close, maybe too close, to HADCRUT though), but then again I don’t think they should agree, if all you are doing is correcting for lapse rate (the largest effect is that the ground warms more, due to curvature of the temperature profile near the ground).

  9. timetochooseagain said
    January 26, 2010 at 11:18 pm

    I think there is evidence that the data are not particularly reliable, but I would hardly say that there is no basis for safely concluding that there was warming in the twentieth century.

    If the data are not reliable, isn’t that the same thing as saying that conclusions cannot be drawn from the data?

    I do think that the magnitude of the warming is likely overstated, though.

    How can you even know this?

    This is precisely the problem with all the proxy reconstructions, btw. Their dismantling does not disprove claims of unprecedented temperature rise; it merely reinforces the notion that conclusions cannot be drawn from their results (one way or another).

    Mark

  10. Mark T:

    If the data are not reliable, isn’t that the same thing as saying that conclusions cannot be drawn from the data?

    “Reliability” isn’t an either/or proposition, and many of the issues that Watts brings up have yet to make their way to proof that all (or most, or any) of the warming observed in the last 150 years is either manufactured or artificially inflated.

  11. #10 Szo

    No, the satellites are not calibrated against the surface record. They are calibrated against several internal platinum resistance thermometers on the satellite once each Earth scan by aiming at a “warm calibration target” within the instrument. To complete the process a “cold target” is checked, that being the cosmic background which is assumed to be about 2.7 Kelvin. This then provides the “brightness temperature curve” which enables UAH to calculate the Earth temperature.

    The fact that the satellite is independent of the surface records and UHI and other terrestrial effects makes for a more reliable measure. Unfortunately we’ve only got 30 years of satellite data to work with.
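    A minimal sketch of the two-point calibration described above, assuming a simple linear counts-to-temperature relation (the real AMSU processing also applies a small nonlinearity correction; all numbers here are invented for illustration):

    def brightness_temperature(counts_earth, counts_cold, counts_warm,
                               t_cold=2.7, t_warm=290.0):
        """Linear two-point calibration from raw radiometer counts to brightness temperature (K)."""
        gain = (t_warm - t_cold) / (counts_warm - counts_cold)   # K per count
        return t_cold + gain * (counts_earth - counts_cold)

    # Hypothetical counts from one scan: cold-space view, warm-target view, Earth view
    print(brightness_temperature(counts_earth=14200.0,
                                 counts_cold=1200.0,
                                 counts_warm=16800.0))   # about 242 K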

  12. JLKrueger:

    The fact that the satellite is independent of the surface records and UHI and other terrestrial effects makes for a more reliable measure. Unfortunately we’ve only got 30 years of satellite data to work with.

    Since that happens to correspond roughly to the period where anthropogenic warming began, that’s a good thing. I’ve argued other places that satellite measures are closer to what the global climate models model in any case (they don’t have a proper boundary layer, which is what surface instrumental measurements are obtained in, of course).

    Before the mid-1970s, it is believed that anthropogenic CO2 emissions were nearly balanced by anthropogenic sulfate emissions… Nearly all of the warming prior to that… is thought to be entirely natural in origin.

  13. Timetochooseagain wrote:

    “I think there is evidence that the data are not particularly reliable, but I would hardly say that there is no basis for safely concluding that there was warming in the twentieth century.”

    First of all, they said no “significant net global warming.” If you agree that the data in the U.S. are not particularly reliable, and if, as appears to be the case, the situation isn’t better elsewhere in the world, then on what basis would you say that there was significant warming? I think their statement, while provocative, is exactly on target. Namely, it may be the case that there has been some warming, but we cannot be sure of where or how much, and thus we cannot “safely” say that there has been any “significant” net global warming.

    I think they are exactly on point.

  14. Forgot to mention that my explanation in #17 was for the Advanced Microwave Sounding Unit (AMSU) on NASA’s Aqua satellite, which is what provides the data for the UAH satellite anomaly calculated by Drs. Christy and Spencer. I can’t speak for RSS’s calibration method.

  15. #18 Carrick:

    Since that happens to correspond roughly to the period where anthropogenic warming began, that’s a good thing. I’ve argued other places that satellite measures are closer to what the global climate models model in any case (they don’t have a proper boundary layer, which is what surface instrumental measurements are obtained in, of course).

    It also dovetails with the PDO. So quite simply, we don’t really know for certain what is driving the train.

  16. Carrick said
    January 27, 2010 at 1:11 am
    Mark T:

    “Reliability” isn’t an either/or proposition

    I’m not saying it is. Reliability is trust in what you have. If it is unreliable, you do not trust what you have.

    and many of the issues that Watts brings up have yet to make their way to proof that all (or most, or any) of the warming observed in the last 150 years is either manufactured or artificially inflated.

    I also specifically said that it does not prove there is no warming. If data are unreliable, you cannot draw reliable conclusions from the data. We (skeptics) seem to hold every other aspect of the science to this standard except this one area.

    We (people in general, skeptics in particular) tend to concede points that are debatable in order to make the rest of our argument more sound, likely out of fear of being labeled as anti-science or a denier. This goes for any argument/debate; it’s a sort of psychology thing, I suppose. It is the same reason people always start off their otherwise skeptical review of something by stating “well, I agree that…” Why should it matter what you agree upon if your point is something you disagree on, particularly if agreement on the one area does not directly impact the reasons for disagreement on the other?

    Just saying…

    Mark

  17. JL Krueger:

    It also dovetails with the PDO. So quite simply, we don’t really know for certain what is driving the train.

    Well, 30 years of data is just enough to be able to resolve a global temperature trend from the PDO. But I agree with your point in principle; there are other longer-term fluctuations that are even of larger magnitude (the amplitudes of the oscillations tend to increase approximately in proportion to their period, or in inverse proportion to their frequency).

    The truth is, even if you accept the equilibrium climate sensitivity of CO2 as stated by the IPCC, what we are seeing to this point is a very modest anthropogenic warming (maybe 0.25°C). There has been an amazing amount of hot air generated about the impact of man-made warming, far beyond what the physical science says is warranted.

  18. Mark T:

    I’m not saying it is. Reliability is trust in what you have. If it is unreliable, you do not trust what you have.

    As used in science, it’s quantitative: Watts argues that the temperature record is less reliable; even if true, that doesn’t mean it has no value, just that the uncertainty has increased. That doesn’t mean you stop trusting it, just that you trust it less based on the increased error bars. But Watts hasn’t even gotten to the point of actually quantifying what those error bars should be increased to…

    The default position is that Watts has raised some issues, but has not yet dotted the “i”s and crossed the “t”s. The “skeptical” position is to say to Watts “show me” and to assume that the dozens or more people who have looked at the problem haven’t blown it so badly that it is entirely wrong.

    I assume that the people who have worked on this for decades haven’t got it so wrong, because that is scientifically the conservative view, and Watt’s view by argument is the radical one requiring strong proof that is yet to be forthcoming.

  19. Carrick said
    January 27, 2010 at 1:54 am

    As used in science, it’s quantitative: Watts argues that the temperature record is less reliable; even if true, that doesn’t mean it has no value, just that the uncertainty has increased.

    I didn’t say it had no value, either. I said we can no longer trust claims of warming.

    That doesn’t mean you stop trusting it, just that you trust it less based on the increased error bars.

    Which is exactly my point. The trend is not very significant over the last 150 years to begin with. Now we know that many, if not most, adjustments tend to enhance the trend over the last 30 years or so (as well as reduce the warmth of the early 20th century). This decreases the trend relative to the null (no trend, I would assume). The number of stations has dropped significantly, too, which also serves to increase the error bars.

    But Watts hasn’t even gotten to the point of actually quantifying what those error bars should be increased to…

    Yes. We are at the point where it is not unlikely that “no trend” will be within the error bars.

    The “skeptical” position is to say to Watts “show me” and to assume that the dozens or more people who have looked at the problem haven’t blown it so badly that it is entirely wrong.

    I take the position that the original claimants, i.e., those that originally put the data together, hold the burden of proof. We (figuratively) have found serious problems with the way they have done this, and thus, it is on them to demonstrate that it does not matter. They said “It’s been warming,” and we came along and said “wow, so many serious errors with your data.”

    I assume that the people who have worked on this for decades haven’t got it so wrong, because that is scientifically the conservative view

    My guesses are a) there really aren’t that many people that hold the keys to the kingdom (not anymore, at least), b) most of what we see now is actually a product of algorithmic work that has been done relatively recently, c) the legacy work that has survived has made the total package rather unwieldy (these guys aren’t trained software writers that I can see), and d) of those that actually do hold the keys, not many really, truly understand what the algorithms are doing.

    Watt’s view by argument is the radical one requiring strong proof that is yet to be forthcoming.

    Given all the other issues that have appeared over the last 6-8 years that I have been following this mess, I must disagree. I have very little faith in claims coming from a scientific field (broadest terms, of course) that has clearly been corrupted by politics. Yes, that is cynical, but over this time period, revelations have only become more shocking, not less (though I am less shocked each time, due to my cynicism).

    Once the error analysis is complete, should there be something significant, I will adjust my opinion accordingly, of course.

    Mark

  20. Btw, Carrick, I’m not saying “zero trust,” just, “not much trust.” Perhaps I should have some sort of scale like the IPCC… likely, probable, etc. 😉

    Mark

  21. Oh, and for the record, of the three places I’ve lived the majority of my life, two have apparently experienced some warming. St. Louis, Missouri, my first home (27 years), was significantly warmer when I left 14 years ago than it was in the 70s when I was in grade school (to the point we no longer got snow in the winter when I moved away). That has changed recently, however. Melbourne, Florida, where I lived the next seven years, apparently has not changed much. It was the same the whole time I was there with two seasons: pre-summer and summer. Colorado, where I have lived the last seven years, has apparently warmed a small amount, though you wouldn’t have known it this past December. 😉

    Mark

  22. “but I would hardly say that there is no basis for safely concluding that there was warming in the twentieth century. ”

    The strongest of which occurred in the first half of the century. The second half of the century was, overall, flat if you consider that temperatures after 1950 never exceeded the maximum value set before 1950.

  23. Mark T:

    The trend is not very significant over the last 150 years to begin with

    I have to disagree. On paper, the trend is at least 10-sigma from “no temperature change” (accounting for serial correlation, I get 0.5±0.05 °C/century for the full record). It’s going to take a lot of work to make this disappear, especially over the last 30 years, where we have two completely different technologies both agreeing there is a significant trend (4 sigma).

    I take the position that the original claimants, i.e., those that originally put the data together, hold the burden of proof.

    They have the largest body of data, and have presented proof. The onus is now on those who say they have made errors to point to the errors or holes in their proof. You can’t prove a negative, so you can never prove that you didn’t make an error or didn’t leave something important out. That’s why, if Watts says A, B, C matter, it’s incumbent on him to show they do matter, especially when the scientific corpus says otherwise.

    I’m not demanding (I don’t think) an unreasonable burden on Anthony. It should be pretty straightforward to come up with another temperature reconstruction “doing it right” and see how much the answer varies when you do so.

    My bet is, even making every correction he, Smith and others have suggested, you still get a very similar temperature trend. Regardless of what one says happens prior to roughly 1950, the early instrumental temperature record is so bad that when you do a proper weighted fit to account for the greater uncertainty in the old data, this change doesn’t matter much; the trend gets driven by the recent warming.
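    For what it’s worth, here is a minimal sketch of the kind of trend-with-serial-correlation calculation referred to above: fit a straight line to an annual series and widen the slope’s standard error with the usual lag-1 autocorrelation adjustment, n_eff = n(1 - r)/(1 + r). The series below is synthetic; feeding in a real annual anomaly series is left to the reader.

    import numpy as np

    def trend_with_ar1_error(years, temps):
        years = np.asarray(years, float)
        temps = np.asarray(temps, float)
        slope, intercept = np.polyfit(years, temps, 1)
        resid = temps - (slope * years + intercept)
        r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]        # lag-1 autocorrelation of residuals
        n = len(temps)
        n_eff = n * (1 - r1) / (1 + r1)                      # effective sample size
        x = years - years.mean()
        se = np.sqrt(resid.var(ddof=2) / (x @ x)) * np.sqrt(n / max(n_eff, 2.0))
        return slope, se

    # Synthetic example: 0.5 C/century trend plus AR(1) noise
    rng = np.random.default_rng(0)
    yrs = np.arange(1850, 2010)
    noise = np.zeros(len(yrs))
    for i in range(1, len(yrs)):
        noise[i] = 0.5 * noise[i - 1] + rng.normal(0.0, 0.1)
    temps = 0.005 * (yrs - 1850) + noise
    slope, se = trend_with_ar1_error(yrs, temps)
    print("%.2f +/- %.2f C/century" % (slope * 100, se * 100))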

  24. George:

    The strongest of which occurred in the first half of the century. The second half of the century was, overall, flat if you consider that temperatures after 1950 never exceeded the maximum value set before 1950.

    You’re thinking of the US land record, not the world land or world land+ocean, which are the relevant measures. The most robust warming has been since 1975, an observation for which we have agreement between satellite and ground based temperature measurements.

  25. Carrick said
    January 27, 2010 at 2:59 am

    I have to disagree. On paper, the trend is at least 10-sigma from “no temperature change” (accounting for serial correlation, I get 0.5±0.05 °C/century for full record).

    I had not seen that. Global, or US, btw?

    It’s going to take a lot of work to make this disappear, especially over the last 30 years, where we have two completely different technologies both agreeing there is a significant trend (4 sigma).

    I agree with Eric Anderson’s point in #19 regarding this. The satellites have been diverging from the land record as I’ve read, so how much are they agreeing?

    They have the largest body of data, and have presented proof.

    Evidence, not proof. 😉

    The onus is now on those who say they have made errors to point to the errors or holes in their proof.

    This is where you and I disagree, really. I think the holes have been pointed out, and the errors (biases, really) are numerous enough that the originators hold the burden.

    so you can never prove that you didn’t make an error or didn’t leave something important out.

    I don’t think they need to do that. Free all the code, explain why the station count has dropped, justify every step of the algorithm, etc., and people like me will be happy.

    That’s why, if Watts says A,B,C matter, it’s beholden on him to show they do matter. especially when the scientific corpus says otherwise.

    The problem is that the “scientific corpus” have proven themselves untrustworthy. Many of the things they are claiming matter have already been shown to matter, btw.

    Regardless of what one says happens prior to roughly 1950, the early instrumental temperature record is so bad that when you do a proper weighted fit to account for the greater uncertainty in the old data, this change doesn’t matter much; the trend gets driven by the recent warming.

    I think this sort of begs the question (particularly given UHI that is apparently not properly accounted for). For that matter, why should the older data have greater uncertainty than the newer data? I’m just curious.

    Mark

  26. I think what I’m getting at, Carrick, that I don’t believe I’ve made clear to you in the few posts regarding this, is that I think we can now demand that the keepers of the data prove the methodology they use for adjustments. I’m not saying they should prove “it doesn’t matter,” which I agree would be the same as a negative. Sorry if that’s how my explanation came off. I think this should happen even if D’aleo and Watts demonstrate that there is not much of a difference. No matter how you slice it, if the methodology is bad, regardless of the quality of the data, the result is bad science.

    Mark

  27. Mark T:

    I had not seen that. Global, or US, btw?

    Global land record (crutemvgl) for one. Full global average is similar (the trend decreases from about 2°C/century to 1.5°C/century for last 30 years, but the uncertainty goes down because you’re averaging over 4x the area).

    I agree with Eric Anderson’s point in #19 regarding this. The satellites have been diverging from the land record as I’ve read, so how much are they agreeing?

    Surface instrumental trends are running about 25% too high (if you use UAH; they agree if you use RSS). My belief is surface temperature should run high because of the difference in what they are measuring (one is measuring boundary layer physics, the other the average of the troposphere above it). UAH and HADCRUT (land+sea temp) disagree by about 2 standard deviations in temperature trend.

    This is where you and I disagree, really. I think the holes have been pointed out, and the errors (biases, really) numerous enough that the originators hold the burden.

    I would say the convention in physical sciences is the burden of proof is on the people raising the issues to show they are significant.

    I don’t think they need to do that. Free all the code, explain why the station count has dropped, justify every step of the algorithm, etc., and people like me will be happy.

    GISTEMP is freely distributed, and fully documented. I’m not sure “justify every step of the algorithm” is required, as long as it is available for other people to look at. And it has been replicated now, several times. Most conspicuous (to me anyway) is Clear Climate Code, and yes, they’ve found issues (and they were fixed).

    The problem is that the “scientific corpus” have proven themselves untrustworthy

    Well, a few of them anyway, and these violations of trust mostly don’t deal with the physical science issues but with the climate impact BS.

    I think this sort of begs the question (particularly given UHI that is apparently not properly accounted for). For that matter, why should the older data have greater uncertainty than the newer data? I’m just curious.

    Because of the smaller Earth coverage. You are effectively averaging over less surface area; the more surface area you average over (until you oversample), the smaller the error of the mean temperature will be. See for example this. (CRUTEM3V, 20-year average.)

    Note that only the last 2 points are statistically different from the mean of the other 7. The bloat in the uncertainty comes from the paucity of data prior to 1950. It does not factor in the systematic uncertainty from a poor sampling of the Earth’s surface, which is likely of the same order.
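    A toy illustration of the averaging point (all numbers invented): the standard error of a mean built from N roughly independent stations falls off like 1/sqrt(N), which is why a sparse early network carries wider error bars than the modern one. Real stations are spatially correlated, so the effective N is smaller than the raw station count.

    import numpy as np

    rng = np.random.default_rng(1)
    true_anomaly = 0.3        # hypothetical "true" global anomaly, C
    station_noise = 0.5       # invented per-station scatter, C

    for n_stations in (100, 1000, 6000):
        sims = true_anomaly + station_noise * rng.standard_normal((2000, n_stations))
        spread = sims.mean(axis=1).std()
        print("%5d stations: std of mean %.3f C (1/sqrt(N) predicts %.3f C)"
              % (n_stations, spread, station_noise / np.sqrt(n_stations)))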

  28. I would say the convention in physical sciences is the burden of proof is on the people raising the issues to show they are significant.

    I would otherwise agree except that this is a different situation than is conventionally uncovered. What has been demonstrated is blatant disregard for proper scientific procedures including (seemingly) willful bias. It’s not like a simple rounding error has been found. We’re at the point that they need to do this simply to save face and recover some dignity.

    Isn’t there a paucity of data after about 1990, too (at least in the US)? That was the point I was driving at, btw, since I understand the errors increase with decreasing sample points (both statistically and due to an analog to the Nyquist problem).

    Mark

  29. I’m not sure “justify every step of the algorithm” is required, as long as it is available for other people to look at.

    Uh, I meant to restrict that to every major adjustment. I don’t think the UHI case has been properly dealt with in particular, and as I recall, there’s significant evidence they aren’t properly accounting for it.

    It is unfortunate that the siting issues uncovered by Anthony will ultimately reduce the number of stations available, which will likewise increase the uncertainty. Imagine what happens when they get to the ROW? Ugh.

    Mark

  30. Mark T:

    particularly given UHI that is apparently not properly accounted for)

    I neglected to comment on this. GISTEMP includes a correction for UHI; there are a few examples that were found where it was being applied backwards (that is fixed now, I believe). If you compare CRUTEM3V with GISTEMP land surface values, CRU is about 1 sigma higher than GISTEMP for the trend over the last 30 years (rough numbers: 2°C/century versus about 1.85°C/century for CRU and GISS respectively). While it is real (it is a systematic difference), it lacks experimental significance (30% chance it could have happened by chance).

    Note, the fact that CRU doesn’t apply a correction and GISS does gives us a measure of how important this effect is… So in that sense, Jones is right… it doesn’t make a systematically significant difference. My rule of thumb is any systematic effect that is larger than 1/10 the observational uncertainty should be applied, so by my standards, GISS is correct in applying it. (The issue here is obvious: if you have 10 systematic errors the same magnitude as your observation error, you’re talking about a huge bloat in uncertainty; leaving even one systematic error of the same magnitude is not excusable.)

    If you use full land+ocean, the difference in the two sets is about 0.05°C/century, close to what Jones 1990 says it is.

  31. No need to make this more complicated than it already is. If we all agree the global mean temperature rose about 0.6 C over the past 100 or so years then what caused it? Anyone who says they know is telling porkies. It’s that simple.

  32. You’ve got a statistician (correct?) and a signal processing engineer discussing data quality. Sorry, Peter, but if there is a way to make it more complicated, I’m not only sure we can, but will.

    Carrick: adjusting for UHI and doing it correctly are two different beasts. If I’m not mistaken, a chunk of that need goes away anyway when looking at the 1 and 2 rated stations. Correct? I think it should still be applied.

    Of course, I’m also a bit leery of a “global temperature” in the first place. Are high altitude or otherwise low atmospheric density locales derated compared to high humidity sea level locales? It seems to me the tropics should dominate the curve (both due to area and atmosphere), yet I don’t recall that being the case. Pielke Sr. argues similarly as I recall (at least, that the global temperature measurement is meaningless).

    Mark

  33. Mark T:

    Isn’t there a paucity of data after about 1990, too (at least in the US)? That was the point I was driving at, btw, since I understand the errors increase with decreasing sample points (both statistically and due to an analog to the Nyquist problem).

    It’s still much better than it was prior to 1950. Now that we have satellite measurements, they may decide to phase out the use of land-based measurements entirely (for climate purposes).

    Also, because wind pulls air across the sensor, for a 24-hour measurement you’re sampling at least a 200-km column of air. Over a month, the effective length is at least 6000 km. (In practice, you are actually sampling a significantly larger area of the temperature field than that, due to the much higher average wind speeds aloft.) This suggests (and is the case) that you arrive at the Nyquist limit much sooner than you might expect, as long as all you want is climate-scale temporal resolution. It shows up as highly correlated temperature fluctuations between well-sited sensors.
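    A quick back-of-envelope check on those advection numbers, assuming a light sustained wind of about 2.3 m/s (the wind speed is just an illustrative value):

    wind_speed = 2.3                    # m/s, assumed light surface wind
    seconds_per_day = 86400
    km_per_day = wind_speed * seconds_per_day / 1000.0
    print("per day:   %.0f km" % km_per_day)          # ~199 km
    print("per month: %.0f km" % (km_per_day * 30))   # ~5962 km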

    It is unfortunate that the siting issues uncovered by Anthony will ultimately reduce the number of stations available, which will likewise increase the uncertainty. Imagine what happens when they get to the ROW? Ugh.

    Siting errors where the environment doesn’t change will have no significant effect, as long as you are well away from heating sources. Central Park NYC is fine for example.

    As I’ve said elsewhere, what Anthony is doing is of real value, even though it is likely to be of more application to the met community than the climate community. (You have to reconstruct the full temperature field for met, and you need a much denser set of points, because typically you want a 6-hr average of temperature, which represents a much smaller spatial sampling, instead of getting away with residuals.)

  34. Actually I’m a physicist, with experimental measurement of the atmosphere one of my specialties (not climate though).

    Carrick: adjusting for UHI and doing it correctly are two different beasts. If I’m not mistaken, a chunk of that need goes away anyway when looking at the 1 and 2 rated stations. Correct? I think it should still be applied.

    I agree it should be applied. See my comment about making systematic corrections that are as large as 1/10 the measurement error (there’s a more technical version involving error budgets and Monte Carlo’s but this rule of thumb works pretty well).

    Are high altitude or otherwise low atmospheric density locales derated compared to high humidity sea level locales?

    3/4 of the area of the Earth is ocean. High altitude locations are a very small subset of the full data set. You could set them to about anything reasonable and it wouldn’t do much to the temperature trends.

  35. Peter of Sydney:

    If we all agree the global mean temperature rose about 0.6 C over the past 100 or so years then what caused it?

    Well, they are reasonably sure that all of the warming prior to 1980 was natural, so “natural causes” is a good explanation (human-generated sulfates canceled the warming from CO2 emissions till then). We’re also reasonably sure CO2 acts as a greenhouse gas, and that we’re increasing it. So we can say with some certainty that the temperature is higher than it would have been without us adding any CO2. The standard climate sensitivity suggests a number around 0.25°C for that (probably less, because there is roughly a 30-year latency between adding CO2 and the full effect from it).

    Beyond that there’s a hell of a lot of natural climate fluctuations going on. It’s like we’re on a rough sea and the tide is rising. Hard to notice a gradual level change with all of that rocking back and forth.

  36. Carrick said
    January 27, 2010 at 4:27 am

    Actually I’m a physicist, with experimental measurement of the atmosphere one of my specialties (not climate though).

    That’s what I thought after I posted. My bad.

    I agree it should be applied. See my comment about making systematic corrections that are as large as 1/10 the measurement error (there’s a more technical version involving error budgets and Monte Carlo’s but this rule of thumb works pretty well).

    Oh, yeah, I saw that. My point was that I believe there is some noticeable difference just using the high quality stations (1 and 2 rated), irrespective of the actual UHI adjustment. I think, though I cannot confirm at the moment, these stations are mostly rural so UHI would not be that great to begin with using them. Of course, 1/2 rated stations are probably a rarity in the ROW. Yay.

    High altitude locations are a very small subset of the full data set. You could set them to about anything reasonable and it wouldn’t do much to the temperature trends.

    Really I was getting at the Arctic, which dominates (seems to) the trend (its trend is quite large, too). The air there is much less dense than the 3/4 of the world that is covered by ocean.

    Mark

  37. Mark T:

    Really I was getting at the Arctic, which dominates (seems to) the trend (its trend is quite large, too). The air there is much less dense than the 3/4 of the world that is covered by ocean.

    It’s a pretty small area… only 4% of the total surface area. But do you think density is lower in the Arctic?

  38. Carrick said
    January 27, 2010 at 5:10 am

    It’s a pretty small area… only 4% of the total surface area.

    Yup, but take a look at the anomaly maps. The Arctic circle is the big player. The southern hemisphere is almost nothing (relatively speaking). Clearly the northern hemisphere is dominating the global trend.

    But do you think density is lower in the Arctic?

    Yes. Lower humidity, I’m guessing. The same reason I get chapped lips in St. Louis in the winter but not in the summer (here, year round, dangit!).

    Mark

  39. #44 Mark T:

    Yup, but take a look at the anomaly maps. The Arctic circle is the big player. The southern hemisphere is almost nothing (relatively speaking). Clearly the northern hemisphere is dominating the global trend.

    And the northern part of the northern hemisphere is also where you have the most obvious oscillations (especially the north Atlantic), and you have many long-running records that show that the current warm period isn’t that much different from the period 70 years ago.

  40. Good point. Trends in oscillatory (and even AR) data are meaningless anyway.

    Carrick, I should also apologize for saying that the 150 year trend was not significant. I was not thinking. I should have said “not much.” I know better and “spoke” without thinking. Though the statistic you offered seems… artificially high, as if there was something added? 🙂

    Mark

  41. MarkT:

    Carrick, I should also apologize for saying that the 150 year trend was not significant. I was not thinking. I should have said “not much.” I know better and “spoke” without thinking. Though the statistic you offered seems… artificially high, as if there was something added?

    Most of the significant trend is from post 1960. If you look at 1850-1960, I don’t find a significant trend.

    Anyway, as I pointed out, it’s circa 1975 to current that matters.

  42. JLKrueger said
    January 27, 2010 at 1:15 am

    No, the satellites are not calibrated against the surface record. They are calibrated against several internal platinum resistance thermometers on the satellite once each Earth scan by aiming at a “warm calibration target” within the instrument. To complete the process a “cold target” is checked, that being the cosmic background which is assumed to be about 2.7 Kelvin. This then provides the “brightness temperature curve” which enables UAH to calculate the Earth temperature.

    Curiosity got the best of me. Given the thermometers you describe, have they been in use for the entire 30 years of readings? or are they a relatively new addition?

    In other words, what was in use before these platinum resistance thermometers, if anything?

    Thanks

  43. Carrick said
    January 27, 2010 at 11:59 am

    Most of the significant trend is from post 1960. If you look at 1850-1960, I don’t find a significant trend.

    Of course, this is the data that is apparently the beneficiary of “man made” warming. 😉

    Mark

  44. I am not at all sure that calculating the CIs for the global trends over the past 100 plus years from selected and adjusted instrumental data really tells the full story of the uncertainty in the estimated trends.

    We do have an independent measure from RSS and UAH satellite measurements and adjustments, but only over the past 30 years. Unfortunately, the major temperature data sets that go back nearly 150 years use mainly the same raw data and do similar adjustments. I should interject here that there are statistically significant differences between data sets in certain global zones and over certain periods of time, and further that I think we concentrate too much on the global trends and not sufficiently on regional and local ones.

    In the end we have no truly independent instrumental measures of temperatures going back in time more than 30 years ago. We can agree we have a better confidence in the last 30 year trend, but the real kicker in pursuing studies of AGW has to be to compare that trend with trends prior to that time.

    I also do not think that the temperature set owners do a comprehensive job in determining all the sources of uncertainty – and further they are not in a position to be motivated to do that. I see many of the set owners as number crunchers and adjusters from afar, obviously unaware of the quality of their product in the field, as was revealed by the Watts team CRN evaluations. Their response to these field evaluations certainly could not inspire much confidence that these owners were motivated to do much more than attempt to defend their data – and again from afar.

  45. Here’s a wacky thought that criticizes the very idea of calculating the “mean” temp anomaly.

    The mean temp anomaly is calculated (simplifying) like this:

    Collect station data (adjust for some things, blah blah blah)

    This is where things first go awry: Subtract the “normal” from a “base period”. The problem is that if we want something which corresponds to how the Earth is responding to a change in radiative balance, WE WANT THE ABSOLUTE TEMP IN K. Radiation is related to the fourth power of the temperature. Right off the bat, this approach gives undue weight to polar locations.

    The second place where things go awry: The data you’ve collected is in daily mins and maxs. They just take these and average them, (min+max)/2, which again disconnects us from the radiative impact, for which max temp is more important.

    The third place things go awry: They average daily anomalies into monthly, giving undue weight to cold days whose impact on radiation is smaller.

    The final place where things go awry: they put the data from stations into grids for area weighting, giving undue weight to the colder places within grid cells.

    Because colder places tend to warm more than warmer places, in spite of having a smaller impact on the radiation balance, this exaggerates the warming as a measure of the planet trying to achieve radiative balance.

    All temperature readings should be taken to the fourth power BEFORE attempting to create an index of global response.
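    A toy illustration of the arithmetic being argued here (the two station temperatures are invented): averaging temperatures in K and averaging the corresponding T^4 “radiative” quantity give different answers when the temperature spread is large. This only shows the arithmetic, not whether the anomaly method the indices actually use is wrong.

    tropical, polar = 300.0, 250.0     # K, hypothetical stations

    simple_mean = (tropical + polar) / 2.0
    radiative_mean = ((tropical**4 + polar**4) / 2.0) ** 0.25

    print("simple mean:    %.2f K" % simple_mean)      # 275.00 K
    print("radiative mean: %.2f K" % radiative_mean)   # ~278.35 K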

  46. “Wait until you see what we have for tomorrow.”

    Now poor Gavin will be up half the night tossing and turning. You should block Gavin’s IP Address. When he takes a laptop to the local WIFI someone can snap a picture of him reading TAV (on his own time of course).

  47. “Honestly, CO2 does catch/retard/slow down heat. It’s just the truth. More CO2, more warming…”

    Sorry, I disagree. To retard the escape of heat is not warming, just as “saving money” on a sale is not always saving. You’re still spending money, just less. As the Kiehl-Trenberth cartoon shows (WG1), 390 w/m^2 leaves the surface. 40 w/m^2 goes straight through (atmospheric window, no absorption). Of the remaining 350, another 26 is radiated upward to space. That process continues all night (or all Polar Winter). That’s net cooling, not warming. Just slower cooling.

    Likewise, the absorption/emission process goes steady-state when enough greenhouse gas (GHG) is present to reduce transmissivity to 0%. Adding CO2 (or any GHG) has no further effect.

  48. Mark T:

    Of course, this is the data that is apparently the beneficiary of “man made” warming.

    Not really. Most of the argument about “man made” warming has to do with cooling the data prior to 1950 or so (especially the mid-19th century warming event). While I’m not saying that people didn’t impose their biases on the data (it happens frequently), I’m not sure there is a lot of leeway in the interpretation of the data since the advent of satellite measurements.

  49. RB:

    To say that there is probably zero cooling trend, man-made or otherwise, is to ignore the physical evidence of sea level change, glacier melt, surface melt in Greenland etc.

    To be clear, who do you think is advocating a “zero cooling trend”?

    Richard Savage:

    retard the escape of heat is not warming, just as “saving money” on a sale is not always saving. You’re still spending money, just less. As the Kiehl-Trenberth cartoon shows (WG1), 390 w/m^2 leaves the surface. 40 w/m^2 goes straight through (atmospheric window, no absorption). Of the remaining 350, another 26 is radiated upward to space. That process continues all night (or all Polar Winter). That’s net cooling, not warming. Just slower cooling. Adding CO2 (or any GHG)has no further effect

    This argument makes no sense.

    You’ve just proven that adding another blanket to your bed doesn’t make you any warmer.

  50. #54, you might find a bunch of comments by raypierre here useful regarding equilibrium conditions under a greenhouse effect:
    http://www.realclimate.org/?comments_popup=2652

    Example:
    [Response: This never happens. As you add more CO2, the stratosphere moves higher up, too. Venus has 300,000 times as much CO2 in its atmosphere as Earth and still hasn’t exhausted its capability to get more greenhouse effect out of CO2. The stratosphere has moved up to where it is under 1% of the mass of the atmosphere. –raypierre]

  51. RB:

    Zeke Hausfather addresses the claims made by D’Aleo/Smith here.

    He didn’t know how to interpret this result or he might have been a bit more low key in his discussion. Leptokurtosis in a distribution is often evidence of human tampering.

  52. Carrick said
    January 27, 2010 at 2:55 pm

    Most of the argument about “man made” warming has to do with cooling the data prior to 1950 or so (especially the mid-19th century warming event).

    Uh, you missed my point. Discounting the satellite measurements for a moment, data for the last 30 years has been shown to be biased upward (even if only by a slight amount), hence it is a beneficiary of “man made” warming (whether it is the majority or not was irrelevant to my point). In that same vein, data prior to 1950 would be a beneficiary of “man made” cooling.

    Mark

  53. To say that there is probably zero cooling trend, man-made or otherwise, is to ignore the physical evidence of sea level change, glacier melt, surface melt in Greenland etc. but one could always shift the goalposts.

    Not sure I get your point here, but it is always instructive to estimate when these events started and how long the glaciers have been melting, the sea level has been changing and Greenland surface melting. I guess that would be determining where the goal posts are located.

  54. Kenneth Fritsch:

    Not sure I get your point here, but it is always instructive to estimate when these events started and how long the glaciers have been melting, the sea level has been changing and Greenland surface melting.

    This is a good starting place:

    The Little Ice Age: How Climate Made History, 1300-1850 (Paperback)

    It includes discussion of climate and agricultural production. It also talks about the advance of the glaciers in the Alps since the start of the LIA. Just an overall good reference, and a fun read.

  55. Mark T:

    data for the last 30 years has been shown to be biased upward (even if only by a slight amount

    Can you link me to where this has been shown? It’s possible that the bias may represent a slight cooling from the real temperature trend, at least for GISTemp (i.e., they may be overcorrecting slightly).

  56. Sorry Carrick (and others… we’re way off topic), perhaps you can explain why you believe that warming prior to 1980 is “natural” and that after 1980 it’s due to humans? There sure was a lot of industrial activity in the 20th century… it didn’t all occur starting in 1980 (or, if you prefer, starting in 1950).

    Or why you believe that the trends for the past 30 years are different than the trends that occurred at other points over the historic temperature record (e.g., between ~1860 and ~1880, or the trend from ~1910 to ~1940)?

    Or how release of a well-mixed gas like CO2 would affect the Northern Hemisphere more than the Southern Hemisphere. Would not the impact be seen on the entire planet?

    Or that sea level change has stayed pretty much the same over the past century (when taking into account changes in measurement technology)?

    Or that some glaciers are melting and some are growing? And, most importantly, that none of the changes are different from what has happened in the past (meaning, we’re within the range of normal variability)????

    Bruce

  57. #57
    Carrick, it does make sense. Adding a blanket isn’t adding energy into the system; it is just retarding the loss of energy already in the system, changing the “thickness” and thus changing the Q. Keeping with the “blanket” example, when you are in bed you are the heat source and all the blanket does is retard the loss of body heat [unless it’s an electric blanket :)]. If you are still feeling “cold” and add another blanket, you didn’t add any heat to your body, you just slowed down the heat you lose. Same principle in changing your windows from old single pane to double pane windows. You didn’t add energy to the system, you just changed the Q.

    However, with that said, the “blanket” analogy for the atmosphere is not the best imho. Unlike a blanket on you at bedtime, the atmosphere has other factors at play, such as air/water currents that affect the nighttime temps in different regions. (Think of a cold or warm front that passes through at night.)

  58. #68, quick comment, as has been stated several times here, it is believed that until the 1970s warming emissions were in balance with cooling aerosols (sulphates) and since then it’s been mostly CO2 (due to anti-pollution measures in the West, I think).

    Meanwhile, on topic, Gavin Schmidt has a response:

    The temperature analyses are not averages of all the stations absolute temperature. Instead, they calculate how much warmer or colder a place is compared to the long term record at that location. This anomaly turns out to be well correlated across long distances – which serves as a check on nearby stations and as a way to credibly fill in data poor regions. There has been no deliberate reduction in temperature stations, rather the change over time is simply a function of how the data set was created in the first place (from 31 different datasets, only 3 of which update in real time). Read Peterson and Vose (1997) or NCDC’s good description of their procedures or Zeke Hausfather’s very good explanation of the real issues on the Yale Forum.
    [Response: Much more relevant is that Watts still, after years of being told otherwise, thinks that the global temperature analyses are made by averaging absolute temperatures. – gavin]

    #69, I’ll give this a humble shot – if you have a leaky bucket, and you turn the tap on and off intermittently, the ‘equilibrium’ average height of the water in the bucket will depend on the arrival rate of water and the leak rate. So also, earth is warmed by both the sun and atmospheric IR, and by reducing the leak rate of the atmosphere by adding a ‘CO2 blanket’, you warm up the surface more by increasing atmospheric IR to the surface of the earth. That this radiation will be negated by an increase in reflecting clouds is the counterclaim, which so far seems to lack strength but is accounted for in the wide uncertainty in climate sensitivity (1.5C-4.5C per doubling of CO2).

    RB: Thanks, yes, I have seen that response before, but that’s not really THE answer, it’s just AN answer. And the answer is telling: “…it is believed…” seems appropriate. Perhaps one could change that description to, “Many in the pro-AGW community wish to believe…”. 😀

    Also, the blanket analogy is poor, but lets go with it for a moment…

    Imagine that you’re chilly at night. You decide to add a blanket to prevent heat loss. Sometime later, you are still chilly, and add another… repeat the process 10 times till you have a pretty hefty bundle of blankets on you. What is the relative value of heat retention added by that last blanket compared to the next-to-last blanket? Sure, the first one really helps. The second one also helps. But after a while, you can’t really notice the difference between blanket 10 and 11.

    Bruce

  60. #70

    Flawed analogy – normal blankets don’t radiate a thing, they just slow the loss rate. Also, the energy that the CO2 is “re-radiating” is just the energy lost by the planet. So again, you just slowed loss, not gained warmth; no new energy was input to the system. To gain warmth you have to add more energy to the system; this is basic thermodynamics. Again, that is why I don’t like the “blanket” analogy for the atmosphere, since it doesn’t take into account the complexity of the atmosphere, but if you are going to argue the point about blankets he is right, since you are only dealing with heat/energy loss. And your bucket analogy is flawed as well, since a smaller hole doesn’t add more water to the bucket, it just slows the loss of what’s already in the bucket.

    You also haven’t seen the latest work on clouds try here:

    Click to access Spencer-and-Braswell-08.pdf

    http://www.drroyspencer.com/research-articles/satellite-and-climate-model-evidence/

    http://www.drroyspencer.com/2009/12/little-feedback-on-climate-feedbacks-in-the-city-by-the-bay/
    (The second and third links are tied together. The work discussed in the second link is what was presented at the Fall AGU meeting in the third link.)

  61. 71-Actually, assumed for convenience is more accurate-there are no measurements of the aerosols before the seventies, so their effect during the twentieth century is completely up in the air.

    And of course, even the measurements we have now are inadequate to really understand what they are doing.

    #72, I’m not sure if you are arguing against the greenhouse effect, but defining warmer = higher temperatures and gaining warmth = increasing temperature, isn’t the point that, based on different temperature gradients above the surface between earth and space, you have different equilibrium temperatures for the surface in accordance with the Stefan-Boltzmann law, until temperatures rise to a point where net power absorbed = net power radiated?

  63. #71, you are correct, but climate scientists know that defining sensitivity as, say, 1.5C-4.5C per doubling is only valid for small changes in the temperature. Which is why temperature anomalies have something close to a log-relationship with CO2 concentrations – not a linear one.

  64. #74

    No, what I’m saying is that Carrick’s statement about a blanket is incorrect, since it deals with nothing but conduction; however, I said it twice before that the analogy is flawed, since the atmosphere is not a blanket. So if you base your refutation of the original poster on the idea that a blanket “warms” you, you are wrong based on the laws of thermodynamics. A blanket does nothing but slow the loss of heat.
    See this:
    http://hyperphysics.phy-astr.gsu.edu/Hbase/thermo/heatcond.html

    #76, analogies are meant to convey something similar, but not exactly identical. I find the blanket analogy adequate even if it warms differently from, say, a space heater; your mileage obviously varies.

  66. Jeff Id said
    January 26, 2010 at 9:44 pm

    Wait until you see what we have for tomorrow.

    Is it tomorrow yet?

    😉

  67. BDAABAT :

    RB: Thanks, yeas, have seen that response before, but that’s not really THE answer, it’s just AN answer. And the answer is telling: “…it is believed…” seems appropriate. Perhaps one could change that description to, “Many in the pro-AGW community wish to believe…”.

    The science says man-made warming started roughly 1980. Of course that’s what’s believed, what else is one supposed to say?

    What’s an alternative explanation that has any founding in science?

    Imagine that you’re chilly at night. You decide to add a blanket to prevent heat loss. Sometime later, you are still chilly, and add another… repeat the process 10 times till you have a pretty hefty bundle of blankets on you. What is the relative value of heat retention added by that last blanket compared the the next to last blanket? Sure, the first one really helps. The second one also helps. But after a while, you can’t really notice the difference between blanket 10 and 11.

    If you put a thermometer under the blanket with you and you add another blanket, guess what? The temperature on the thermometer rises. And it stays higher than it would have been without the blanket. Your description of the physics is wrong here. You do notice a difference of course.

    Same thing happens with a house with a heater with a fixed BTU output. If you add more insulation, your house will stay warmer in cold weather. “Duh”.

    Even if I take a passive system, insulation obviously reduces heat loss. Think coffee in a paper cup versus styrofoam cup with lid. Which stays warmer longer?

    The Earth, with a periodic heating source that is the Sun, will stay warmer if the atmosphere retards the loss of heat than it would if the heat loss were retarded less. Adding CO2 reduces heat loss, ergo it stays warmer.

  68. RB, let’s leave analogies alone – they may be illustrative but can never prove a point. Surprised you are quoting Gavin, as he is so much “damaged goods” after the emails, but then I might be guilty of an ad hom argument here. Anthony understands how anomalies work, and Gavin knows he does, but Anthony is addressing the fact that grid cells are infilled from nearby readings, and with the catastrophic reduction in the number of stations over the last decade the selection of those remaining stations becomes increasingly critical to the output data. If anybody wants to form a view as to the quality of the data Gavin has been working with, they only have to look at the “Harry” emails – here is clear evidence from within the CRU that the raw temperature data has been abused beyond recovery.

  69. RB @ Post #70:

    Surely it is readily understood that what Gavin Schmidt has to say about the proper use of anomalies, or even Watts’ understanding of that use, has little to no relevance to the discussion of the uncertainty of the temperature data sets, or the quality control of the measurements in the field and its effects on uncertainty, or which stations are removed from the official record and how that might increase (or even decrease) uncertainty.

    It is a rather simple concept that an anomaly addresses relative temperature changes, whereas absolute temperature alone does not, but that absolute temperatures are used to calculate an anomaly – so where is the magic?

    I would say that I sometimes see people not thinking clearly about a station currently observed to be warmed non-climatically, or a station with a large UHI, and the effects of those stations on a temperature trend. Those stations do not necessarily add to a warming trend unless the non-climatic effects changed over time and in a warming direction. That is why, when the Watts team finds a station where they can by observation indicate a current non-climatic warming, that is a start, but not sufficient to show what effect the change has on the temperature trend. Some statistical analyses like those that RomanM did on the data (and I recently linked to here at TAV) can help, but more is needed, like a detailed study that might reveal/predict how the change observed in current time progressed over time.

    I suspect that finding someone interested and motivated in doing that study is not likely coming from the climate science community and surely not by the data set owners.

  70. Carrick:

    What does the science say about warming prior to 1980 being natural and that after 1980 being “human influenced”? Actually, what the science says is, “I don’t know.”

    What the science says is that what we’ve experienced so far is not outside what has occurred in the past (see the rest of my post about earlier temperature increases in certain intervals that are similar to what we’ve experienced recently… which you’ve stated was prior to the period being due to human influence).

    What some in the pro-AGW group say is, “It’s all due to humans and it’s all BAAADDD!!!”. Just saying it’s so isn’t science.

    Timetochooseagain #73 has it correct.

    BTW: Continuing with the admittedly flawed blanket analogy… yes, I acknowledged that adding a blanket will result in retained heat. What I asked was, what’s the differential effect of adding the 10th or 11th blanket? Probably measurable if one has an extremely sensitive instrument, but maybe not. What’s the differential when adding the 100th blanket? At some point, adding more blankets isn’t going to make a difference.

    Bruce

  71. Boballab

    No, what I’m saying is that Carrick’s statement about a blanket is incorrect since it deals with nothing but conduction; however, I have said twice before that the analogy is flawed since the atmosphere is not a blanket. So if you base your refutation of the original poster on the claim that a blanket “warms” you, you are wrong based on the laws of thermodynamics. A blanket does nothing but slow the loss of heat.
    See this:
    http://hyperphysics.phy-astr.gsu.edu/Hbase/thermo/heatcond.html

    OMG.

    Another lecture on thermodynamics. I thought I was done with that when I finished grad school. Seriously, what was the point of the link?

    Since you want to discuss the physics… if your body is radiating heat at the rate q = dQ/dt, then at every point from the interior of the blanket to the surface we have, from Fourier’s law, dT/dz = q/k (a constant temperature gradient), where z is the distance from the top surface of the blanket into the interior, “k” is the thermal conductivity, and dT/dz is the temperature gradient. Note that if q and k are constant with z within a material (for steady state they will be), then so is the temperature gradient dT/dz.

    This is easy to solve: T(z) = T(0) + q/k * z. If “H” is the thickness of the blanket(s), then the temperature at the bottom of the blanket (in contact with your body) is

    T(H) = T(0) + q/k * H

    If you double H, you obviously get a higher steady-state temperature. It’s just physics (and common sense).

    As to “the atmosphere is not a blanket”… don’t confuse the mechanism for the reduction in heat loss with the physical processes at work. For the atmosphere the reasons why it works are a bit more complicated, but the end result is the same.

    The additional CO2 causes the insulating layer of the atmosphere to get a bit thicker, but dT/dz (called the “lapse rate”) is the same, so the temperature at the bottom increases.
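
    A minimal numerical sketch of the relation T(H) = T(0) + q/k * H above – the flux, conductivity and temperatures are made-up illustrative values, not measurements:

```python
# Steady-state conduction through a slab: T(H) = T(0) + (q/k) * H.
# All numbers below are illustrative assumptions, not measured values.

def interior_temperature(t_outer_c, q_w_per_m2, k_w_per_m_k, thickness_m):
    """Temperature at the warm side of a slab carrying flux q in steady state."""
    return t_outer_c + (q_w_per_m2 / k_w_per_m_k) * thickness_m

T_outer = 20.0   # assumed room-side surface temperature, deg C
q = 50.0         # assumed body heat flux through the blanket, W/m^2
k = 0.04         # assumed blanket thermal conductivity, W/(m*K)

for H in (0.01, 0.02):  # single vs. doubled blanket thickness, metres
    print(H, interior_temperature(T_outer, q, k, H))
# Doubling H doubles the interior-exterior temperature difference:
# 12.5 C across one thickness, 25.0 C across two.
```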

  72. BDAABAT:

    What does the science say about warming prior to 1980 being natural and that after 1980 being “human influenced”? Actually, what the science says is, “I don’t know.”

    Please stop with this sophism.

    The science very clearly says prior to 1980 the warming was natural. Want references?

    You might argue that the science is wrong, but that’s what it says.

  73. TTCA:

    #71 – Actually, “assumed for convenience” is more accurate – there are no measurements of the aerosols before the seventies, so their effect during the twentieth century is completely up in the air.

    Well, it’s not like we don’t have any measurements, just not direct ones. Economic activity and knowledge of the methods of production give you information on aerosol release, and models of dispersion of aerosols give you information on climate. I believe that ORNL factors this information into its estimates.

    Even today, aerosol levels are notoriously poorly instrumented. As I understand it, in the GCMs the forcings over time are usually “fit” parameters of the models.

  74. Carrick @84,
    ahh yes, thanks for correcting me re: temperature gradients when what I was trying to get at was more a cross-sectional view of temperature profile.

  75. Carrick I’m not going to argue with you about it, but you might want to go back and review basic thermodynamics, because when you have two objects in proximity at different temperatures, heat will transfer from the warmer to the cooler and it can never be stopped, only slowed. If the temperature of the room outside of your blanket is lower than the temperature of your skin, heat will flow from your skin to the room. Throwing a blanket on will not stop that heat transfer, only slow it down. It also did not make it “warmer” for you; it just slowed the amount of heat loss you experience. This will continue until you and the room reach equilibrium, either by you warming the room to the same temperature as your skin or by you dying and assuming room temperature. For you to get “warmer” you would need an outside heat source at a higher temperature than your skin. Trying to argue against that is trying to argue for a perpetual motion machine of the second kind.

  76. The problem with the blanket analogy is that it doesn’t address the difference in wavelength of incoming vs outgoing radiation. We all know the transmission bias of incoming to outgoing energy. That’s why the greenhouse analogy is so appropriate, IMO. Even though the greenhouse warming in a typical greenhouse is related more to the trapping of convection, glass itself has far better transmission for visible incoming than for infrared outgoing.

  77. Boballab:

    Carrick I’m not going to argue with you about it, but you might want to go back and review basic thermodynamics, because when you have two objects in proximity at different temperatures, heat will transfer from the warmer to the cooler and it can never be stopped, only slowed.

    That is an incorrect statement of the problem.

    The rate of transfer of heat (q = dQ/dt) is a given, and the temperature on the outside surface of the blanket is a given. You compute this using Fourier’s Law of Heat Conduction, from which you conclude there is a constant temperature gradient, given by dT/dz = q/k. For the blanket problem, the rate of transfer of heat is neither slowed nor stopped, and the temperature gradient depends only on the amount of heat transferred, q, and inversely on the thermal conductivity k.

    This is 2nd semester freshman physics.

    You can confirm this with the calculator you yourself linked.

    http://hyperphysics.phy-astr.gsu.edu/Hbase/thermo/heatcond.html

    Set up a problem, double the thickness, calculate the new Thot (interior temperature). What happens?

    Ans: Temperature difference from interior to exterior doubles.

  78. #91, I fully believe in the ability of CO2 to capture/retard heat flow, and the blanket analogy is reasonable, but like most analogies it isn’t perfect. I don’t believe we know how much warming is caused by the CO2, and I believe even less that a bunch of skeptics will take my opinion over their own 😀 .

  79. “For the blanket problem, the rate of transfer of heat is neither slowed nor stopped ”

    This has to do with a steady-state understanding of the problem. Actually the heat is slowed, which doesn’t disagree with the statement but rather clarifies it. An individual joule of energy takes more time to re-emit, which causes more joules to be present in the lower troposphere and thus the warming. Net joules in still equal net joules out exactly – the flow is equal – and the total energy input doesn’t change (Sun/internal heating is the same). However, the time for an individual joule of energy to be re-emitted is increased. Again, this is not an argument with the correct equation but rather a clarification for whoever doesn’t follow thermo.

  80. Nitpick:

    Water vapor makes a fairly trivial contribution to air density. The air at sea level in the tropics is less dense than the air at the poles because the temperature is higher in the tropics. P is constant, so n/V (effectively the density, equal to P/RT) has to increase as the temperature goes down. The lower density means the tropopause is at higher altitude and the temperature difference between the surface and the tropopause is higher in the tropics than at the poles. The end result of this is that radiative forcing from doubling CO2 is greater by about a factor of three in the tropics than at the poles.
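
    A rough sanity check on the density point, using the ideal gas law with round illustrative surface temperatures rather than observed values:

```python
# Ideal-gas density rho = P*M/(R*T): warmer surface air is less dense.
R = 8.314      # J/(mol*K), gas constant
M = 0.02897    # kg/mol, molar mass of dry air
P = 101325.0   # Pa, sea-level pressure

def air_density(T_kelvin):
    return P * M / (R * T_kelvin)

print(air_density(300.0))  # ~1.18 kg/m^3, warm "tropical" surface air
print(air_density(250.0))  # ~1.41 kg/m^3, cold "polar" surface air
```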

  81. Jeff ID:

    Actually the heat is slowed which doesn’t disagree with the statement but rather clarifies it.

    In the sense I meant it, the rate of heat transfer q = dQ/dt isn’t “slowed”, in steady state it’s the same.

    If you make a sudden change in q, you’re correct that observing the shift in q on the surface will take longer with a thicker blanket. We use this principle to thermally isolate sensors from external (usually atmospheric) temperature changes.

  82. #96 Completely agreed. It’s the difference between steady state and perturbations to the system. I think the difference is easily lost if you haven’t studied thermo.

  83. Carrick, I don’t have one “source,” but a general compilation from reading CA, WUWT, and here (as well as Pielke Sr.’s work, IIRC). The general “read” of everything is that recent adjustments underestimate UHI, which introduces a warm bias in the record. There are also a few other issues I recall, such as the bucket adjustment, though I don’t recall the resolution to that matter (or if there ever was one).

    I suppose if I were so inclined I could go out and search, but I’m not. It’s really moot given we both agree many adjustments have pushed the past down, which introduces the same general trend bias (with possibly different magnitudes) – my comment was really just a funny snark anyway. Arguing magnitudes is not worthwhile when we have ample evidence the cooks have been behaving badly – once that is fixed the issue of magnitudes will likely go away. 🙂

    While I agree it is nice to have satellite measurements now, we cannot stick them onto the end of the surface record prior to the existence of the satellites for obvious reasons. As a result, knowing what the satellites say over the last 30 years doesn’t tell us much about the previous 120.

    Mark

  84. Along the lines of the blanket analogy discussion, something I keep reading but cannot find a justification for is the statement “Those are colder layers, so they do not radiate heat as well,” from the following description found on RealClimate (the “a-saturated-gassy-argument” post). I have read this elsewhere.

    My issue is with the colder layers not radiating as efficiently until they get warmer. But would that not also mean they would not absorb radiation as efficiently either? Both emission and absorption are affected equally by the surrounding temperature and pressure – no?

    The blockquote tag above did not work so well. The paragraph from RealClimate is:

    “What happens if we add more carbon dioxide? In the layers so high and thin that much of the heat radiation from lower down slips through, adding more greenhouse gas molecules means the layer will absorb more of the rays. So the place from which most of the heat energy finally leaves the Earth will shift to higher layers. Those are colder layers, so they do not radiate heat as well. The planet as a whole is now taking in more energy than it radiates (which is in fact our current situation). As the higher levels radiate some of the excess downwards, all the lower levels down to the surface warm up. The imbalance must continue until the high levels get hot enough to radiate as much energy back out as the planet is receiving.”

    My question is regarding:

    the colder layers not radiating as efficiently until they get warmer. But would that not also mean they would not absorb radiation as efficiently either? Both emission and absorption are affected equally by the surrounding temperature and pressure – no?

  86. Kevoka:

    Both emission and absorption are affected equally by the surrounding temperature and pressure – no?

    I don’t think they are talking about emissivity or absorption being affected by temperature. I think they are just saying that you get fewer emissions when the atom/molecule is cooler (T^4 dependence from Planck’s Law), it’s “less efficient”. I don’t think the average kinetic energy of the atom/molecule matters unless you are referring to bending modes being excited by collisions.

  87. Carrick: Not sophism. Simply saying that there isn’t enough information to make an accurate and testable and verifiable determination that what took place prior to 1980 was “natural” vs what happened after 1980. Lack of measurements of the variables involved means that there really can be no way to ascertain whether previous warming was solely due to man or nature. Yes, I understand that folks have used this threshold as a line in the sand for human warming vs. natural. Just stating that those claims cannot be validated, tested, and shown to be accurate because there isn’t enough data to support the statement.

    What IS reasonably well known is that similar temperature changes to what we are experiencing currently have taken place in the past. If temps have increased at similar rates and at similar amounts in the past (especially during times when you’ve argued that those changes could not have been due to humans), why then do you believe that what we are experiencing now is anything horrid and solely due to humans?

    Bruce

  88. With the blanket analogy, I think to talk of heat being “slowed” confuses the issue. What a blanket warming someone in bed, and atmospheric transmission, have in common is a prescribed flux that must pass through. In the case of the person it’s the ~100W resting metabolic heat, and for the atmosphere it’s the 235 W/m2 heat derived from thermalised solar. For a heat flux to pass through a medium, a temperature differential is required. If the thermal resistance increases, and the temperature at the end of the path is fixed, the temperature at the start must rise – warming.

    For the blanket, the end-of-path temperature is ambient; an extra blanket increases the temperature difference. For the atmosphere, the effective radiating temperature (~242 K) is pretty much what is required for BB radiation of ~195 W/m2 (after subtracting the 40 W/m2 or so that is radiated directly to space through the atmospheric window). Adding thermal resistance to the pathway again increases the temperature difference required to pass that flux, and hence the temperature at the bottom.

    The adiabatic lapse rate and the partly absorbed radiation bands complicate the argument.
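
    As a quick check on the flux figures quoted above (a sketch using the Stefan-Boltzmann constant and the 235 and 195 W/m2 values from the comment):

```python
# Blackbody temperature needed to radiate a given flux: T = (F / sigma)**0.25.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def bb_temperature(flux_w_per_m2):
    return (flux_w_per_m2 / SIGMA) ** 0.25

print(bb_temperature(235.0))  # ~254 K, whole-Earth effective emission temperature
print(bb_temperature(195.0))  # ~242 K, after removing ~40 W/m^2 of window emission
```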

  89. BDAABAT :

    … saying that there isn’t enough information to make an accurate and testable and verifiable determination that what took place prior to 1980 was “natural” vs what happened after 1980.

    Again, that’s your judgement call. It’s not what the science says.

    I find the estimates reasonable, you don’t. Either way, they don’t impact what the current scientific reports actually say.

  90. Nick Stokes: “The adiabatic lapse rate and the partly absorbed radiation bands complicate the argument.” As I understand it, because convection is responsible for most of the heat transport in the atmosphere, ultimately the effect is entirely due to the fact that 1) the lapse rate stays constant, and 2) the height of the tropopause increases slightly. I believe the main effect of the increased thermal resistance is to increase the effective convective height of the atmosphere.

    As I see it, the result is a pretty simple consequence of the fact that the lapse rate is driven by atmospheric convective physics (it’s pretty much a fixed number in other words, unless you change moisture content), and if you have more heat being absorbed by the atmosphere due to the increase in CO2, the circulation associated with it has to stretch to maintain the same lapse rate.

  91. “Adding thermal resistance to the pathway again increases the temperature difference required to pass that flux, and hence the temperature at the bottom.”

    Nice way to put it – sounds as simple as Ohm’s law.

  92. This effect was noted by Ekholm back in 1901 (via the comments linked to earlier):

    . . . radiation from the earth into space does not go directly from the ground, but on the average from a layer of the atmosphere having a considerable height above sea-level. . .The greater is the absorbing power of the air for heat rays emitted from the ground, the higher will that layer be. But the higher the layer, the lower is its temperature relatively to the ground; and as the radiation from the layer into space is the less the lower its temperature is, it follows that the ground will be hotter the higher the radiating layer is.

  93. JLKrueger,

    ” They are calibrated against several internal platinum resistance thermometers on the satellite once each Earth scan by aiming at a “warm calibration target” within the instrument. To complete the process a “cold target” is checked, that being the cosmic background which is assumed to be about 2.7 Kelvin. This then provides the “brightness temperature curve” which enables UAH to calculate the Earth temperature.”

    I understand that is what Dr. Spencer has told us.

    Now, how do you get from an instrument calibration with no atmospheric interference to an estimate of the temperature of something being measured through the atmosphere????

    Please think about this. Without some kind of benchmark, like Radiosonde temp measurements, it is not possible to have a reasonably definitive understanding of what the calibrated instrument is reading.

  94. RB:

    Nice way to put it – sounds as simple as Ohm’s law.

    This simple explanation works as long as you ignore convection, e.g. a “crystal atmosphere”.

    But increased thermal resistance isn’t the mechanism responsible for the atmospheric greenhouse effect in an atmosphere where convection can occur. Ignoring corrections from weather, in the convective zone of the atmosphere you have a fixed lapse rate of dT/dz = 6.5°C/km. Increasing the thermal resistance does not change dT/dz, nor does it change the equation for radiative balance at the top of the atmosphere, so the only way you can get an increase in temperature is if the convective layer of the atmosphere increases in height.

  95. JLKrueger,

    Another issue is with the multiple splices from the different satellites that have been used to make up the satellite record. Each splice required extensive work benchmarking the old against the new and rechecking the onsite benchmarks. The fact that there was at least one splice with little overlap complicates this. All in all a problem for something that is claiming such small deviations over such a large period of time.

    There is the claim that the records match well. I can as well say that the records don’t match at all. The wiggles don’t match and the trends don’t match. The amount of mismatch changes continuously. The sign of the wiggle doesn’t even always match.

  96. kuhnkat:

    Now, how do you get from an instrument calibration with no atmospheric interference to an estimate of the temperature of something being measured through the atmosphere????

    Please think about this. Without some kind of benchmark, like Radiosonde temp measurements, it is not possible to have a reasonably definitive understanding of what the calibrated instrument is reading.

    The effect of the atmosphere can be computed from first principles by simply inputting the known structure of the atmosphere itself. In fact, the profiler relies on this (it slices the atmosphere by measuring it at different angles theta, and reconstructs the vertical profile from the measured luminosity L(theta)).

    Radiosondes serve as a test and an independent means of measuring, not a means of direct calibration in the normal sense of the word “calibration”.

  97. Carrick,

    your cute quote from Ekholm shows the limitations of the day. He apparently did not know about the temperature inversion in the atmosphere. The thermosphere far exceeds the temps of the surface. If he was talking about total energy he might have been correct, as the density is so small that the huge temperature carries little total energy.

    Now, since the temps invert at the tropopause, what do you think about the fact that there are TWO (2) layers in the atmosphere that are actually at the temp of the most common LW emission to space???

    An explanation of how we know which layers in the atmosphere are emitting how much would help greatly.

  98. Carrick,

    “The effect of the atmosphere can be computed from first principles by simply inputting the known structure of the atmosphere itself. In fact, the profiler relies on this (it slices the atmosphere by measuring it at different angles theta, and reconstructs the vertical profile from the measured luminosity L(theta)).”

    What we know about the atmosphere is that it is continuously changing in wind directions, density, and temperature. Without these data you introduce uncertainty. How certain are you??

  99. Re: kevoka (Jan 28 02:07),

    Both emission and absorption are affected equally by the surrounding temperature and pressure – no?

    Nomenclature is very important here. There is emissivity and absorptivity and then there’s emission and absorption. Radiant energy can be absorbed, reflected or transmitted. Transmissivity, reflectivity and absorptivity are expressed as numbers from zero to one. At any given wavelength the sum of transmissivity, absorptivity and reflectivity is exactly equal to one. These quantities are functions of wavelength and concentration and are determined for gases, where reflectivity can be ignored, by the structure of the molecules involved. For small molecules like CO2 and H2O, emission and absorption can only occur at particular wavelengths or lines. The energy levels and transitions between them that determine the wavelengths of the lines can be calculated from first principles. The strengths of the individual lines vary over many orders of magnitude. Temperature and pressure affect the width and height of the lines through Doppler and pressure broadening. Pressure broadening dominates in most of the troposphere. So knowing the wavelength, line strength, temperature, pressure and the pressure broadening coefficients (all in the HITRAN database) one can calculate the absorptivity at any wavelength by summing over all lines at that wavelength. This is what a line-by-line radiative transfer program does.

    Emission is the amount of radiant energy emitted by a volume of gas or from the surface of an opaque solid. Emission from a surface at any wavelength is determined by the emissivity at that wavelength (a number from zero to one) multiplied by the value of the Planck function at that wavelength. The value of the Planck function at any wavelength depends only on temperature. The integral of the Planck function over all wavelengths at constant emissivity, or the total power emitted by a black or gray body, is known as the Stefan-Boltzmann equation. The total power varies as the fourth power of the temperature. Because the emissivity of the atmosphere is not constant with wavelength, the Stefan-Boltzmann equation does not apply.

    Molecules can transfer energy by radiation or inelastic collision with other molecules. When collision transfer dominates over radiative transfer, the molecules are said to be in local thermal equilibrium. The energy distribution of the molecules follows the Boltzmann distribution so the fraction of molecules in an excited state (capable of radiation) is fixed by the energy of the excited state and the temperature. Emissivity then is equal to absorptivity, or Kirchhoff’s Law.

    So to answer the original question, no, emission and absorption are not equally affected by the temperature and pressure. Absorption varies weakly with temperature while emission varies strongly with the temperature.
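
    A small sketch of the quantities described above – the Planck spectral radiance at one wavelength and the sigma*T^4 total – with an illustrative wavelength and temperatures (surface, effective emission level, tropopause):

```python
# Planck radiance B(lambda, T) and the Stefan-Boltzmann total sigma*T^4.
import math

H = 6.626e-34     # Planck constant, J s
C = 2.998e8       # speed of light, m/s
KB = 1.381e-23    # Boltzmann constant, J/K
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def planck_radiance(wavelength_m, T):
    """Spectral radiance, W per m^2 per steradian per metre of wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * KB * T)) - 1.0
    return a / b

wl = 15e-6  # 15 microns, near the centre of the CO2 band
for T in (288.0, 255.0, 217.0):
    print(T, planck_radiance(wl, T), SIGMA * T**4)
# Both the radiance at a fixed wavelength and the T^4 total fall as the emitting
# layer gets colder, which is the sense in which a colder layer "radiates less well".
```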

  100. #113, Kuhnkat:

    Now, since the temps invert at the tropopause, what do you think about the fact that there are TWO (2) layers in the atmosphere that are actually at the temp of the most common LW emission to space???

    It’s not that hard to explain, and you’re making it needlessly complex. If you look at a vertical temperature profile, you have a nearly constant lapse rate of 6.5°C/km to the top of the troposphere. Where you “break” from this lapse rate, the tropopause, is where you obtain radiative balance. This discusses the situation in more detail including modeling results and experimental measurements of the increase in height of the tropopause since the onset of global warming.
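
    A sketch of the “higher emission level means warmer surface” argument, assuming a fixed 6.5 K/km lapse rate and a 255 K effective emission temperature; the 150 m shift is purely illustrative:

```python
# Surface temperature if emission effectively occurs at height z on the lapse line.
LAPSE = 6.5    # K per km, assumed environmental lapse rate
T_EFF = 255.0  # K, effective emission temperature for ~240 W/m^2

def surface_temp(emission_height_km):
    return T_EFF + LAPSE * emission_height_km

z_eff = (288.0 - T_EFF) / LAPSE    # ~5.1 km for a 288 K surface
print(z_eff, surface_temp(z_eff))  # recovers 288 K
print(surface_temp(z_eff + 0.15))  # lifting the emission level ~150 m warms
                                   # the surface by ~1 K at fixed lapse rate
```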

  101. Re: KuhnKat (Jan 28 11:33),

    Now, how do you get from an instrument calibration with no atmospheric interference to an estimate of the temperature of something being measured through the atmosphere????

    It’s not really fair to try to reduce that subject to a paragraph or two comment. If you really want to know, then I suggest you obtain a copy of A First Course in Atmospheric Radiation or some other similar textbook. There’s a whole chapter devoted to remote sensing.

  102. kuhnkat:

    What we know about the atmosphere is that it is continuously changing in wind directions, density, and temperature. Without these data you introduce uncertainty. How certain are you??

    Those atmospheric data are known (they get measured independently), so we can put that into the inverse solution as necessary. As to the uncertainty, there are ways to estimate it using Monte Carlo methods, and then one can compare it with radiosonde data to obtain an empirical measure of uncertainty.

    Again you are needlessly making it sound more complicated than it is.

  103. Re: Carrick (Jan 28 12:44),

    Where you “break” from this lapse rate, the tropopause, is where you obtain radiative balance.

    Not really. That’s approximately true for the CO2 band, but not the rest of the spectrum. For example, for the 1976 standard atmosphere with a surface temperature of 288.2 K, the altitude for the brightness temperature of the planet, 255 K, is 5 km but the tropopause is at 11 km. Even if you subtract 40 W/m2 for surface emission directly to space, the brightness temperature of the atmosphere is 242 K, that’s 7 km. Reducing very complex behavior to a single number is oversimplifying. The 1976 standard atmosphere is itself an oversimplification.

  104. Jeff Id, I would suggest that you get DeWitt Payne to do a post here at TAV as an exposition of what the chemistry and physics of the atmosphere, as they relate to GHG levels, say we know conclusively about these effects and what remains uncertain.

    He does a good job of explaining what we know. I think sometimes we waste a lot of time with discussions that tend to ignore what we know about the settled part of the science, and which as a result takes away from the more important discussions, in my view, of the uncertainties of the science and what we know we do not know.

  105. DeWitt:

    Not really. That’s approximately true for the CO2 band, but not the rest of the spectrum. For example, for the 1976 standard atmosphere with a surface temperature of 288.2 K, the altitude for the brightness temperature of the planet, 255 K, is 5 km but the tropopause is at 11 km. Even if you subtract 40 W/m2 for surface emission directly to space, the brightness temperature of the atmosphere is 242 K, that’s 7 km. Reducing very complex behavior to a single number is oversimplifying. The 1976 standard atmosphere is itself an oversimplification.

    I believe it’s true for all of the greenhouse gases, not just CO2.

    Am I mistaken?

    As to “oversimplifying”, certainly one could go to a 3-d nonlinear time-domain model, but what does that teach us? There needs to be some balance between the descriptions of the details and the underlying physical phenomenon.

  106. Carrick @110,
    From your original equation:
    dT/dz = q/k

    Compare this with an Ohm’s law formulation of the form:
    dV = I * (R dz)
    where R is the resistance per unit length dz, dV is equivalent to dT, and I (the dq/dt of charge) is equivalent to the dQ/dt of heat.

    The two would be equivalent, no?
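
    A sketch of that correspondence – heat flow through a slab behaves like current through a resistor, with temperature in the role of voltage; the slab properties below are illustrative:

```python
# Fourier/Ohm analogy: delta_T = Q_dot * R_thermal, like V = I * R.
def thermal_resistance(thickness_m, conductivity_w_per_m_k, area_m2=1.0):
    """R_thermal = L / (k * A), the analogue of electrical resistance."""
    return thickness_m / (conductivity_w_per_m_k * area_m2)

def temperature_drop(heat_flow_w, r_thermal):
    return heat_flow_w * r_thermal

R1 = thermal_resistance(0.01, 0.04)    # one assumed "blanket" layer
print(temperature_drop(50.0, R1))      # ~12.5 K across one layer
print(temperature_drop(50.0, 2 * R1))  # resistances add in series: ~25 K across two
```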

  107. In reference to the 1976 standard atmosphere, of course I recognize its limitations.

    Our group uses the 3-d profiles from an empirical code called g2s, and we augment these with radiosonde measurements.

  108. #122 RB, you are correct. Fourier’s law is an analog to Ohm’s law.

    The issue here is with interpretation. As I read Nick Stokes (apologies if I misunderstood him), adding CO2 increases the resistance (1/k) so dT/dz increases. If you have a constant atmospheric layer, a larger dT/dz equals a higher surface temperature since we can consider the temperature at the “top” to be a constant.

    The problem is that dT/dz is a constant and q is a constant, which means that R = 1/k doesn’t depend on CO2 content. This 1/k comes from atmospheric convective properties, and as I see it, the effect of increased CO2 is from the increase in atmospheric convection.

  109. Carrick, you are correct that the description doesn’t capture the fact that the increased resistance comes from an effective increase in atmospheric height, not from an equivalent change in material properties from an Ohm’s law perspective.

  110. RB:

    I agree that the net resistance increase is due to a length change, not due to a resistance/unit length change.

    Well yes, I agree with this and it’s a pretty simple problem to calculate because for our purposes we can take dT/dz and hence 1/k to be constants.

    Here’s the issue though… I don’t think the resistance you get, H_stratosphere/k_convection, is in any way related to CO2 content.

  111. Or going back to our original analogy, if we haven’t beaten this to death already, adding more CO2 is equivalent to making the blanket thicker, not denser.

  112. RB:

    Or going back to our original analogy, if we haven’t beaten this to death already, adding more CO2 is equivalent to making the blanket thicker, not denser.

    LOL. Too late for that.

    But yes I agree.

  113. Re: Carrick (Jan 28 13:31),

    I believe it’s true for all of the greenhouse gases, not just CO2.

    The atmospheric emission spectrum calculated by MODTRAN includes Planck curves for different temperatures. So for any point on the spectrum, which is in cm-1 (effectively frequency) rather than wavelength, one can estimate a brightness temperature by interpolating between the curves. If you select the save text option, there are tables that include the temperature with altitude so one can then estimate the effective emission altitude. The brightness temperature in the CO2 band centered at 667 cm-1 is over 220 K. The temperature at the tropopause is 217 K. So even for CO2, the effective emission altitude is below the tropopause. All other ghg’s have effective emission altitudes below CO2 over most of the spectral range. Increasing the concentration of CO2 does not change the brightness temperature at the center of the band. It increases the width of the band. So saying that an increase in CO2 increases the effective emission altitude is only true if you integrate over the entire band. And if you do, the effective altitude is even further below the tropopause.
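
    A sketch of the brightness-temperature idea used with MODTRAN output above: invert the Planck function (wavenumber form) to get the temperature a blackbody would need to emit an observed radiance. The radiance value is illustrative, not from an actual MODTRAN run:

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

def brightness_temperature(wavenumber_cm, radiance_per_cm):
    """Radiance in W/(m^2 sr cm^-1); returns the equivalent blackbody temperature."""
    nu = wavenumber_cm * 100.0   # cm^-1 -> m^-1
    b = radiance_per_cm / 100.0  # radiance per cm^-1 -> per m^-1
    a = 2.0 * H * C**2 * nu**3
    return H * C * nu / (KB * math.log(a / b + 1.0))

# ~0.045 W/(m^2 sr cm^-1) near the CO2 band centre at 667 cm^-1 comes out around
# 220 K, in line with the band-centre brightness temperature quoted above.
print(brightness_temperature(667.0, 0.045))
```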

    The lapse rate in the troposphere is not entirely controlled by convection either. A lapse rate lower than the adiabatic lapse rate is stable to convection, but it’s not stable to radiation. So if one starts with an isothermal atmosphere at the same temperature as the surface, the upper atmosphere would cool by radiation to space until a steady state is achieved with a lapse rate somewhat lower than the adiabatic rate.

    The temperature inversion at the tropopause is caused by the presence of oxygen in the atmosphere. Oxygen absorbs in the UV and generates ozone, which absorbs even more in the UV. The temperature must then increase over that of a simple exponential atmosphere with a constant lapse rate, to increase energy emission until energy in and out balance.

  114. Somehow the blockquote tag didn’t get closed. It should be obvious where it was though. The preview button that you get with CA Assistant/Firefox is your friend if you just remember to use it.

  115. DeWitt, thanks for the comments, but to be clear, I was trying to describe the processes in a simplified manner, not make it as complex as I possibly could. 🙂

    We do know both experimentally and from detailed models that increasing CO2 tends to cause the tropopause to rise. If you want to suggest another mechanism besides increased convection from the added CO2, be my guest.

    Secondly we know that the average/environmental lapse rate is near 6.5°C/km. (Yes on a local scale you will find differences depending on weather.)

    Third, while I described the “top” as being at the tropopause, it’s really where radiative equilibrium is achieved, which will be below that height (that fact is almost a tautology based on the definition of the tropopause, which is denoted by the minimum temperature between the troposphere and the stratosphere, and the fact that T(z) is in general smoothly varying).

    I realize one can go into greater detail; the “top” isn’t a top so much as a layer, and it begins where the temperature curve starts to bend over (is no longer controlled by convection). I hadn’t thought about CO2 having a different effective temperature and hence equivalent height; the MODTRAN thing was interesting. And there is a semi-infinite amount of other physics and details one could throw in…

    But the main point of my simple description was to make it as simple as possible, but no simpler.

    Feel free to add your own explanation if you aren’t satisfied with mine, but otherwise I’m not sure what the point of the extra detail is. Is it just interesting, or do you think there is something fundamental that can’t be neglected, even in a qualitative explanation?

  116. While I’m tossing out references: Manabe & Wetherald (1967), “Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity”, is considered to be a classic.

    My blown link is here: Santer et al. 2003, “Behavior of tropopause height and atmospheric temperature in models, reanalyses, and observations: Decadal changes.”

  117. Carrick #127
    “As I read Nick Stokes (apologies if I misunderstood him), adding CO2 increases the resistance (1/k) so dT/dz increases. If you have a constant atmospheric layer, a larger dT/dz equals a higher surface temperature since we can consider the temperature at the “top” to be a constant.”

    I’m speaking a bit more broadly, and was careful to speak of temp differences rather than gradients. Almost all modes of heat transfer have an Ohm-like law relating flux to temp difference (over the pathway). With radiation, there isn’t a spatial gradient, and the flux is proportional to a T^4 difference.

    CO2 increases mainly the resistance of the radiative pathway (by absorbing radiation and emitting some back in a downward direction). When radiation is absorbed and re-emitted several times in a gas path, the resulting flux does approach proportionality to the temp gradient (Rosseland heat transfer).

    Convection doesn’t totally change this. Turbulent transfer is also fairly proportional to temperature gradient. The complication with adiabatic lapse rate is that its associated flux is no longer proportional to the temp difference – in fact, the neutral no-flux state has the 9.6 K/km gradient. But also CO2 does not significantly change the dry adiabatic lapse rate.

    All that said, RB is right. Ohm’s law is the analogy.

  118. Nick Stokes:

    With radiation, there isn’t a spatial gradient, and the flux is proportional to a T^4 difference.

    CO2 is a saturated GHG right? So isn’t it true that photons in the wavelengths associated with the CO2 absorption band have a very short mean free path?

    If I understand the problem correctly, if you neglect convection, you are essentially modeling ordinary heat transport through the atmosphere, and not radiative transfer.

  119. Re: Carrick (Jan 28 15:59),

    An increase in surface temperature by whatever means will increase the height of the tropopause. It’s not a signature specific to ghg warming. Warmer air at constant pressure equals larger volume. Since area is fixed, that means expansion must be vertical. That’s why the tropopause starts at 17 km in the tropics and 9 km for sub-arctic winter. MODTRAN, however, doesn’t seem to include thermal expansion. That’s probably why it somewhat underestimates forcing from doubling CO2.

  120. Carrick #124
    “CO2 is a saturated GHG right? So isn’t it true that photons in the wavelengths associated with the CO2 absorption band have a very short mean free path?”

    That was Angstrom’s claim against Arrhenius’ theory, resolved by Plass in about 1955. There’s a complete range. Some wavelengths are saturated, some not affected at all. The transmission affected by change in CO2 is that in the fringe wavelengths. This is indeed the range of Rosseland radiation, which is like enhanced conduction with a resistance inversely proportional to the mean free path. It’s actually the most important kind of heat transport for the GHE.

    I’m not sure what you are saying about convection. It’s a heat transport component – fairly minor according to Trenberth. But it isn’t much affected by GHG.

  121. There’s quite a bit above here that I need to absorb but to me, the most rudimentary electrical system that you can construct as an analog for the greenhouse effect is a capacitor (Earth) with a resistive discharge path (atmosphere) driven by a current source which is a pulse train (the Sun). If resistance is zero, you have large swings on the capacitor as it fully charges and discharges in response to the incoming charge quanta. As discharge path resistance increases, mean voltage on the capacitor rises to a point where average current in = average current out but min-max fluctuations are lower than in the zero resistance case.
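
    A sketch of that RC analogy with made-up component values – a square-wave current source charging a capacitor that leaks through a resistor:

```python
# Pulsed source driving an RC "climate": dV/dt = (I_in - V/R) / C.
def simulate(R, C=1.0, dt=0.01, cycles=200, period=1.0, i_peak=1.0):
    """Return the mean and peak-to-peak capacitor voltage once settled."""
    v, samples = 0.0, []
    for n in range(int(cycles * period / dt)):
        t = n * dt
        i_in = i_peak if (t % period) < period / 2 else 0.0  # day/night square wave
        v += (i_in - v / R) * dt / C
        if t > (cycles - 10) * period:  # keep only the settled tail
            samples.append(v)
    return sum(samples) / len(samples), max(samples) - min(samples)

for R in (0.5, 2.0, 8.0):
    print(R, simulate(R))
# Increasing R (more resistance in the discharge path) raises the mean "temperature";
# the absolute swing stays bounded, so fluctuations shrink relative to the mean.
```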

  122. Carrick wrote:

    Again, that’s your judgement call. It’s not what the science says.

    Dr. Spencer disagrees with your assessment. 😀

    Bruce

  123. Bruce, I was referring to what the climate models say. Spencer doesn’t model climate of course.

    But can you link to where Spencer says anthropogenic sulfate emissions aren’t important or couldn’t have balanced CO2 emissions prior to, say, 1975?

    My impression is that he thinks even when sulfates receded in importance relative to CO2 there were still other negative feedback terms that acted to stabilize climate against the increased CO2 emissions. I’ve never seen anything where he claimed you could neglect sulfates.

  124. Carrick: no one said anything specific about sulfates or specifically about particulates… I was simply pointing out that the idea that one could make blanket statements like, “Warming prior to 1980 was due to natural causes”, is not based on science. The definition of science is not that something is published, but that something is published and independently verifiable.

    There’s no way that anyone can know that warming up to period XYZ was due to “natural” variability and any warming after this period is solely due to human activity; at least they can’t reasonably state this with any observational evidence, because there is no way to compare pre and post. Why? There aren’t measurements of sulfate or aerosols or whatever the perceived “balancing” factors were. It’s not known what’s “natural” variability vs. human influence. You can’t separate out what warming was from which source… That is, unless you happen to have a magical thermometer that distinguishes between anthropogenic warming and “natural” warming. What’s relied on instead are models. These are models that by definition are not verifiable… The best that can be done is comparisons with what little is known… leading to issues of correlation-causation. And, we know that the modelers tune their models to achieve the desired or expected results. So, the question is, how do the models perform? Not terribly well.

    What does Spencer say?
    http://www.drroyspencer.com/2010/01/evidence-for-natural-climate-cycles-in-the-ipcc-climate-models-20th-century-temperature-reconstructions/

    Specifically:
    “If we use the 1900 to 1970 overlap to come up with a natural variability component, the following graph shows that the post-1970 warming is overstated by even more: 74%.”

    “Interpretation
    What I believe this demonstrates is that after known, natural modes of climate variability are taken into account, the primary period of supposed CO2-induced warming during the 20th Century – that from about 1970 onward – does not need as strong a CO2-warming effect as is programmed into the average IPCC climate model. This is because the natural variability seen BEFORE 1970 suggests that part of the warming AFTER 1970 is natural! Note that I have deduced this from the IPCC’s inherent admission that they can not explain all of the temperature variability seen during the 20th Century.”

    “The Logical Absurdity of Some Climate Sensitivity Arguments
    This demonstrates one of the absurdities (Dick Lindzen’s term, as I recall) in the way current climate change theory works: For a given observed temperature change, the smaller the forcing that caused it, the greater the inferred sensitivity of the climate system. This is why Jim Hansen believes in catastrophic global warming: since he thinks he knows for sure that a relatively tiny forcing caused the Ice Ages, then the greater forcing produced by our CO2 emissions will result in even more dramatic climate change!

    But taken to its logical conclusion, this relationship between the strength of the forcing, and the inferred sensitivity of the climate system, leads to the absurd notion that an infinitesimally small forcing causes nearly infinite climate sensitivity(!) As I have mentioned before, this is analogous to an ancient tribe of people thinking their moral shortcomings were responsible for lightning, storms, and other whims of nature.”

    “This absurdity is avoided if we simply admit that we do not know all of the natural forcings involved in climate change. And the greater the number of natural forcings involved, then the less we have to worry about human-caused global warming.

    The IPCC, though, never points out this inherent source of bias in its reports. But the IPCC can not admit to scientific uncertainty…that would reduce the chance of getting the energy policy changes they so desire.”

    Bruce

  125. I wish Dr. Spencer had been able to say something about what it implied about CO2 sensitivity. I’m not sure how to interpret his findings. But the question for me is: though the implied sensitivity to CO2 is less than what the ‘average’ model produces, does it fall outside the IPCC’s 95% range i.e. is it less than 1.5C per doubling of CO2? Annan and Hargreaves have a recent paper showing that the ensemble mean is not the truth. And this sensitivity is not programmed into the models – if I understand correctly, it is only the physical processes that are parameterized and sensitivity is what is implied by the climate models. There may still be no inconsistency with assuming a CO2-sulphate balance prior to 1980 but I agree that we need more accurate estimates of the sensitivity range and we need better modeling of the oceans.

  126. In the discussion above on the greenhouse effect, we assumed that lapse rate is a constant and therefore atmospheric height increases to effectively increase atmospheric resistance. Based on this:
    http://en.wikipedia.org/wiki/Lapse_rate

    would it be consistent to say that the lapse rate is a constant because atmospheric height will increase due to adiabatic expansion according to the 1st law of thermodynamics, with the mechanism of this expansion being a convection process? I.e., convection is necessary, but convection is an outcome of the adiabatic process?

  127. I wonder if this supports the case that convection is the outcome and not the cause of the physics behind a constant lapse rate:

    Steeper and/or positive lapse rates (environmental air cools quickly with height) suggests atmospheric convection is more likely, while weaker and/or negative environmental lapse rates suggest it is less likely.

    http://en.wikipedia.org/wiki/Atmospheric_convection

  128. Bruce #146:

    There aren’t measurements of sulfate or aerosols or whatever the perceived “balancing” factors were.

    There are indirect measurements. Economic activity is one. But again, you are confusing what you find believable with what the science (in this case the models) predicts.

    Now this quote from Spencer:

    What I believe this demonstrates is that after known, natural modes of climate variability are taken into account, the primary period of supposed CO2-induced warming during the 20th Century – that from about 1970 onward – does not need as strong a CO2-warming effect as is programmed into the average IPCC climate model. This is because the natural variability seen BEFORE 1970 suggests that part of the warming AFTER 1970 is natural! Note that I have deduced this from the IPCC’s inherent admission that they can not explain all of the temperature variability seen during the 20th Century.”

    Think about this, Bruce. You brought it up. I think you aren’t grasping its interpretation.

    Spencer is suggesting that he believes that the variations observed before 1970 are natural.

    You do understand that what I was saying was that anthropogenic CO2 and sulfate emissions (the latter tend to cool the Earth) balanced each other. /insertblinkingredarrow

    That is, the models suggest that variations observed before circa 1975 were natural.

  129. Re: RB (Jan 29 21:54),

    You can get a lot more detail on Physical Meteorology and the why and how of lapse rates from these on-line lecture notes. However, the fundamental assumption of adiabatic expansion, no heat transfer to the surroundings, starts to fail at altitude as the efficiency of radiative transfer to space increases. For a system where there is only convection, an isothermal atmosphere is stable. The more or less constant lapse rate in the troposphere is a function of the balance between convective and radiative heat transfer.

    For example, if you initialize a simple radiative/convective model with an isothermal atmosphere at or above the surface temperature, the lapse rate increases until you see something very like what we observe in the real atmosphere. The same sort of thing happens if you start with an isothermal atmosphere below the surface temperature. You still end up with the same lapse rate. If you don’t allow radiation, then as long as the lapse rate is below the adiabatic rate, the atmosphere is stable.

  130. Re: RB (Jan 29 21:54),
    I think it goes like this. If the lapse rate is less than 9.8 K/km (dry adiabat), then because of expansion any up/down motion of air moves heat downwards (against the gradient). This takes energy, which comes out of the KE behind the motion. This stabilises against convection, by taking that energy out.

    If the lapse rate exceeds 9.8 K/km, then the up/down motion gains energy from the temp grad, and moves heat upwards. That makes the air unstable to convective movement. Of course, by moving heat upward, the gradient is pushed back towards 9.8.

    So 9.8 is the stable dry adiabat, in the presence of moving air. Any deviation has a restoring mechanism. And since other heat transfer mechanisms move heat down the temp grad, the KE of the air is always being drained to maintain the lapse rate.

  131. Re: Nick Stokes (Jan 30 02:04),

    So 9.8 is the stable dry adiabat, in the presence of moving air. Any deviation has a restoring mechanism.

    Not true. An increase in the lapse rate above the adiabatic rate means that the buoyancy of a packet of air at lower altitude is higher than the air above it and it will rise to lower the lapse rate. OTOH, if the lapse rate is less than the adiabatic lapse rate, there is no driving force for convection. A packet of air from higher altitude that is brought to lower altitude would be warmer and have higher buoyancy than the air around it and it would rise back to its original altitude. That’s what makes a temperature inversion (warmer air above colder) stable. Note that the warmer air does not have to have a warmer absolute temperature than the air below it, only a higher potential temperature. The link in my post above goes into much more detail about potential, or isentropic, temperature.
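
    A sketch of that stability criterion, computing the dry adiabatic lapse rate (g/c_p) and potential temperature for two made-up soundings:

```python
# Dry adiabat and potential temperature theta = T * (P0/P)**(R_d/c_p).
G = 9.81       # m/s^2
CP = 1004.0    # J/(kg K), dry air specific heat
RD = 287.0     # J/(kg K), dry air gas constant
P0 = 100000.0  # Pa, reference pressure

print("dry adiabatic lapse rate:", G / CP * 1000.0, "K/km")  # ~9.8 K/km

def potential_temperature(T_k, p_pa):
    return T_k * (P0 / p_pa) ** (RD / CP)

def stable(lower, upper):
    """A layer is stable to dry convection if theta increases with height."""
    return potential_temperature(*upper) > potential_temperature(*lower)

# (T in K, p in Pa): a 6.5 K/km layer over ~1 km is stable;
# a 12 K/km super-adiabatic layer is not.
print(stable((288.0, 101325.0), (281.5, 89880.0)))  # True
print(stable((288.0, 101325.0), (276.0, 89880.0)))  # False
```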

  132. Carrick: I read Dr. Spencer’s analysis properly. Did you???

    What it says is that the models are incorrect. The models that you suggested were “the science”, and which say that all warming post 1980 is due to humans, don’t actually work. Just to refresh your memory, here’s what you wrote above (#80): “The science says man-made warming started roughly 1980. Of course that’s what’s believed, what else is one supposed to say?”

    If one is thinking honestly about the state of the science, one is supposed to say, “This is what the literature suggests, but there are valid reasons to question the findings.”

    My point was that there are too many unknowns in the models and there’s no way to verify them. Dr. Spencer’s analysis seems to agree with my interpretation.

    I brought up previous periods of time when the earth experienced warming consistent with what we are seeing currently that could not have been due to humans emitting CO2, which is terribly inconvenient for those that push CO2 as the boogeyman responsible for current warming. The explanation offered was that other factors influenced those previous periods of warming… particulates and other human emissions. However, those excuses are just that… rationalizations that cannot be empirically tested.

    Spencer’s analysis shows that, with the information available at present, there is no way to determine what warming is due to natural variability and what warming is due to humans. It states that part of the warming after 1970 is natural… which is in direct contrast to what’s been previously accepted by some in the climate community.

    So, yes, Spencer is stating that the models are wrong and that the evidence available shows substantial natural variability prior to 1970… AND natural variability after 1970.

    Since you seem to be somewhat selective in your interpretation of what I wrote earlier, here it is (#67).
    Carrick:

    What does the science say about warming prior to 1980 being natural and that after 1980 being “human influenced”? Actually, what the science says is, “I don’t know.”

    What the science says is that what we’ve experienced so far is not outside what has occurred in the past (see the rest of my post about earlier temperature increases in certain intervals that are similar to what we’ve experienced recently… which you’ve stated was prior to the period being due to human influence).

    What some in the pro-AGW group say is, “It’s all due to humans and it’s all BAAADDD!!!”. Just saying it’s so isn’t science.

    Timetochooseagain #73 has it correct.

    Still no magical thermometer that can tell which heat is from what source.

    Bruce

  133. DeWitt Payne ,
    “there is no driving force for convection”

    Not from thermal buoyancy. But there’s plenty of convection. In our world, the lapse rate is generally less than 9.8, but air goes up and down all the time.

    That’s my point above – restoring a lapse rate from below, or just maintaining it, takes energy from a flow field. That energy has to be put in somehow. On the Earth, there are all sorts of mechanisms that supply it. Diurnal effects, rotation, spatial variations, all help to stir up the air. Once the air is in motion, it pushes the lapse rate towards 9.8.

    There’s a reason why the lapse rate sits below 9.8. Moisture condensation is usually cited. But all of the mechanisms that convey heat down a temp gradient are at work. They have to be countered by the adiabatic heat pump. Since the power available for this is limited, the lapse rate generally sits below 9.8.

  134. #156 – I don’t get how Spencer shows that anthropogenic influences weren’t less before 1970 than after. Yes, he shows that some of the post-1970 warming is probably natural if we recognize that the earlier warming was largely natural. This doesn’t rule out that the human influence was small before 1970 and not as small after.

  135. I should clarify my understanding: whatever the effect of aerosols over the twentieth century, greenhouse forcing increased much more slowly before the second half of the twentieth century. In the absence of measurements we have to assume that we can estimate the time history of aerosols to some extent – this is woefully inadequate, but it’s all we’ve got. Now, whatever aerosols did to greenhouse forcing before 1970ish (Carrick thinks they canceled out, I think they probably still netted positive, just less), the forcing is just too small prior to 1970 to have caused the earlier warming.

    So significant anthropogenic warming influence began about 1970. This doesn’t mean that post-1970 warming was mostly AGW – it means it was more AGW than the earlier warming was.

  136. TTCA, what I’m claiming is the models show little or no net effect from anthropogenic activities.

    See this. According to this model (GISS Model-E), until the mid-1970s net anthropogenic forcings were nearly zero, and possibly even negative from 1850-1930.

    The point of my comment is, this is what the state of the art in the models say. It is a point to start from, not end at. The models may be wrong, but they represent the best objective tool we have to study this question. And if I wanted to represent what I thought was the best representation of what climate science says about anthropogenic activity, this is where I would go.

    Spencer may be right and there may be problems with the models. Everybody agrees there is important physics still missing in them, but even then, not all of the corrections are necessarily going to lead to a warmer AGW influence. That is certainly Spencer’s argument.

  137. Re: Nick Stokes (Jan 31 04:52),

    air goes up and down all the time.

    Yes, but in our world, there is radiative cooling of the atmosphere to space that works to keep a positive lapse rate when heat is transferred to the upper atmosphere by deep convection as in thunderstorms. Air goes down because somewhere else air went up and hydrostatic balance applies over large areas.

    Here’s a thought experiment for you. Assume a completely transparent atmosphere so there is no radiative heat transfer within the atmosphere. All incoming heat to the surface is radiated directly to space. If the atmosphere is isothermal at the same temperature as the surface (potential temperature increases with altitude), how do you propose that the upper atmosphere cools to restore an adiabatic lapse rate? If there is no driving force for air to go up, then how does air come down, when, if it did, it would be warmer and more buoyant than the air around it? I don’t think you can even get Hadley cell formation without radiative cooling in the upper atmosphere.

  138. Re: DeWitt Payne,
    In fact, I have a post about that thought experiment here. Any motion in the air causes the adiabatic heat pump to pump heat downwards. Motion can be produced by any inhomogeneity at the surface. An albedo difference, say, producing a local convection cell. There probably would be Hadley cells*. Day/night creates temperature gradients.

    It takes energy to run the adiabatic heat pump (less without GHG). But there’s plenty available.

    * Because of the stability it’s harder to get a cell started. But there’s energy available from the movement of heat from tropic to temperate to drive the cell.
