Estimating the impact of CO2 on global mean temperature

The discussion focuses on the quantification of CO2's impact on global mean temperature, emphasizing its significant role in climate change based on established physical principles. The author presents a detailed calculation of energy balance, CO2 forcing, and climate sensitivity, asserting that CO2's contribution to recent warming trends is approximately 0.022 K/year. The calculations are grounded in scientific literature, highlighting that while other factors influence climate, CO2 remains a straightforward and major contributor. The thread aims to clarify assumptions about CO2's impact while inviting respectful discourse on the topic.
  • #31
I came across a new paper that suggests a climate sensitivity for CO2 of somewhere close to one.

Douglass, D.H., J.R. Christy, 2009: Limits on CO2 climate forcing from recent temperature data of Earth. Energy & Environment, 20, 178-189 (Invited paper, reviewed by Editor). You can download the pdf for free from Dr. Christy's web page: http://www.nsstc.uah.edu/atmos/christy_pubs.html

The gist of the paper is that CO2 warming should be consistent across all climate zones, because CO2 is well mixed, and that if some zones experience more warming than others, then it is for other reasons.

Here is the abstract:

The global atmospheric temperature anomalies of Earth reached a maximum in 1998 which has not been exceeded during the subsequent 10 years. The global anomalies are calculated from the average of climate effects occurring in the tropical and the extratropical latitude bands. El Niño/La Niña effects in the tropical band are shown to explain the 1998 maximum while variations in the background of the global anomalies largely come from climate effects in the northern extratropics. These effects do not have the signature associated with CO2 climate forcing. However, the data show a small underlying positive trend that is consistent with CO2 climate forcing with no-feedback.
 
Last edited by a moderator:
  • #32
joelupchurch said:
I came across a new paper that suggests a climate sensitivity for CO2 of somewhere close to one.

Douglass, D.H., J.R. Christy, 2009: Limits on CO2 climate forcing from recent temperature data of Earth. Energy & Environment, 20, 178-189 (Invited paper, reviewed by Editor). You can download the pdf for free from Dr. Christy's web page: http://www.nsstc.uah.edu/atmos/christy_pubs.html

Let's stop right there. That's not a credible source. It's not peer reviewed, and it is not in a recognized science journal. Here's the relevant part of the physicsforums guidelines
One of the main goals of PF is to help students learn the current status of physics as practiced by the scientific community; accordingly, Physicsforums.com strives to maintain high standards of academic integrity. There are many open questions in physics, and we welcome discussion on those subjects provided the discussion remains intellectually sound. It is against our Posting Guidelines to discuss, in most of the PF forums or in blogs, new or non-mainstream theories or ideas that have not been published in professional peer-reviewed journals or are not part of current professional mainstream scientific discussion. …
This is applied especially stringently in the Earth science forum, mainly because there is so much low grade unscientific rubbish in the popular debate. The problem of course, is that "climate skeptics" generally perceive that the scientific community is "biased" against their work. But the only bias is the same that applies in all science: a bias against bad science and sloppy methodology. There's still ample scope for robust scientific debate on the many wide open research questions of climatology.

I'm not going to go into the merits of this article in any depth. Doing so would just give the incorrect impression that this is an actual scientific debate. It isn't. The authors are legitimate scientists in their own right, who do publish in real scientific literature; but on this topic they are both extreme isolated outliers within the scientific community, and they are both known for making error-filled arguments for insupportable nonsense, and public pronouncements that appear deliberately misleading. Their paper here shows both.

The journal "Energy and Environment" is notorious in this whole area. It has no impact rating, and almost no scientific visibility. It does not have the technical standards of a real science journal. It was, in fact, set up as an alternative publication venue for stuff on climate especially that would get tossed out of a real science journal.

Furthermore, there wasn't even an attempt at a fake peer review here. This paper is listed as an "invited paper" and "editor reviewed"; and the editor in this case is Sonja Boehmer-Christiansen, an English academic in political science who is a so-called "climate skeptic" with no technical expertise in the subject matter.

Most of the stuff in the quoted abstract is true enough, and pretty trivial. I have already described the guts of what is true in the abstract, back in [post=2171973]msg#28[/post]:
sylas said:
Recently, it seems to be mostly due to the shift from a strong El Nino in 1998 to a strong La Nina in 2008. This is a natural cycle, with strong short term impact over a decade or two, but no long term cumulative effect. In my own personal opinion, which has no authority, I think there might also be a smaller contribution from the extended solar minimum. I've been trying to pull out a signal for the solar cycle from the datasets. It seems to be there, but subtle.
These are additional non-cumulative natural variations that exist on top of any other long term trends. The ENSO cycle is not carbon related; although there is a potential for patterns of ENSO oscillation to alter as climate shifts in response to increasing global temperatures.

Why would Christy and Douglass bother to say ENSO is not carbon related? It could be just a comment that there are non-carbon related sources of temperature variation (duh!) but my cynical mind says that these guys are playing the usual misdirection card. I think they WANT to sow confusion, and are quite happy if a naïve reader picks up the impression that this must be some discovery or evidence against the larger greenhouse driven trends. It's no such thing, of course!

For a more sensible discussion of other factors bearing upon regional climate trends, see the thread [thread=306202]Only dirty coal can save the Earth[/thread], which addresses the causes of the exceptionally strong Arctic warming, and describes solid scientific research linking that Arctic warming to aerosols.

Where the abstract tips from trivial into junk is with the final sentence, which speaks of a "small underlying positive trend" consistent with CO2 and no feedback. They are effectively using a 1.1 K/2xCO2 no-feedback sensitivity value, where everyone else uses 3 +/- 1.5 K/2xCO2.

Effectively, they are saying that yes, CO2 is probably the major cause of long-term climate trends, but that the trend involved is very small. In other words -- their argument hinges on assuming that the Earth isn't actually warming as much as other measurements indicate. This is papered over with a lot of other material within the article, some of which won't stand up well to close examination, but ultimately it's the temperature data that is, IMHO, the major reason they get ridiculous sensitivity numbers.

Christy uses his UAH_LT data for estimating the prevailing rate of temperature increase, limited to the tropics. This gives warming rates substantially less than almost any other source of data. There's a lot more to say on this, and that actually is a real scientific debate now underway… which Christy is losing. The warming rates I have used in this thread continue to be the better guide; and all the serious empirical work on estimating sensitivity continues to try to improve the bounds, which we now know from many different lines of evidence to be somewhere in the range 1.5 to 4.5 K/2xCO2.

Cheers -- sylas
 
  • #33
It is not particularly true that the forcing from CO2 is globally uniform, even though CO2 concentrations are essentially well mixed. It is actually at a maximum in the tropics, owing to the temperature contrast between the surface and troposphere, and even in the tropics it is non-uniform because of its dependence on humidity and clouds. The total response (accounting for feedbacks) is especially non-uniform, which is why a fundamental characteristic of hothouse and coldhouse climates is the change in the pole-to-equator temperature gradient (small in a warm climate, much larger in a cold climate).

The sensitivity to a doubling of CO2 appears to be well within the range of 2 to 4.5 C (IPCC 2007), and paleoclimate evidence does not support a very low sensitivity, such as a neutral-feedback scenario. Many studies have examined this subject from observational, paleo, and modelling standpoints, and the results of a few decades of research really do point to the IPCC range, so that is probably what policy decisions should be based on. The high end is harder to constrain, owing to the fact that feedbacks behave like a converging power series (so you get asymptotic behavior as the gain factor approaches unity), but the data don't really support a very high sensitivity (>5 C or so) either. A decade of flatline or cooling does not contradict any of this (and does not put any constraints on sensitivity), and is well expected to occur owing to the natural variability of the climate system, as discussed in a recent paper:
http://www.agu.org/pubs/crossref/2009/2009GL037810.shtml
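The "converging power series" point above can be sketched in a few lines. This is illustrative only: the 1.1 K/2xCO2 no-feedback value is taken from the discussion earlier in the thread, and the gain values are arbitrary examples.

```python
# Sketch (not from the post above) of why the high end of climate
# sensitivity is hard to constrain: with no-feedback sensitivity lambda0
# and total feedback gain g, the response series
#   lambda0 * (1 + g + g^2 + ...)
# converges to lambda0 / (1 - g), which grows without bound as g -> 1.

LAMBDA0 = 1.1  # K per CO2 doubling; approximate no-feedback value from the thread

def equilibrium_sensitivity(gain):
    """Closed form of the geometric feedback series: lambda0 / (1 - g)."""
    if not 0.0 <= gain < 1.0:
        raise ValueError("gain must be in [0, 1) for the series to converge")
    return LAMBDA0 / (1.0 - gain)

for g in (0.0, 0.5, 0.6, 0.7, 0.8):
    print(f"gain {g:.1f} -> sensitivity {equilibrium_sensitivity(g):.2f} K/2xCO2")
```

Small changes in gain near the top of that range move the sensitivity a lot, which is exactly why the upper bound is the hard one to pin down.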
 
  • #34
sylas said:
Let's stop right there. That's not a credible source. It's not peer reviewed, and it is not in a recognized science journal. Here's the relevant part of the physicsforums guidelines

This is applied especially stringently in the Earth science forum,

Perhaps a disclaimer should have been made that this paper is not peer reviewed, but it seems to me that the last clause in the guidelines:
... or are not part of current professional mainstream scientific discussion.…
would at least allow discussion on relevant writings from a scientist like Christy who has published so much peer reviewed material elsewhere in the field.
 
  • #35
mheslep said:
Perhaps a disclaimer should have been made that this paper is not peer reviewed, but it seems to me that the last clause in the guidelines:
... or are not part of current professional mainstream scientific discussion.…
would at least allow discussion on relevant writings from a scientist like Christy who has published so much peer reviewed material elsewhere in the field.

John Christy is an active working climatologist, who has views that are strongly at variance with the great majority of his scientific peers. He publishes regularly in the real scientific literature. His ideas in the professional literature are actively engaged by other scientists, on their own merits.

You can introduce his ideas just fine with properly published work, and there are a number of advantages to doing it this way. Please make sure it is actually relevant to the specific topic of this thread, or if you want to explore some other issue, then consider a new thread for it.

I'm working on a longer response, which does look at Christy's published scientific work, and shows how it might and might not relate to this thread, and to the E&E paper.

Cheers -- sylas

PS. Caution: a lot of what Christy wrote prior to 2005 was flatly wrong, due to a basic algebraic error in his group's analysis, involving an incorrectly reversed minus sign, of all things. Everyone involved acknowledges this, and the scientific debate has moved on. This old error is now water under the bridge, but it certainly means that the older papers are well and truly out of date. Christy continues to argue for appropriately revised notions in the literature; and IMO he's losing that debate. But there's real engagement and scope to look into it, either here, or in another thread if that is more appropriate.
 
  • #36
sylas said:
John Christy is an active working climatologist, who has views that are strongly at variance with the great majority of his scientific peers. He publishes regularly in the real scientific literature. His ideas in the professional literature are actively engaged by other scientists, on their own merits.

You can introduce his ideas just fine with properly published work, and there are a number of advantages to doing it this way. Please make sure it is actually relevant to the specific topic of this thread, or if you want to explore some other issue, then consider a new thread for it.
Fair enough.

...PS. Caution: a lot of what Christy wrote prior to 2005 was flatly wrong, due to a basic algebraic error in his group's analysis, involving an incorrectly reversed minus sign, of all things. Everyone involved acknowledges this, and the scientific debate has moved on. This old error is now water under the bridge, but it certainly means that the older papers are well and truly out of date. Christy continues to argue for appropriately revised notions in the literature; and IMO he's losing that debate. But there's real engagement and scope to look into it, either here, or in another thread if that is more appropriate.
A similar caution is due then for Hansen, referenced in post #1 of this thread, not for a math error but the fundamental predictive failure of http://pubs.giss.nasa.gov/docs/1988/1988_Hansen_etal.pdf.
 
  • #37
mheslep said:
A similar caution is due then for Hansen, referenced in post #1 of this thread, not for a math error but the fundamental predictive failure of http://www.pnas.org/content/103/39/14288.full.pdf+html.

Sorry, but I can't let this pass. That's false. It's also completely out of left field with no relevance to the discussion and seems to be added as some kind of attempt to indicate that the other side is just as bad... but there's no comparison here at all, so it's worth setting the matter straight.

There's no predictive failure at all by Dr Hansen in 1988; just the reverse. The projections made in 1988 were amazingly good. I don't know where you are getting your information, but you've got it completely backwards. This is a common talking point, so I'm not saying the claim originated with you: but it is most definitely moonshine.

On the other hand, Christy's group made a mathematical mistake with a huge impact, which confused the whole debate on atmospheric temperatures for nearly ten years. When it was finally identified, everyone agreed and the error was fixed. Immediately. And most importantly of all -- this means there is a sharp discontinuity in the literature, and that papers about Christy's troposphere measurements prior to 2005 are out of date.

The alleged predictive failure of 1988

You've cited a paper from 2006. [Addendum: references added below.] The claim of a predictive failure by Hansen is a common talking point OUTSIDE the scientific literature, one which amounts to outright distortion and rewriting of history. Your cited 2006 paper tells the story correctly. It's not relevant to the thread here, but the claim of predictive failure is such a crock of horse manure that a refutation is in order.

This all refers to testimony given by Dr Hansen to Congress in 1988. In that testimony, he describes three possible future scenarios: A, B and C. The talking point by people claiming a predictive failure simply focuses on scenario A, which involves the largest increases in GHG emissions and hence the strongest warming. In the testimony to Congress, however, Hansen explicitly identified scenario B as the most likely course for future emissions, and this involves milder warming. As it turns out (and as the 2006 paper shows), scenario B did in fact turn out to be the closest to subsequent history! See figure 2 of your cited paper and the discussion immediately above it.

It's also important to understand the difference between a scenario and a prediction! A scenario is a possible set of future human impacts. The most important aspect of a scenario is that it defines a level of anthropogenic emissions. As such, therefore, a scenario is not a prediction so much as a target for politicians or policy makers to try and achieve -- or avoid. Politicians can't regulate climate directly. They can only influence the human impact. The role of science in politics is to help inform the likely empirical consequences of different decisions.

It is to some extent just a lucky coincidence that in 1988 Hansen correctly singled out the most likely scenario, and that the associated projection was so exceptionally close to reality. (This is pointed out explicitly in the 2006 paper; see the tail end of page 14289.) But there's certainly no failure! Just the reverse -- the prediction was amazingly good.

The NASA climate science research group continues to be leading the way in helping estimate the likely consequences of different scenarios on future climate.

Scientific skepticism in general

Apart from that specific point on 1988 that you've mixed up rather badly, I do agree that there's a more general caution to bear in mind, for all scientific work.

There is, as always, a general continuous incremental improvement in the level of science as time goes by. We make progress. Hence a paper from 1988 is likely to be less reliable than one from 2008. Really drastic mistakes like Christy's unfortunate minus sign are the exception rather than the rule; but even so there is still in general a tendency towards better knowledge with additional work. The 1988 projections by Hansen's research group, for example, used a sensitivity value that is somewhat at the high end of what we would use now… but still well within the currently accepted bounds.

Climate sensitivity is somewhere between 2 and 4.5 degrees per 2xCO2. In 1988 Hansen's group was using models where the value was about 4.2. These days they use models where the value is about 3, +/- 1. This is also described in the 2006 paper you've cited, at the end of page 14289.

Bottom line: three points.

(1) All scientific work, no matter when it is published, is always open to question and to falsification. Hence you never take any scientific paper just for granted. This is not a specific caution, but a general rule that applies across the board.

(2) On top of this, sometimes things actually get falsified, and the debate moves on, leaving the falsified ideas behind. That's what happened with Christy's pre-2005 claims: there's a sharp discontinuity in the literature at 2005, and older papers by Christy may include claims that he now also recognizes as erroneous. A special caution is therefore needed for this specific case.

(3) Finally, especially if you look beyond the scientific literature, you can find lots of material that is incompetent, or dishonest, or pseudoscientific claptrap. It's not always easy to identify; but it happens a lot in climatology. As a convenient short cut to avoid wasting time on low grade work of no relevance to legitimate science, it is suggested we stick to peer-reviewed sources in the Earth science forum. This is not a perfect solution, but it does help a lot, and it still allows lots of scope for looking at all sides of genuinely open questions. This restriction will allow in some material which is still poor quality, because peer review is not perfect, and it will disallow some material which is good quality. Caveat emptor.

Cheers -- sylas

Postscript: References.

Something odd happened in our posts, so I'll just clarify. When I started this reply, mheslep had provided a link to a 2006 PNAS paper, and this appears in my original quote. The reference is:
  • Hansen, J. et al. (2006), http://www.pnas.org/content/103/39/14288.full.pdf+html, PNAS, September 26, 2006, Vol. 103, No. 39, pp. 14288-14293.
Since I posted, he's made the link refer to the older paper. The reference is
  • Hansen, J. et al. (1988), http://pubs.giss.nasa.gov/docs/1988/1988_Hansen_etal.pdf, J. Geophys. Res., Vol. 93, No. D8, pp. 9341-9364, August 20, 1988.

Both papers are relevant and useful for sorting this out if anyone cares. The 1988 paper shows very clearly the three scenarios. Page 9345 explicitly notes that "Scenario B is perhaps the most plausible of the three cases". Figure 3 of the 1988 paper is the one that is repeated in the 2006 paper as figure 2, but with actual results to 2006 included. Scenario B did turn out to give the best match.

There's no "fundamental failure" here any way you cut it; "outstanding success" is a much more accurate description. Modeling has come a long way since 1988, of course, and it continues to build on and extend the foundational work from 1988. What Hansen's group was doing in 1988 remains a classic example of first-rate science that has continued to be the foundation for progress made since then.
 
  • #38
Sylas -
In the 'PS Caution' post above you've taken the topic into researcher reliability based on 'much of' what a researcher wrote, quite beyond the specific CO2 sensitivity topic of the thread. Fine, it is your thread, but by that measure your Hansen reference in the OP deserves the same scrutiny.

[Sorry for the 2006/1988 URL blunder - I corrected it seconds after posting as can be seen now, unfortunately that doesn't help w/ the forever erroneous email notification that goes out immediately.]

Regarding Hansen et al. '88, it most certainly was woefully wrong, and Hansen has made some explanations about its flaws (below). Hansen's testimony to Congress has no scientific relevance. Hansen '88's estimation of world CO2 emissions, based on this-or-that assumption about economic growth in China or this-or-that assumption about energy efficiency, is also of no scientific relevance to the '88 paper, nor is that of any other climate scientist in particular. The authors use CO2 emissions assumptions to choose, reasonably at the time I am sure, some range of inputs to their model, but it is of course the model itself and its scientific basis, given emissions, which is of interest. Man-made emissions assumptions are beside the point.

Hansen et al. '88 makes particular predictions of temperature from models given some CO2 emission level, an independent variable. From H '88 we have:
Scenario A assumes an exponential increase in emissions:
H et al 88 said:
...the assumed annual growth averages about 1.5% of current emissions...

Scenario B assumes, for comparison:
H et al 88 said:
decreasing trace gas growth rates,...
(and an eruption or two, which occurred)
As we know now, CO2 emissions increases were indeed exponential (http://www.pbl.nl/en/dossiers/Climatechange/TrendGHGemissions1990-2004.html) in the period '90 through '04, with a 5% increase in '04 alone. It is beyond dispute that what actually occurred most closely matched Scenario A, plus Pinatubo, and that the Scenario B emissions did not even remotely occur. What H '88 assumed was most likely at the time with regard to man-made emissions is a red herring IMO, having little to do with climate science.

Now, as it happens, the CO2 atmospheric concentration growth rate was rather flat over the period. That is where H '88 visibly fails. Atmospheric uptake is part of the science, part of the prediction. It turns out that the rate of uptake by CO2 sinks, most likely forests and soils, increased (http://www.pnas.org/cgi/reprint/95/8/4113?maxtoshow=&HITS=10&hits=10&RESULTFORMAT=1&author1=hansen&andorexacttitle=and&andorexacttitleabs=and&andorexactfulltext=and&searchid=1&FIRSTINDEX=0&sortspec=relevance&resourcetype=HWCIT):
Hansen '98 said:
...But the growth rate has been flat in the past 20 years, despite moderate continued growth of fossil fuel use and a widespread perception, albeit unquantified, that the rate of deforestation has also increased. Apparently the rate of uptake by CO2 sinks, either the ocean, or, more likely, forests and soils, has increased.

Now, otherwise the science used in the H '88 modelling may have been excellent; it is surely beyond my layman's level of study to judge. Nonetheless, the model's prediction of Scenario A temperatures, given man-made CO2 emissions, was substantially wrong. It is there in figure 2, along with the definitions for each scenario, for all to see.
 
  • #39
Scenario B was closest to reality, and Hansen's projections (accounting for model-observation uncertainties) were accurate. Saying otherwise is just wrong. This is also a very simple model compared to the more powerful AOGCMs used for projections today.
 
  • #40
chriscolose said:
Scenario B was closest to reality, ...
Please explain.

From H '88, the complete passage.
Scenario A assumes that growth rates of trace gas emissions typical of the 1970s and 1980s will continue indefinitely; the assumed annual growth averages about 1.5% of current emissions, so the net greenhouse forcing increases exponentially. Scenario B has decreasing trace gas growth rates, such that the annual increase of the greenhouse climate forcing remains ...

Edit: and here, more precisely stated, is the reality:
Raupach et al. 2007, PNAS
PNAS said:
CO2 emissions from fossil-fuel burning and industrial processes have been accelerating at a global scale, with their growth rate increasing from 1.1% y−1 for 1990–1999 to >3% y−1 for 2000–2004. The emissions growth rate since 2000 was greater than for the most fossil-fuel intensive of the Intergovernmental Panel on Climate Change emissions scenarios developed in the late 1990s
http://www.pnas.org/content/104/24/10288.abstract
 
  • #41
sylas said:
...Both papers are relevant and useful for sorting this out if anyone cares. The 1988 paper shows very clearly the three scenarios. Page 9345 explicitly notes that "Scenario B is perhaps the most plausible of the three cases".
Yes, I am aware (of the "most plausible" text above Figure 2 in the 2006 paper, and in the original '88). Yes, H '88 considered the emissions scenario B the 'most plausible'. No, B emissions (decreasing growth) did not happen. 'A' emissions most nearly happened; its temperature predictions did not.
 
  • #42
This discussion is now derailing Sylas' otherwise fine thread on CO2 sensitivity, starting with ~ my post #36, so perhaps it and everything after should be moved to another thread. Mentor help?
 
  • #43
I don't mind a bit of side track, but this could indeed be a very interesting topic in its own right. The stuff you guys have said is welcome to stay, as far as I am concerned; but a new thread would be good to take it all further. It's okay to just refer back to here; or move a few posts if you prefer.

I'd like to clarify one point on scrutiny and errors, if I may.

mheslep said:
Sylas -
In the 'PS Caution' post above you've taken the topic into researcher reliability based on 'much of' what a researcher wrote, quite beyond the specific CO2 sensitivity topic of the thread. Fine, it is your thread, but by that measure your Hansen reference in the OP deserves the same scrutiny.

I am NOT suggesting special scrutiny, but pointing out a known pitfall; one known by everyone involved, including Christy himself. If someone wants to argue positions advanced by Christy, then they are going to have to tread unusually carefully, because it is all too easy in this case to end up arguing for propositions that Christy himself would no longer endorse. That's all.

Someone else wanted to discuss Christy. Fine; I suggested that they use his scientific work, but I gave a caution relevant to anyone who wants to know what Christy would say for himself. It's not about special scrutiny, but about reliable identification of what Christy would actually want to say on his own behalf.

It's true that Christy's credibility has been damaged, frankly; and that I think he's losing the scientific debates at present. That's not the fundamental point. Any peer reviewed scientific work is welcome, and will be considered on its own intrinsic merits; not dismissed just because of unrelated errors in earlier work. Just take care that you actually represent the guy as he himself would want to be represented. OK?

The claim that Hansen's work of 1988 is out of date in the same sense is wrong. It's diametrically the opposite! There are lots of developments over the last 20 years, with steady progress and refinement... which is the more usual case in science. For state of the art in modeling, you should indeed use more recent papers -- and that is what I will do. The relevance of work back in 1988 is mainly historical. I try to keep up with latest work by everyone involved. But if anyone is actually interested in how we got to the present state of play, the 1988 paper is a classic, and stands out much more for prescient insights and foundations than for cases where there's been a need for improvement. You're comparing apples and maggots.

I'm happy to consider any relevant scientific work, from any source, on its own merits. I'd like to stick to the topic of estimating CO2 impact here please; but apart from that, the direction of the thread is very much up to anyone good enough to join in. I am thinking of saying more about Christy's work, where it is relevant on CO2 impacts.

[Sorry for the 2006/1988 URL blunder - I corrected it seconds after posting as can be seen now, unfortunately that doesn't help w/ the forever erroneous email notification that goes out immediately.]

No problem. Both references are useful and directly relevant.

I'm not going to comment on the rest here, but I'm actually quite stunned. I might say more if you'd like to start a new thread for it.

Cheers -- sylas
 
  • #44
Hi, I have a few questions.
In the OP it mentions a climate sensitivity of 3 ± 1.5 K for CO2 doubling.
You also mention (from OP):
A standard reference for the calculation is in Myhre et al, (1998). The forcing for any doubling of CO2 is about 3.71 W/m2, or 5.35 per natural log. This is known to high precision for well defined conditions, and to about 10% accuracy in general for the Earth. That is, doubling CO2 in the atmosphere results in 3.7 W/m2 less IR emission escaping into space… until the surface heats up sufficiently to restore the balance.
You cite a 1998 reference for the CO2 forcing. More recent calculations have suggested lower values, which correspond to a lower climate sensitivity for CO2 doubling.
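As a quick sanity check (not from the original post), the Myhre et al. relation quoted above can be evaluated directly; the conversion to a temperature response at the end assumes an illustrative 3 K per doubling.

```python
import math

# Direct check of the forcing relation quoted above (Myhre et al., 1998):
#   delta_F = 5.35 * ln(C / C0)  [W/m^2]
# A doubling (C/C0 = 2) should give roughly 3.7 W/m^2.

def co2_forcing(c_ppm, c0_ppm):
    """Radiative forcing in W/m^2 for CO2 rising from c0_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f_2x = co2_forcing(560.0, 280.0)
print(f"Forcing for doubled CO2: {f_2x:.2f} W/m^2")  # ~3.71

# Illustrative (assumed) conversion: with a sensitivity of 3 K per doubling,
# the implied parameter is about 3 / 3.71, i.e. roughly 0.8 K per W/m^2.
print(f"Implied sensitivity parameter: {3.0 / f_2x:.2f} K/(W/m^2)")
```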

E.g.:
http://www.ecd.bnl.gov/steve/pubs/HeatCapacity.pdf
Abstract:
The equilibrium sensitivity of Earth's climate is determined as the quotient of the relaxation time constant of the system and the pertinent global heat capacity. The heat capacity of the global ocean, obtained from regression of ocean heat content vs. global mean surface temperature, GMST, is 14 ± 6 W yr m-2 K-1, equivalent to 110 m of ocean water; other sinks raise the effective planetary heat capacity to 17 ± 7 W yr m-2 K-1 (all uncertainties are 1-sigma estimates). The time constant pertinent to changes in GMST is determined from autocorrelation of that quantity over 1880-2004 to be 5 ± 1 yr. The resultant equilibrium climate sensitivity, 0.30 ± 0.14 K/(W m-2), corresponds to an equilibrium temperature increase for doubled CO2 of 1.1 ± 0.5 K. The short time constant implies that GMST is in near equilibrium with applied forcings and hence that net climate forcing over the twentieth century can be obtained from the observed temperature increase over this period, 0.57 ± 0.08 K, as 1.9 ± 0.9 W m-2. For this forcing considered as the sum of radiative forcing by incremental greenhouse gases, 2.2 ± 0.3 W m-2, and other forcings, other forcing agents, mainly incremental tropospheric aerosols, are inferred to have exerted only a slight forcing over the twentieth century of -0.3 ± 1.0 W m-2.
The author argues for an increase of about 1.1 K (at most 1.6 K) if CO2 doubles.
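The abstract's headline numbers follow from a one-line quotient; here is a sketch of the arithmetic (the 3.7 W/m2 per doubling conversion is the Myhre et al. value quoted earlier in the thread, not a number from Schwartz's paper):

```python
tau = 5.0        # climate relaxation time constant, years
heat_cap = 17.0  # effective planetary heat capacity, W yr m^-2 K^-1
f2x = 3.7        # forcing per CO2 doubling, W m^-2

sensitivity = tau / heat_cap    # equilibrium sensitivity, K per (W m^-2)
delta_t_2x = sensitivity * f2x  # equilibrium warming per doubling, K

print(round(sensitivity, 2))    # ~0.29, vs the quoted 0.30 (input rounding)
print(round(delta_t_2x, 1))     # ~1.1 K
```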

What is the estimated doubling time?
http://www.citeulike.org/user/mouton/article/3826083
Carbon dioxide is increasing in the atmosphere and is of considerable concern in global climate change because of its greenhouse gas warming potential. The rate of increase has accelerated since measurements began at Mauna Loa Observatory in 1958 where carbon dioxide increased from less than 1 part per million per year (ppm/yr) prior to 1970 to more than 2 ppm/yr in recent years (Keeling et al., 1995). This accelerating growth rate, which the London Guardian (2007) headlined a “Surge in carbon levels raises fear of runaway warming”, suggested that the terrestrial biosphere and oceans ability to take up carbon dioxide may be lessening as predicted from models and data (Fung et al., 2005; Le Quéré et al., 2007). Here we show that the anthropogenic component (atmospheric value reduced by the pre-industrial value of 280 ppm) of atmospheric carbon dioxide has been increasing exponentially with a doubling time of about 35 years since the beginning of the industrial revolution (~1800). Even during the 1970's, when fossil fuel emissions dropped sharply in response to the “oil crisis” of 1973, the anthropogenic atmospheric carbon dioxide level continued increasing exponentially at Mauna Loa Observatory. Since the growth rate (time derivative) of an exponential has the same characteristic lifetime as the function itself, the carbon dioxide growth rate is also doubling at the same rate. This explains the observation that the linear growth rate of carbon dioxide has more than doubled in the past 40 years. The accelerating linear growth rate is simply the outcome of exponential growth in carbon dioxide with a nearly constant doubling time of about 35 years (about 2 %/yr) and appears to have tracked human population since the pre-industrial era.
So, a doubling time of 35 years(?), taking into consideration that the terrestrial biosphere's and oceans' ability to take up carbon dioxide may be lessening.
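The abstract's exponential model can be sketched in a few lines; the 385 ppm at 2008 anchor is a rough illustrative choice, not a number from the paper:

```python
# Sketch of the paper's empirical model: the anthropogenic component
# (total minus a 280 ppm pre-industrial baseline) doubles every ~35 yr.
PREINDUSTRIAL = 280.0
T_DOUBLE = 35.0  # doubling time of the anthropogenic component, years

def co2_ppm(year, anchor_year=2008, anchor_ppm=385.0):
    anthro = (anchor_ppm - PREINDUSTRIAL) * 2 ** ((year - anchor_year) / T_DOUBLE)
    return PREINDUSTRIAL + anthro

for y in (2008, 2043, 2078):
    print(y, round(co2_ppm(y)))
```

Note that under this model it is the anthropogenic component that doubles every 35 years; the total concentration takes rather longer to double.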

I thought this article was interesting:
http://media-newswire.com/release_1083196.html
(Media-Newswire.com) - One of the “pumps” contributing to the ocean’s global circulation suddenly switched on again last winter for the first time this decade, scientists reported Tuesday ( Dec. 23 ) in Nature Geoscience. The finding surprised scientists, who had been wondering if global warming was inhibiting the pump—which, in turn, would cause other far-reaching climate changes.

The “pump” in question is the sinking of cold, dense water in the North Atlantic Ocean in the winter. It drives water down into the lower limb of what is often described as the Great Ocean Conveyor. To replace that down-flowing water, warm surface waters from the tropics are pulled northward along the Conveyor’s upper limb.

The phenomenon helps draw down the man-made buildup of carbon dioxide from air to surface waters and eventually into the depths, where the greenhouse gas can be stored for centuries and offset global warming. It also transports warm tropical waters northward, where the ocean transfers heat to the air and keeps winter climate in the North Atlantic region much warmer than it would be otherwise.
Should this be taken into account when modelling CO2 doubling?
If CO2 doubles, can we expect a 3 degree increase, or at most 1.6?
Is a 35 year period of CO2 doubling a certainty?

Also, which graph shows global temperatures most accurately (in your opinion)? Which data set is it based on? Could you please provide such a graph if you don't mind. Thanks!
 
  • #45
FTP said:
Hi, I have a few questions.
In the OP it mentions a climate sensitivity of 3 +-0.15 degrees Kelvin for CO2 doubling.

Actually, 3 +/- 1.5. That's a generous estimate... these days a range of 2 to 4.5 is pretty solid.

You cite a 1998 reference for the CO2 forcing. Have more recent calculations suggested lower values for the forcing, which would correspond to a lower climate sensitivity for CO2 doubling?

This mixes up two quite different things; the forcing, and the sensitivity. They are different. In my [thread=307685]original post[/thread], the forcing was addressed in points (1) to (4), and the sensitivity in (5) and (6).

On the one hand, there is the forcing, which is very well known. The 1998 reference is something of a classic, and the value has not changed since. There have been subsequent calculations, and they give the same result, within the estimation errors. This forcing is 3.7 W/m2 per doubling of CO2, to about 10% or better.

Note the units! There's nothing there about temperature yet. This is strictly the effect of increased thermal absorption of carbon dioxide, on the transmission of radiant energy. It is founded on well understood physics, and it is not in any credible doubt.

The second thing is sensitivity. This is the response of the planet to a forcing, and the response is pretty similar for different sources of forcing. The 1998 paper has nothing to do with this.

Sensitivity is a temperature change per unit forcing. It can either be given using the doubled CO2 as a basic reference, or it can be given in terms of W/m2. In either case, it isn't actually about CO2 at all, but about the response to any forcing. There are some quite sensible technical reasons for using doubled CO2 as a unit of forcing. The unit W/m2 is a bit more ambiguous, because it depends on precisely how energy balance is identified. But the doubling of CO2 is pretty unambiguous, and hence it makes a good standard reference point.

The sensitivity of the Earth is known to limited accuracy. It is constrained by quite a number of empirical methods, and this is where you get the 2 to 4.5 degrees per doubling of CO2.

You can also represent this as 0.5 to 1.2 degrees per W/m2.
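The conversion between the two units is just division by the forcing per doubling (a sketch, using the ~3.7 W/m2 figure from earlier in the thread):

```python
F2X = 3.7  # W m^-2 per CO2 doubling

def per_doubling_to_per_wm2(dt2x_kelvin):
    """Convert a sensitivity in K per doubling to K per (W m^-2)."""
    return dt2x_kelvin / F2X

low = per_doubling_to_per_wm2(2.0)
high = per_doubling_to_per_wm2(4.5)
print(round(low, 2), round(high, 2))  # the quoted 0.5 to 1.2, to one decimal
```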

http://www.ecd.bnl.gov/steve/pubs/HeatCapacity.pdf
Abstract:

The author argues for an increase of about 1.1 K (at most 1.6 K) if CO2 doubles.

That has already been discussed in the thread. See [post=2171973]msg #28[/post]. The paper, by Steve Schwartz, is a legitimate peer-reviewed reference, but the method used has serious flaws that were pointed out almost at once in the subsequent issue of the journal. Schwartz is an extreme and isolated outlier in this whole area, and it didn't take long at all to see where he went wrong. His estimate is flatly in contradiction to just about every other empirical estimate, and the reasons why his value is too low are problems with his methodology. It's something of an oddity in the literature, and given the plainly identified problems, the value obtained is not really meaningful.

What is the estimated doubling time?
http://www.citeulike.org/user/mouton/article/3826083

I haven't taken up that topic, because the time it takes to double the concentration of CO2 is so strongly dependent on human emissions. It's not simply a number to be measured, but a choice or target for the future. The exponential rate of increase is not sustainable anyway (IMO) since fossil fuels are a finite resource, but even more importantly, the choices we make as we move out of an oil-based energy dependence will govern how the rate of increase proceeds in the future. I guess a social scientist might just treat human behaviour as another natural phenomenon to be modeled and projected into the future. This is effectively what the paper you cite here is doing; and short term, absent some deliberate collective human choices, it's probably about right. I think it will run into significant changes of the underlying assumptions as we run into peak oil, and I hope there will also be deliberate choices to rein in the anthropogenic impact, from individuals and organizations and governments. I guess we'll see soon enough.

My own concern here is simply to explain the physics of how one particular important gas impacts global temperatures.

I thought this article was interesting:
http://media-newswire.com/release_1083196.html

Should this be taken into account when modelling CO2 doubling?

Not really, no; but it should certainly be considered in climate modeling generally.

There's no doubt at all that there are many different factors impacting climate, and I've tried to keep that clear in this thread. The "Circulation Pump" research you mention here is very interesting indeed. However, it is really about a change in short term variation and in the time response to other forcings. It is not strictly a forcing in its own right. It is another important aspect of climate studies, and would be very relevant indeed to another ongoing thread: [thread=311982]Ocean Heat Storage[/thread].

If CO2 doubles, can we expect a 3 degree increase, or at most 1.6?

1.6 is pretty much an absolute minimum. It's not likely to be more than 4, IMO, though it is common to give 4.5 as the upper bound, which I have done pretty consistently in this thread, based on the scientific literature on the subject.

Also quite uncertain is how long it will take… and this is the major relevance of factors like the North Atlantic Circulation you mentioned above! Even if CO2 levels are doubled in 35 years, and then frozen at that point, it would take at least several decades and possibly a century or two before the full impact was realized as a temperature change. The reason for this is the long time lag in heating up an ocean until energy balance is restored.

The flip side to that, of course, is that if we froze CO2 levels dead, tomorrow, by some impossible stroke of luck, there would still be a significant amount of warming to occur, reflecting the current energy imbalance. That could be anything from 0.25 to 0.75 W/m2, and hence anything from 0.2 to 0.9 degrees of warming still in the pipeline. Note that doubling from NOW (which is what the 35 year figure is proposing) would be adding an additional 2 to 4 degrees on top of that again, as a rise by the end of the century or so.
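The "still in the pipeline" arithmetic is simply the current imbalance times the sensitivity; pairing the extremes of the two quoted ranges gives a slightly wider spread than the 0.2 to 0.9 K stated in the post:

```python
# Committed warming = present energy imbalance x equilibrium sensitivity.
# Ranges as quoted in the post; pairing the extremes brackets the answer.
imbalance = (0.25, 0.75)    # W m^-2
sensitivity = (0.5, 1.2)    # K per (W m^-2)

committed_low = imbalance[0] * sensitivity[0]
committed_high = imbalance[1] * sensitivity[1]
print(committed_low, round(committed_high, 3))  # 0.125 to 0.9 K
```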

Is a 35 year period of CO2 doubling a certainty?

No; but it is plausible, if we do nothing about it and continue increasing emissions at an accelerating rate into the future.

Cheers -- sylas
 
  • #46
I read the comments to the heat capacity paper. The author replied to the comments: http://www.ecd.bnl.gov/steve/pubs/HeatCapCommentResponse.pdf
Abstract
Reanalysis of the autocorrelation of global mean surface temperature prompted by the several comments, taking into account a subannual autocorrelation of about 0.4 year and bias in the autocorrelation resulting from the short duration of the time series has resulted in an upward revision of the climate system time constant determined in Schwartz [2007] by roughly 70%, to 8.5 ± 2.5 years (all uncertainties are 1-sigma estimates). This results in a like upward revision of the climate sensitivity determined in that paper, to 0.51 ± 0.26 K/(W m-2), corresponding to an equilibrium temperature increase for doubled CO2 of 1.9 ± 1.0 K, somewhat lower than the central estimate of the sensitivity given in the 2007 assessment report of the Intergovernmental Panel on Climate Change, but consistent within the uncertainties of both estimates. The conclusion that global mean surface temperature is in near equilibrium with the applied forcing continues to hold. Forcing over the twentieth century other than that due to greenhouse gases, ascribed mainly to tropospheric aerosols, is estimated as -1.1 ± 0.7 W m-2.
As mentioned, the upper limit of the adjusted sensitivity is within reach of the IPCC estimate, but the lower limit is within the range of the original article.
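Since the sensitivity in Schwartz's scheme is the quotient of time constant and heat capacity, the roughly 70% longer time constant carries straight through to the revised numbers (a sketch, with the heat capacity and per-doubling forcing as quoted earlier in the thread):

```python
tau_revised = 8.5   # years (was 5 in the 2007 paper)
heat_cap = 17.0     # effective planetary heat capacity, W yr m^-2 K^-1
f2x = 3.7           # W m^-2 per CO2 doubling

sens = tau_revised / heat_cap
print(round(sens, 2), round(sens * f2x, 2))  # ~0.5 K/(W m^-2), ~1.85 K,
                                             # i.e. the quoted 1.9 after rounding
```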

Could you please comment on which graph shows global temperatures most accurately (in your opinion)? Which data set is it based on? Could you please provide such a graph if you don't mind. Thanks!
 
  • #47
FTP said:
I read the comments to the heat capacity paper. The author replied to the comments: http://www.ecd.bnl.gov/steve/pubs/HeatCapCommentResponse.pdf

Good on you! Actually reading all the comments to the paper, and the response, shows commendable thoroughness! I skimmed through pretty quickly, I must confess.

For others interested, I'll copy forward into this post all the links and references for the original paper, and the comments and response (which ftp cites above) as they appear in volumes 112 and 113 of the Journal of Geophysical Research. (Also given in [post=2171973]msg #28[/post].) The first reference is the original paper, the next three are comments, and the last is the response of Schwartz to the comments -- which FTP has also linked above, using a freely available copy at Steve Schwartz' website.
  • Schwartz, S.E. (2007) http://www.agu.org/pubs/crossref/2007/2007JD008746.shtml, in J. Geophys. Res., 112, D24S05, doi:10.1029/2007JD008746.
  • Foster, G., J. D. Annan, G. A. Schmidt, and M. E. Mann (2008), http://www.agu.org/pubs/crossref/2008/2007JD009373.shtml, in J. Geophys. Res., 113, D15102, doi:10.1029/2007JD009373.
  • Knutti, R., S. Krähenmann, D. J. Frame, and M. R. Allen (2008), http://www.agu.org/pubs/crossref/2008/2007JD009473.shtml, in J. Geophys. Res., 113, D15103, doi:10.1029/2007JD009473.
  • Scafetta, N. (2008), http://www.agu.org/pubs/crossref/2008/2007JD009586.shtml, in J. Geophys. Res., 113, D15104, doi:10.1029/2007JD009586.
  • Schwartz, S. E. (2008), http://www.agu.org/pubs/crossref/2008/2008JD009872.shtml, in J. Geophys. Res., 113, D15105, doi:10.1029/2008JD009872.

As mentioned, the upper limit of the adjusted sensitivity is within reach of the IPCC estimate, but the lower limit is within the range of the original article.
A realistic estimate of sensitivity must consider all the research, and there's a lot of empirical evidence that pretty much rules out anything as low as 1.5. Hence Schwartz' low estimate is not really relevant. The more important question is… can his UPPER estimate actually apply? If so, then he's provided a useful upper bound that narrows possibilities significantly.

Look at the numbers here.
  • 1.5 – 4.5 (IPCC 4AR, WG1, chapter 9, page 666)
  • 0.6 – 1.6 (Schwartz, original paper, 2007)
  • 0.9 – 2.9 (Schwartz, in response to criticism, 2008)

The IPCC estimate is generally around 1.5 to 4.5, and "likely" to be between 2.0 and 4.5. This is not a single scientific estimate, because the IPCC is basically a review of all available literature; the range proposed here considers many different lines of evidence. The IPCC tends to err on the side of caution, to encompass a broad range of views.

Climate models seem to indicate pretty strongly a narrow range of what is credible, more like 2.5 to 3.5 these days. That's probably a good bet; but it is the nature of things that models can go wrong, and so the wider bounds from empirical estimates (which I used in the original post) are still important.

Schwartz' upper bound is right at about the most likely value, of 3. A value of 2.8 or 2.9 is pretty normal in models; that showed up in some modeling papers from the NASA climate group I was looking at recently. The difficulty is that most scientists working on this still seem to think that the method Schwartz is using is not particularly meaningful. The original paper was pretty clearly wrong, and I don't think there's a lot of interest in the attempt to resuscitate the method with better control of the autocorrelation problems. I think mostly this will be ignored. His lower bound is irrelevant, because there are already much stronger lower bounds available. The upper bound would be relevant if the method was better founded, but I think most people think it's too weak to apply with any confidence. So although Schwartz' upper bound of 2.9 is probably a pretty dashed good estimate as a magnitude, realistically scientists are going to keep an open mind on the range up to 4.5. Above 4.5 is very unlikely.

However – looking at the comments of Scafetta, for example – it seems credible to me that what Schwartz' method actually uncovers is the transient response, which is less than the equilibrium response used to define sensitivity. The IPCC estimates this as between 1.0 and 3.5 (right after the conventional sensitivity estimate, on page 666)… and this is pretty close to the range obtained by Schwartz.

Transient response is closely related to what I mentioned in my previous post, about time lags and response "in the pipeline". If you ignore the longer term equilibrium response, then you are probably close to what Schwartz is trying to measure, and he adds nothing much to that at all.

I'm not an expert on this; just an amateur who has fun reading the experts. My opinion is not worth much on my own authority, but it looks that way to me on reading the various comments.

I very much doubt that his method holds up well enough to really give much of a useful additional constraint.

Could you please comment on which graph shows global temperatures most accurately (in your opinion)? Which data set is it based on? Could you please provide such a graph if you don't mind. Thanks!

Schwartz is using the two main global anomaly datasets; the GISS dataset from the NASA climate group at the Goddard Institute of Space Studies, and the HadCRUT3 dataset provided by the Hadley Centre in the UK. (called CRU, in Schwartz' response to comments.)

There are slight differences between the two datasets, which are mainly to do with the methods used to extrapolate into regions of the globe where there is poor coverage – especially the Arctic. Personally, I don't think either should be considered as better, but that it is a good thing to consider them both. If you run any analysis on global anomalies, then you should repeat it on several datasets if at all possible, rather than single out one as better. I keep both datasets in a single spreadsheet for my own calculations, and can switch very quickly from one to the other with any analysis I run for my own interest. There are some others available as well, but these are the easiest to obtain and use for yourself.

The Hadley Center gives easy access to their data, and provides some really excellent diagnostic graphs at their website. Go to HadCRUT3 Diagnostics, and especially the nh+sh data, which they recommend for general use. There are nice graphs in the second link, and links to download the data in ascii form. You can follow the "home" link back to look at other datasets available.

The Goddard Institute for Space Studies provides a page on Surface Temperature Analysis, with links to graphs and data and various other goodies.

Here's a very simple plot of the two datasets together. (They have a different baseline, so you need to renormalize to get them aligned vertically.)
[Attached image: gisscru.jpg, the GISS and HadCRUT3 global anomalies plotted on a common baseline]


Cheers -- sylas
 
  • #48
If the climate warms, wouldn't a larger percentage of Earth's IR emission shift toward a warmer (shorter wavelength) spectrum of IR radiation which lies in the range of the "transparent window"? If so, wouldn't that be another response (like convection) which would serve to moderate the global temperature within certain bounds. Similar to a pressure relief valve, only dealing with IR emission.
 
  • #49
skypunter said:
If the climate warms, wouldn't a larger percentage of Earth's IR emission shift toward a warmer (shorter wavelength) spectrum of IR radiation which lies in the range of the "transparent window"? If so, wouldn't that be another response (like convection) which would serve to moderate the global temperature within certain bounds. Similar to a pressure relief valve, only dealing with IR emission.

Yes indeed... the spectrum shifts a bit as the planet heats up.

As the peak emission spectrum moves towards higher frequencies, you get the infrared window moving up towards the peak... but you also have the main absorption window moving up over the peak as well, with the secondary absorption region around wavenumber 1000 growing in significance.

You can have a look at this yourself using the calculator at http://geosci.uchicago.edu/~archer/cgimodels/radiation.html which has been used a couple of times in these threads. Run for the default values, and then run again with a significantly higher surface temperature. Since the peak of Earth's thermal emission is already inside the large absorption band where carbon dioxide has its strongest effect, you can see the difference by comparing the relative height of the spectrum at the shoulders of this absorption window.

Here, for example, is the default case, and a repeat with 10 degrees rise in temperature (changing nothing else).
[Attached image: tempshift.GIF, model spectra for the default case and for a surface 10 degrees warmer]

Obviously, this involves a whole heck of a lot more thermal emission. To get back into energy balance at 10 degree additional surface temperature is going to take big increases in various greenhouse gases. But we are looking for a shift in the thermal spectrum. You can use the main absorption band as a reference point. On the left image (current conditions) the left side of the band at about wavenumber 580 is just a fraction lower in intensity than the right side, at wavenumber 760 or so. On the right image (10 degrees extra) if you look really carefully you can see this has reversed, as the main absorption band is passing across the peak in the thermal spectrum.

The effect, if any, is going to be pretty small. The main absorption band is still centered smack in the middle of the peak of Earth's thermal emission. But there's one much more fundamental lesson to be learned with this exercise...
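One way to put a number on "pretty small": Wien's displacement law in wavenumber form (peak of the Planck curve per unit wavenumber, a standard physical constant rather than a figure from the calculator) gives the peak shift for a 10 degree warming:

```python
# Peak of the blackbody emission spectrum, in wavenumbers, via Wien's
# displacement law in its per-wavenumber form: nu_max ~= 1.9611 * T.
WIEN_WAVENUMBER = 1.9611  # cm^-1 per kelvin

for temp in (288.0, 298.0):
    print(temp, round(WIEN_WAVENUMBER * temp, 1))  # peak position, cm^-1
```

A 10 K warming moves the peak by only about 20 cm-1, small against the main CO2 absorption band spanning roughly 580 to 760 cm-1 in the plots, which is why the spectral shift by itself matters so little.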

The fact that you can observe this with the calculator tells you that this effect is already taken into account. Climate models are based on fundamental physics, including the spectrum of blackbody radiation. The calculation of surface emissions in climate models uses the same physics that you are using to conclude that the windows of absorption and transparency move towards the peak of thermal emission.

Climatology and greenhouse effect is not hacked guesswork. It's applied physics, used by experts.

Cheers -- sylas
 
  • #50
sylas said:
The calculation of surface emissions in climate models uses the same physics that you are using to conclude that the windows of absorption and transparency move towards the peak of thermal emission.

Climatology and greenhouse effect is not hacked guesswork. It's applied physics, used by experts.

Cheers -- sylas

The transparent windows do not move; they are fixed wavelengths lying between the bands absorbed by CO2, water vapor, and other GHGs.
The surface temperature, and thus the emission spectrum, varies, so the percentage of freely escaping outgoing IR for any given concentration of GHG is not a constant (even before considering dynamic processes like convection.)

For example, wouldn't more vegetative cover result in a darker surface and thus a shorter (warmer) wavelength IR emission from the surface? Land use satellites monitor these color changes regularly, and globally.

You are correct that the physics of the greenhouse effect is not hacked science (although the term "greenhouse" is something of a misnomer). The basic laboratory physics is well known and demonstrable.

The application of this solid physics to a dynamic fluid system in order to formulate extremely long term projections appears to be where the "hacked guesswork" lies.

Grins...
 
  • #51
skypunter said:
The transparent windows do not move; they are fixed wavelengths lying between the bands absorbed by CO2, water vapor, and other GHGs.
The surface temperature, and thus the emission spectrum, varies, so the percentage of freely escaping outgoing IR for any given concentration of GHG is not a constant (even before considering dynamic processes like convection.)

Oops. Quite right, of course. Thanks for picking that up.

I tend to think of velocity as relative! (I must have been thinking of the relativity threads I'm engaging... :blushing: I'm aware that the bands are fixed, but I phrased things there using the spectrum peak as a reference frame.)

If you consider the spectrum as a reference point, then the bands move; but of course you are quite right that it is actually the peak of the emission spectrum which is moving and I should have phrased it the other way around. I'm not actually thinking that there is a change in the absolute frequency of the transmission and absorption bands! But did you look at the diagrams that show just how little shift there is in the emission spectrum, for a temperature increase of 10 degrees?

The percentage difference as a result of the movement in the peak of the emission spectrum is trivial; and note that the tropopause spectrum doesn't actually move much at all, unless maybe in the other direction! In either case, what matters for climate is absolute quantities of emission radiation... the forcings. The calculation of absolute radiation values already incorporates the shift of spectrum with temperature. It falls out naturally from the fact that all this is done with valid physics. If you try to single out the movement of the emission spectrum towards higher frequencies, you are looking at a completely trivial effect, which is already fully a part of the calculations of total impact.

For example, wouldn't more vegetative cover result in a darker surface and thus a shorter (warmer) wavelength IR emission from the surface? Land use satellites monitor these color changes regularly, and globally.

Yes. This is a part of climate models also, in the most recent generation of models. The net effect is small, but it can matter with regional forcings.

You are correct that the physics of the greenhouse effect is not hacked science (although the term "greenhouse" is something of a misnomer). The basic laboratory physics is well known and demonstrable.

The application of this solid physics to a dynamic fluid system in order to formulate extremely long term projections appears to be where the "hacked guesswork" lies.

Grins...

Shrug... I take your point; except that hacked guesswork is not a good description IMO. Fluid mechanics is not the major problem. For a better example of where models are a long way short of thorough physical modeling, consider cloud effects. A cloud is much smaller than a grid cell in a climate model, and so the models use abstractions, that summarize broad aspects of cloud cover in a region; percentage cover, altitude distributions, composition etc. There's a lot of work in making and testing such abstractions. So while it is certainly true that they are a long way short of a complete physical model, it is a lot better than guesswork.

Another point where models lack the resolution to capture the physics in detail is ocean transport of heat. Small scale eddy effects are unclear and have to be represented with parametrized abstractions... and how those change over time as the planet heats up is another uncertainty.

As I have said recently in another forum... an appropriate degree of scientific skepticism is important when looking at a complex subject like this. What's proper is the normal practice of working scientists right now. They don't make grand claims of perfection where there is real uncertainty, and papers are typically hedged throughout with explicit recognition of problems and uncertainties. As a body of literature, climate studies are full of open debate and disagreement, and something like the IPCC assessment reports has uncertainties and alternatives strongly emphasized throughout. It's also important to look at the impact of uncertainties. Some aspects of climate are fairly robust in the face of uncertainty in other facets.

Where skepticism goes off the rails is when it turns into a head-in-the-sand refusal to accept anything at all until it's all perfect. I'm not saying that's you, by the way! But it is certainly common in the "denialist" literature, and it is so blatant that a phrase like "denialist" is appropriate. It's not skepticism any more at that point, in my opinion.

Some things are known as well as we know anything in physics... like the absorption bands of CO2 and increased thermal absorption with increasing concentrations, which can be studied in a lab as you note. And yet it's all grist to the mill for the hard core denialist, who will even seize upon pseudoscience like the recent Gerlich and Tscheuschner paper denying that the atmospheric greenhouse effect works at all, cited recently in this thread.

With this thread I'm not trying to solve the whole climate problem, but to help explain greenhouse gases; and carbon dioxide in particular necessarily stands as a major driver of shifting climate in recent decades. That's a fact, as much as anything is a fact.

Cheers -- sylas

PS. You're right that the word greenhouse is not perfect... but it's not that bad either. Both an atmospheric greenhouse and a glasshouse work by trapping heat and inhibiting the vertical transport of heat up from the surface. The main factor in a glasshouse is blocking convection, and the main factor in the atmospheric greenhouse is blocking radiant emission, so physically the process is somewhat different. But the net effect is similar and for much the same reason: the surface has to heat up more to shed the solar energy it is receiving.
 
  • #52
Your last post puts us in agreement at least as to the nature, if not the degree, of uncertainty in climate models.
I agree that scientists are generally aware of the uncertainties. It's the mainstream media and the political body that it feeds which fail to take them into account.
Your point is well taken that this thread is concerned only with the impact of CO2. Please excuse the divergence.
Sincere thanks for the lively discussion!
 
  • #53
skypunter said:
Your last post puts us in agreement at least as to the nature, if not the degree, of uncertainty in climate models.
I agree that scientists are generally aware of the uncertainties. It's the mainstream media and the political body that it feeds which fail to take them into account.
Your point is well taken that this thread is concerned only with the impact of CO2. Please excuse the divergence.
Sincere thanks for the lively discussion!

I think there may also be issues about how scientists present their findings about climate models to the lay community. There is an interesting paper about it: Seductive Simulations? Uncertainty Distribution Around Climate Models

http://www2.geog.ucl.ac.uk/~mdisney/teaching/1006/papers/lahsen_gcm.pdf

There is also the issue that the underlying process may be chaotic and not predictable even in theory.
 
  • #54
joelupchurch said:
There is also the issue that the underlying process may be chaotic and not predictable even in theory.

I cannot resist.

This link describes an attempt at practical use of fluid dynamic modelling. It might be viewed as a microcosm of the climate modeling dilemma.

http://www.lassc.ulg.ac.be/bibli/MinetHeyen-2001.pdf

Many blast furnace designers have returned to designing furnaces the old fashioned way, trial and error.

Another apology for another side-track.
 
  • #55
Here is a new study using the MIT Integrated Global System Model: http://globalchange.mit.edu/pubs/abstract.php?publication_id=1974

Abstract

The MIT Integrated Global System Model is used to make probabilistic projections of climate change from 1861 to 2100. Since the model's first projections were published in 2003 substantial improvements have been made to the model and improved estimates of the probability distributions of uncertain input parameters have become available. The new projections are considerably warmer than the 2003 projections, e.g., the median surface warming in 2091 to 2100 is 5.2°C compared to 2.4°C in the earlier study. Many changes contribute to the stronger warming; among the more important ones are taking into account the cooling in the second half of the 20th century due to volcanic eruptions for input parameter estimation and a more sophisticated method for projecting GDP growth which eliminated many low emission scenarios. However, if recently published data, suggesting stronger 20th century ocean warming, are used to determine the input climate parameters, the median projected warming at the end of the 21st century is only 4.1°C. Nevertheless all our simulations have a much smaller probability of warming less than 2.4°C, than implied by the lower bound of the IPCC AR4 projected likely range for the A1FI scenario, which has forcing very similar to our median projection. The probability distribution for the surface warming produced by our analysis is more symmetric than the distribution assumed by the IPCC due to a different feedback between the climate and the carbon cycle, resulting from the inclusion in our model of the carbon-nitrogen interaction in the terrestrial ecosystem.

Full article available here: http://dx.doi.org/10.1175/2009JCLI2863.1
 
  • #56
skypunter said:
Many blast furnace designers have returned to designing furnaces the old fashioned way, trial and error.

Not wishing to derail the topic, but do you have any reference or source to support that assertion?

And what do you mean by many? A majority? A significant minority? More than three?
 
  • #57
Sylas said:
the hard core denialist, who will even seize upon pseudoscience like the recent Gerlich and Tscheuschner paper denying that the atmospheric greenhouse effect works at all...

The surface has to heat up more to shed the solar energy it is receiving.
Still, the muddled thinking I see...

The surface does no such thing. I don't even think convection is the real key difference. Most of the so-called global warming occurs at night over the Arctic. How far north can one now build a successful greenhouse that rarely needs extra heating by artificial means? I doubt that has changed much. The loss of glaciers has more to do with albedo (as well as a key process, transpiration) than with a trace gas called carbon dioxide, which is in any case vital as plant food and therefore vital to humans. This is not to say drinking water isn't important, but in the Arctic circle or just below it, that is hardly a concern compared to the bitter cold!

MrB.
 
  • #58
The dipole physics of the CO2 molecule are well known.
The premise of this thread, "Estimating the impact of CO2 on global mean temperature" seems moot without discussing the extent of the mitigating or extenuating effect of dynamic processes such as albedo change, circulation and variable insolation, to name a few.
Otherwise the percentage of CO2's effect may only be compared against the known effect of other gases "in a jar".
Such a partial equation is of little practical benefit.
 
  • #59
skypunter said:
The dipole physics of the CO2 molecule are well known.
The premise of this thread, "Estimating the impact of CO2 on global mean temperature" seems moot without discussing the extent of the mitigating or extenuating effect of dynamic processes such as albedo change, circulation and variable insolation, to name a few.
Otherwise the percentage of CO2's effect may only be compared against the known effect of other gases "in a jar".
Such a partial equation is of little practical benefit.

What premise do you mean? I'd really like to know. I have deliberately kept this thread pretty basic, WITHOUT reliance on premises other than what is very solidly established science. Go back to the OP and look at the six steps for how I have tried to be clear about what I am assuming.

I'm not trying to solve the whole climate problem. I am trying to help explain one issue which stands as one of the most solidly confirmed discoveries of climate science, and yet which remains a widespread focus of poorly informed public "skepticism".

CO2 has a straightforward direct impact on temperature, by absorbing thermal radiation. The effect is very well quantified, and not in any scientific doubt whatsoever. It is a large effect. It is not based on dubious inferences or indirect correlations. When you add substantial amounts of greenhouse gases to the atmosphere, you are bound to be increasing temperatures as surely as if you increased solar luminosity. It's THAT basic.
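To make the "very well quantified" claim concrete, here is a minimal sketch (mine, not a calculation from this thread) of the standard numbers involved. It uses the logarithmic forcing approximation of Myhre et al. (1998), ΔF = 5.35 ln(C/C₀) W/m², and a Planck (no-feedback) response of roughly 3.2 W/m² per K; both values come from the published literature, and the function names are my own.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from a CO2 concentration change,
    using the standard logarithmic approximation of Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def no_feedback_warming(forcing, planck_response=3.2):
    """Equilibrium surface warming (K) with no feedbacks, dividing the
    forcing by the Planck response of roughly 3.2 W/m^2 per K."""
    return forcing / planck_response

# Doubling CO2 from the pre-industrial ~280 ppm:
f2x = co2_forcing(560.0)
print(round(f2x, 2))                        # ~3.71 W/m^2
print(round(no_feedback_warming(f2x), 2))   # ~1.16 K
```

The ~1.1 K no-feedback figure is the baseline that papers like Douglass and Christy (cited earlier in this thread) argue is close to the observed sensitivity; the mainstream estimates are higher because feedbacks amplify this direct response.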

To call this "of little practical benefit" is just surreal!

The "mitigating" processes you mention are no such thing.

Albedo effects are either part of the sensitivity of climate, in which case they add up to a net increase in sensitivity and a stronger impact for ANY forcing, or else (mainly as part of cloud impacts) they are another forcing, and a negative one at that. Either way they cannot displace the main conclusion of this thread: there is a very straightforward physical reason for emphasizing greenhouse effects, and CO2 in particular, as the major driving factor for heating up the planet in recent decades.

Circulation is about redistribution of energy. It can alter the rate at which the climate system comes to equilibrium, because it affects the rate at which heat is taken up into the ocean. But it is not a source of energy, and does not drive a trend in global net increase or decrease in temperatures. It is a major complexity in climate; but it is not a premise that makes a blind bit of difference for the main conclusions of this thread. I am not trying to make a climate model or calculate the rate at which temperatures will change or regional distributions of effects.

Variable insolation is another forcing that can be quantified... and it is much less than the greenhouse forcing, by far.
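As a rough illustration of that comparison (my arithmetic, not sylas's): converting a change in total solar irradiance to a global-mean forcing divides by 4 for sphere geometry and scales by the absorbed fraction (1 minus albedo). Even the full ~0.1% solar-cycle swing on a TSI of ~1361 W/m² then comes out well under the CO2 forcing accumulated since pre-industrial times.

```python
def solar_forcing(delta_tsi, albedo=0.30):
    """Global-mean forcing (W/m^2) from a change in total solar
    irradiance: divide by 4 (sphere vs. disc geometry) and keep
    only the absorbed fraction (1 - albedo)."""
    return delta_tsi / 4.0 * (1.0 - albedo)

# ~0.1% solar-cycle variation on a TSI of ~1361 W/m^2:
print(round(solar_forcing(0.001 * 1361.0), 2))  # ~0.24 W/m^2
```

By comparison, the Myhre et al. expression 5.35 ln(385/280) gives roughly 1.7 W/m² for the CO2 increase to date, several times larger than the solar-cycle figure.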

Look, there's no question that the whole scientific problem of understanding climate is difficult and involves many factors. I can appreciate that there are some skeptics who understand the basic physics of greenhouse effects and focus on genuinely open questions in climate science. But most popular climate "skepticism" is credulous naivety over points that have long since been well confirmed basic scientific discoveries. Most popular skepticism has all the validity and rigor of creationism or intelligent design in biology, and there's a place to help explain some of the really basic stuff.

In particular, many people still think that the whole carbon dioxide and greenhouse link to climate is dubious. It isn't. It is basic applied physics, and a foundation for all the real open questions.

Here at physicsforums we have an audience that is mostly reasonably clued up and interested in physics, and able to follow some of the details of WHY greenhouse effects are scientifically so uncontroversial as the major cause of global heating in recent decades.

Cheers -- sylas
 
  • #60
sylas said:
What premise do you mean?
Cheers -- sylas

The premise that one can assess or estimate the impact of one factor without considering all factors. The ratio of one factor to an unknown whole is an unknown.
 
