A low likelihood for high climate sensitivity

In summary, the authors argue that commonly quoted probabilities for very high climate sensitivity are not well founded. Climate feedbacks amplify the basic "Planck response" (the temperature increase that would occur if nothing other than temperature changed in reaction to a forcing), and this amplification naturally produces probability distributions with a long tail towards high sensitivity; but the paper argues that, once reasonable prior assumptions are made, there is only a low likelihood of sensitivity being as high as six degrees.
  • #1
sylas
I've been aware of this work for a couple of months, and it has at last got through the review processes and has been accepted for publication.

  • Annan, J.D., and Hargreaves, J.C. (2009) On the generation and interpretation of probabilistic estimates of climate sensitivity, to appear in Climatic Change. (http://www.jamstec.go.jp/frcgc/research/d5/jdannan/probrevised.pdf ).

I'm pleased to see this published, as I think it is a useful constraint on some excessively scary estimates that -- in the authors' view -- are not particularly well founded.

The abstract

Here is the authors' own abstract:
The equilibrium climate response to anthropogenic forcing has long been one of the dominant, and therefore most intensively studied uncertainties, in predicting future climate change. As a result, many probabilistic estimates of the climate sensitivity (S) have been presented. In recent years, most of them have assigned significant probability to extremely high sensitivity, such as P(S > 6oC) > 5%.

In this paper, we investigate some of the assumptions underlying these estimates. We show that the popular choice of a uniform prior has unacceptable properties and cannot be reasonably considered to generate meaningful and usable results. When instead reasonable assumptions are made, much greater confidence in a moderate value for S is easily justified, with an upper 95% probability limit for S easily shown to lie close to 4oC, and certainly well below 6oC. These results also impact strongly on projected economic losses due to climate change.

A climate sensitivity as high as six degrees would be a catastrophic prospect, and it is sometimes portrayed as a realistic low-likelihood worst case scenario. This paper is more encouraging, suggesting that non-negligible likelihoods for such high sensitivity are not well founded.

Background: what is sensitivity

Climate sensitivity is a measure of how much Earth's temperature changes in response to an imbalance in energy, called a forcing. Forcings can arise for all kinds of reasons: changes to the Earth's albedo, to the composition of the atmosphere, or to the solar input.

If everything about the makeup of the Earth remained unchanged, other than temperature, then it's comparatively straightforward to estimate how much temperatures increase for a given forcing. This turns out to be of the order of 0.3 degrees per W/m2 of forcing, and is sometimes called the "Planck response". That is, if you deliver about 1 extra watt per square meter over the whole Earth, and prevent anything other than temperature changing in response, then temperatures rise by about 0.3 degrees.
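As a rough cross-check on that figure (my own back-of-envelope calculation, not from the paper), you can linearise the Stefan-Boltzmann law about the Earth's effective emission temperature of roughly 255 K. A short Python sketch:
[code]
# Back-of-envelope estimate of the Planck response (illustrative only).
# Linearise the Stefan-Boltzmann law F = sigma * T^4 about the effective
# emission temperature T_e ~ 255 K; the no-feedback sensitivity is the
# inverse of the slope dF/dT = 4 * sigma * T_e^3.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_E = 255.0       # approximate effective emission temperature of the Earth, K

dF_dT = 4 * SIGMA * T_E**3       # ~3.8 W/m2 per K
planck_response = 1.0 / dF_dT    # ~0.27 K per W/m2

print(f"Planck response ~ {planck_response:.2f} K per W/m2")
[/code]
which comes out at about 0.27 K per W/m2, consistent with the 0.3 figure quoted above.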

However, the Earth does change in all kinds of ways. As temperatures alter, so does humidity, cloud, surface cover, vegetation, and lots more besides. These can all have further knock-on effects, to give a boost or a damping to the temperature changes. The amount of temperature change in reality, taking all these factors into account, is called sensitivity.

Sensitivity is one of the great unknowns of climate science. Most methods of estimating sensitivity obtain values in the range of 0.5 to 1.2 degrees per unit forcing. Frequently, sensitivity is quoted in degrees per doubling of CO2, since a doubling of CO2 is a simple and quite well understood benchmark forcing; note, though, that sensitivity is a general feature of climate, which bears upon any forcing, whether CO2 is involved or not.

Using 2xCO2 as the unit for a forcing, climate sensitivity estimates tend to be around 2 to 4.5 degrees per 2xCO2, with the possibility of values above or below this range. A wider range of 1.5 to 6 degrees is sometimes quoted as conceivable.

The long tail of uncertainty?

Nearly all the evidence indicates that the net sensitivity is greater than the Planck response. That is, the complexities of the climate system tend to amplify the response of the planet somewhat. The specific evidence for various sensitivity estimates is another topic I plan to take up sometime.

The processes that give this amplification or damping are called climate feedbacks, because they arise as a result of a temperature change, and then give a further contribution to the energy balance and so feed back into temperature again.

It is a feature of this amplification effect that you get distributions with a long tail. That is, if you assume a simple uniform range for the feedback factor (the map from a temperature change to the feedback-driven energy imbalance), the resulting probability distribution for the total sensitivity has a well-defined peak, a long tail off to the right towards higher sensitivity, and a fairly sudden cutoff to the left at low sensitivity.

A widely cited paper on this feature of sensitivity estimation is
  • Roe, G.H., and Baker, M.B. (2007) Why is climate sensitivity so unpredictable?, Science, 318, 629-632.

Here's a picture of a likelihood distribution from that paper, showing the long tail:
[Figure 2a from Roe and Baker (2007): the likelihood distribution for climate sensitivity, showing the long tail towards high values.]


Based on these kinds of analyses, it is widely reported that there's a likelihood, or probability, of about 5% that climate sensitivity could be as high as 6 degrees per 2xCO2. That's widely seen as a catastrophic outlook: a worst case scenario that should be considered a realistic risk.

What is probability, in this context?

There's a real philosophical problem here in identifying what probability even means. It's hard to make sense of "probability" here in terms of frequencies of outcomes for a repeated event.

The paper specifies at the outset that they are using "the standard Bayesian paradigm of probability as the subjective degree of belief of the researcher in a proposition". To give a likelihood in this way, it is necessary to take into account a "prior belief" for the probability distribution, which is then modified in the light of empirical evidence. This is necessarily subjective, in the selection of the "prior" to be used in an analysis.

The paper is focused upon the issue of choosing priors, and shows that non-negligible probabilities for extremely high sensitivities follow from the use of a "uniform" prior, which is taken as a way of representing "ignorance". The paper argues that this is not a sensible choice, and that such priors in fact encode quite extreme assumptions.

Conclusions

The IPCC 4AR gives bounds on climate sensitivity as "likely" to be in the range 2 to 4.5 degrees per 2xCO2, where "likely" is intended to reflect at least 66% confidence. These estimates come with the comment that "values substantially higher than 4.5°C still cannot be excluded".

In their conclusion, Annan and Hargreaves suggest: "Thus it might be reasonable for the IPCC to upgrade their confidence in S lying below 4.5oC to the “extremely likely” level, indicating 95% probability of a lower value."

The paper also considers the economic implications, for planners taking into account the cost and likelihood of these worst case scenarios.

All in all, in my opinion this is a welcome general contribution to the wide open question of climate sensitivity. The most likely sensitivity values remain around 3 degrees, or a bit less if some recent work on the Earth Radiation Budget Experiment is taken into account, with a substantial spread of uncertainty. But this paper argues that it is reasonable to consider the very large sensitivity values sometimes proposed to be extremely unlikely.

Cheers -- sylas
 
  • #2
My impression is that it's the paleoclimate data that has led to the extremely long-tailed uncertainty distribution for climate sensitivity. Of course, such data is filled with uncertainty and possible biases, but we also need to realize that the geology of the Earth today is not the same as it was in the past. Plate tectonics have changed the distribution of land masses so much that albedo feedbacks may have actually been greater 55 million years ago than they are now.
 
  • #3
Xnn said:
My impression is that it's the paleoclimate data that has led to the extremely long-tailed uncertainty distribution for climate sensitivity. Of course, such data is filled with uncertainty and possible biases, but we also need to realize that the geology of the Earth today is not the same as it was in the past. Plate tectonics have changed the distribution of land masses so much that albedo feedbacks may have actually been greater 55 million years ago than they are now.

I don't think so... there are many methods used to constrain sensitivity, and as far as I know none of them particularly stick out as having a long tail more than others. It's certainly true that sensitivity is likely to have altered over time; but I don't think that's the major cause for the long tail towards high sensitivity in many papers.

A key factor is that net feedback seems to be positive. As you increase the feedback strength, the response approaches the runaway state, where sensitivity races away towards infinity. We know that sensitivity is NOT infinite; but there's still this aspect that a feedback parameter (in units of W/m2 per degree) has a much greater impact as the sensitivity gets larger.

Mathematically, you can represent it like this. I'm using first order linear approximations for simplicity. Let ΔQ be a change in Earth's energy balance (units Wm-2). Let ΔT be the corresponding change in the mean global temperature (units K, or degrees Celsius).

Let λ0 be the Planck response, with units of K/(Wm-2). This is the temperature increase per unit forcing if nothing changes other than temperature. Let C represent the additional feedbacks, with units of Wm-2/K, which is the additional change to energy balance that occurs because of temperature driven changes. That is, given a temperature change ΔT, there is an additional change in the energy balance of CΔT, because of changes in the climate system arising from temperature.

The dimensionless "feedback factor" is simply f = λ0 C, which shows up as follows. The total temperature change for a forcing ΔQ will be
[tex]\begin{align*}
\Delta T & = \lambda_0 ( \Delta Q + C \Delta T ) \\
& = \Delta Q \frac{\lambda_0}{1 - \lambda_0 C} \\
& = \Delta Q \frac{\lambda_0}{1 - f}
\end{align*}[/tex]

The climate sensitivity is thus λ = λ0/(1-f); and the gain is 1/(1-f).
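To put some purely illustrative numbers on that (my own, not values from the paper): take λ0 ≈ 0.3 K/(Wm-2), a net feedback of C ≈ 2 Wm-2/K, and about 3.7 Wm-2 as the forcing from a doubling of CO2. Then
[tex]\begin{align*}
f & = \lambda_0 C \approx 0.3 \times 2 = 0.6 \\
\lambda & = \frac{\lambda_0}{1 - f} \approx \frac{0.3}{0.4} = 0.75 \ \mathrm{K/(W\,m^{-2})} \\
\Delta T_{2\times \mathrm{CO_2}} & \approx 0.75 \times 3.7 \approx 2.8 \ \mathrm{K}
\end{align*}[/tex]
which lands near the commonly quoted central estimate of about 3 degrees per doubling. Note how strongly λ depends on C in this regime: the same change in C moves λ much further when f is already close to 1 than when f is near zero.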

The long tail comes about because if you look at uncertainties in the magnitude of the feedback, and represent this as some kind of uniform distribution of possible values for C, you end up with a skewed distribution for λ.

As C approaches 1/λ0, f approaches 1, and the gain races away to infinity.

It's represented pictorially in Roe and Baker as follows:
[Figure 1 from Roe and Baker (2007): a symmetric distribution of feedback factors maps onto a skewed, long-tailed distribution of sensitivity.]

If you presume a symmetrical likelihood distribution of some kind for the feedback factor, the sensitivity ends up with a long tail, simply because of the way sensitivity races off to infinity as the feedback factor goes to 1.
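Here is a small Monte Carlo sketch of that effect in Python (every number is an assumption chosen for illustration, not a value from Roe and Baker or from Annan and Hargreaves): sample the feedback strength C from a symmetric distribution and look at the implied distribution of sensitivity.
[code]
# Illustrative sketch: a symmetric uncertainty in the feedback parameter C
# produces a skewed, long-tailed distribution for lambda = lambda0 / (1 - lambda0*C).
import numpy as np

rng = np.random.default_rng(0)
lambda0 = 0.3                        # Planck response, K per W/m2 (assumed)
C = rng.normal(2.0, 0.7, 100_000)    # hypothetical symmetric spread of feedback strength, W/m2 per K
C = C[C < 1.0 / lambda0]             # discard the runaway cases (f >= 1)

f = lambda0 * C                      # dimensionless feedback factor
sensitivity = lambda0 / (1.0 - f)    # K per W/m2
per_doubling = 3.7 * sensitivity     # K per doubling of CO2, taking ~3.7 W/m2 for 2xCO2

print("median       :", round(np.median(per_doubling), 1), "K per 2xCO2")
print("95th pctile  :", round(np.percentile(per_doubling, 95), 1), "K per 2xCO2")
print("5th pctile   :", round(np.percentile(per_doubling, 5), 1), "K per 2xCO2")
[/code]
The median comes out near 3 degrees per doubling, but the 95th percentile is several times higher, while the 5th percentile stays close to 1.5 degrees -- the long right tail and sharp left cutoff described above.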

The paper here is really mathematical. It's not a specific line of evidence on sensitivity (which I may take up another time in another thread). It's rather about the appropriate way to represent uncertainties -- since we are definitely uncertain.

I'd put it simply like this. It's not the case that we have no prior information about sensitivity. We know for a fact that sensitivity is not infinite, and hence that the feedback factor is less than 1. This is legitimate prior information; and it really corresponds to knowledge that very high feedback factors are a priori less likely than very low feedback factors. The non-negligible likelihoods often quoted for very high climate sensitivity are an artifact of the shape of the relation between sensitivity and feedback, combined with simplistic assumptions about prior likelihoods.

Cheers -- sylas
 
  • #4
Perhaps I should have stated that paleo-data has not been particularly useful in constraining climate sensitivity.

Anyways, isn't it the inertia of the system that is at the root of the potential for high sensitivity? That is, while we know there is some warming, we don't yet have a good handle on how long it will take for the system to regain equilibrium. In other words, the longer it takes to reach equilibrium, the larger the sensitivity will be, and vice versa.
 
  • #5
Xnn said:
Perhaps I should have stated that paleo-data has not been particularly useful in constraining climate sensitivity.

I think it has; though of course the caveat you mention is important to bear in mind. The sensitivity in the past is not necessarily the same as it is today.

But the paleo data has been quite helpful as an additional check on more direct measurements of the climate system in the present. The disadvantage of paleodata is that there are larger uncertainties in all the data and much of it is indirect, by proxies. The advantage is that you can look at longer periods of time and larger swings in temperature.

A paper I have cited previously on this is
  • Annan, J. D., and J. C. Hargreaves (2006), Using multiple observationally-based constraints to estimate climate sensitivity, Geophys. Res. Lett., 33, L06704, doi:10.1029/2005GL025259. (http://www.agu.org/pubs/crossref/2006/2005GL025259.shtml)

This is closely related to the paper cited in the first post of this thread, but deals much more with the actual numbers from different independent ways of estimating sensitivity. In this paper, the following lines of evidence are considered:
We then survey some recent attempts to estimate climate sensitivity using several different approaches: the global temperature trend over the last century; short-term cooling following volcanic eruptions; the climate at the Last Glacial Maximum; modern climatological patterns; and the global temperature change in the Maunder Minimum.

The constraint Annan and Hargreaves give for sensitivity based on the Last Glacial Maximum includes a substantial additional (albeit subjective) uncertainty to reflect possible differences in sensitivity, and ends up as a roughly Gaussian constraint on sensitivity with mean 2.7oC and standard deviation 1.7oC (sensitivity to the benchmark forcing of 2xCO2).
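Just to illustrate what that single constraint would imply on its own (a quick calculation of mine, not a result from the paper, which combines several constraints and ends up considerably tighter):
[code]
# Tail probabilities implied by the single Gaussian constraint N(2.7, 1.7)
# for S in degrees per 2xCO2 (illustrative; not the paper's combined estimate).
from scipy.stats import norm

mean, sd = 2.7, 1.7
print("P(S > 4.5) ~ %.2f" % norm.sf(4.5, loc=mean, scale=sd))   # ~0.14
print("P(S > 6.0) ~ %.3f" % norm.sf(6.0, loc=mean, scale=sd))   # ~0.026
[/code]
So even this one, relatively loose, paleoclimate constraint puts only a few percent of probability above 6oC on its own.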

I think there has also been some work using the Holocene to constrain sensitivity on the basis of paleoclimate data. It's all useful additional information.

Xnn said:
Anyways, isn't it the inertia of the system that is at the root of the potential for high sensitivity? That is, while we know there is some warming, we don't yet have a good handle on how long it will take for the system to regain equilibrium. In other words, the longer it takes to reach equilibrium, the larger the sensitivity will be, and vice versa.

Inertia and sensitivity are different things, which impact response in different ways. The primary factor for inertia in climate response is the large heat capacity of the ocean. In brief, inertia constrains how long it takes to get to equilibrium, and sensitivity constrains the magnitude of the new equilibrium once you get there.

One very useful constraint on sensitivity that I've mentioned before is the response of the climate system to large but transient volcanic forcings; and in particular how long it takes for the system to recover. This is not really the same as inertia. More inertia in the system would mean a smaller maximum response to the transient forcing, as it takes time for the effects to be realized; and of course more sensitivity means a larger response.

A relevant paper on this response to volcanic forcing is also one of the lines of evidence considered in Annan and Hargreaves (2006).

From the paper:
We begin with a simple piece of pedagogy to provide some insights into the factors that control the response to time-dependent external forcing. This response is determined primarily by the climate sensitivity and the thermal inertia of the climate system, with the relative importance of these two factors depending on the timescale of the forcing. For very slow (multicentury) timescale forcing changes the system is able to maintain near equilibrium with the forcing, so sensitivity effects dominate. For rapid forcing changes, such as those for the seasonal insolation cycle, the response is dominated by inertia effects. Volcanic forcing lies between these two extremes, and it is easy to demonstrate that both sensitivity and inertia effects are important.
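To illustrate that distinction with a toy model (entirely my own sketch, not taken from either paper; all parameter values are assumptions): a one-box energy balance model, with an effective heat capacity standing in for inertia and a sensitivity parameter setting the equilibrium response.
[code]
# Toy one-box energy balance model (illustrative only):
#   c_heat * dT/dt = Q(t) - T / lam
# Equilibrium warming is lam * Q, approached with e-folding time tau = c_heat * lam,
# so inertia (c_heat) sets the timescale while sensitivity (lam) sets the magnitude.
import numpy as np

def simulate(lam, c_heat, forcing, dt=0.01):
    """Integrate the one-box model with a forward-Euler step (dt in years)."""
    temps = np.zeros(len(forcing))
    for i in range(1, len(forcing)):
        temps[i] = temps[i - 1] + (forcing[i - 1] - temps[i - 1] / lam) / c_heat * dt
    return temps

dt = 0.01
years = np.arange(0, 50, dt)
lam = 0.8       # sensitivity, K per W/m2, i.e. roughly 3 K per doubling (assumed)
c_heat = 8.0    # effective heat capacity, W yr m^-2 K^-1 (assumed)

step = np.full(len(years), 3.7)             # sustained 2xCO2-like forcing, W/m2
pulse = np.where(years < 2.0, -3.0, 0.0)    # crude 2-year volcanic-like forcing, W/m2

T_step = simulate(lam, c_heat, step)
T_volc = simulate(lam, c_heat, pulse)

print("equilibrium warming      :", round(lam * 3.7, 2), "K")
print("e-folding timescale      :", round(lam * c_heat, 1), "years")
print("warming after 10 years   :", round(T_step[int(10 / dt)], 2), "K")
print("peak volcanic cooling    :", round(T_volc.min(), 2), "K (equilibrium would be", round(lam * -3.0, 2), "K)")
[/code]
With these numbers the sustained forcing works its way towards the full ~3 K equilibrium response on a timescale set by the heat capacity, while the short volcanic-like pulse never gets anywhere near its equilibrium value. That is why the size and recovery time of the cooling after large eruptions constrain sensitivity and inertia in different ways.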
 
  • #6
Maybe I'm being a bit simplistic, but isn't this paper pretty hard to reconcile with the paper MIT published earlier this year?

Sokolov A, Stone P, Forest C, Prinn R, Sarofim M, et al. (2009) Probabilistic forecast for 21st century climate based on uncertainties in emissions (without policy) and climate parameters. Journal of Climate

http://globalchange.mit.edu/files/document/MITJPSPGC_Rpt169.pdf

I only skimmed the paper, but the 5.1 degree centigrade increase seems to assume a CO2 level of 866 ppm. The table in the report has a 2 sigma range of 3.5 to 7.4 degrees. That works out to a 137% increase rather than a doubling, but since temperature should increase roughly logarithmically with CO2, that still seems very high.
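For what it's worth, here is a quick check of that arithmetic (the baseline CO2 levels are my own assumptions, not numbers from the MIT report):
[code]
# Quick arithmetic check of the post above (baselines are assumptions).
import math

co2_final = 866.0    # ppm, from the MIT scenario quoted above
co2_2000 = 366.0     # ppm, approximate year-2000 level (assumed)
co2_preind = 278.0   # ppm, approximate pre-industrial level (assumed)

print("increase over the 2000 level   : %.0f%%" % (100 * (co2_final / co2_2000 - 1)))   # ~137%
print("doublings from the 2000 level  : %.2f" % math.log2(co2_final / co2_2000))        # ~1.24
print("doublings from pre-industrial  : %.2f" % math.log2(co2_final / co2_preind))      # ~1.64
[/code]
So 866 ppm is about 1.2 doublings above the year-2000 level, or about 1.6 doublings above pre-industrial; since the forcing scales roughly logarithmically with concentration, the projected warming for that scenario corresponds to somewhat more than one doubling's worth of forcing, and can't be compared directly with a per-doubling sensitivity without fixing the baseline.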
 
  • #7
joelupchurch said:
Maybe I'm being a bit simplistic, but isn't this paper pretty hard to reconcile with the paper MIT published earlier this year?

To the contrary, it would seem that these two papers together help to confine climate sensitivity within closer bounds. One paper gives reason to expect a lesser probability of a low climate sensitivity, while the other reduces the likelihood of a high sensitivity. Something in the vicinity of +3 degrees C is looking more probable than ever, according to these studies.
 
  • #8
Maybe I have missed something here, but I find nothing in any of these formulations that resembles a real world scenario. Climate is both local and global, with enormous local variation within a fairly stable global energy budget. Measures of forcing such as W/m^2 have no meaning in nature, because no climate effect, including forcing, occurs uniformly over the whole globe. Yes, the effect of any forcing event may be reported in W/m^2 or in global average temperature numbers, but that may tell us almost nothing about the actual effects of such a change.

Right now, for example, we are finding large temperature changes in the very coldest parts of our planet, with much smaller changes in other parts. In fact, where such warming results in melting, it may cool the rest of the waters of the planet, at least for some period. Also, rises in sea level may have knock-on effects that have nothing to do with the albedo changes that are usually considered. Consider the changes in water chemistry, and the death of vegetation from salt water poisoning. In short, it matters a lot where these things happen, because locally extreme effects will often be more important than any possible future equilibrium, especially when the rates of local change exceed the ability of the local environment to adjust.
 
  • #9
sylas said:
The paper also considers the economic implications, for planners taking into account the cost and likelihood of these worst case scenarios.
I'm curious: assuming this paper is to be published in Geophys. Res., Science, or the like, why is the economic aspect appropriate in that venue? Or more importantly, what is the competence of the peer reviewers on the economic aspect of this paper? I would think the paper should become two: one on CO2 sensitivity in Geophys. Res., and one on the economic impact in, say, Econometrica.
 
  • #10
mheslep said:
I'm curious: assuming this paper is to be published in Geophys. Res., Science, or the like, why is the economic aspect appropriate in that venue? Or more importantly, what is the competence of the peer reviewers on the economic aspect of this paper? I would think the paper should become two: one on CO2 sensitivity in Geophys. Res., and one on the economic impact in, say, Econometrica.

It is due to be published in Climatic Change; I put this in the original citation, but without a link to the journal.

This is described as "An Interdisciplinary, International Journal Devoted to the Description, Causes and Implications of Climatic Change", so I guess this is part of the implications. Certainly the references in this paper show that the policy implications of sensitivity estimates have been a primary focus of a number of other papers in this particular journal (Betz 2007, Harvey 2007, Risbey 2007). In this paper, however, the primary focus is the sensitivity calculations, with policy implications as a minor add-on.

I'm inclined to speculate that it may have been advice from reviewers which brought this economic dimension to the limited prominence it has! The guts of the paper is all about the sensitivity estimation; the economic numbers seem to be there simply to demonstrate how the way scientists estimate sensitivity has substantial implications for economic estimates. But there's not nearly enough on economics to make a paper in its own right; and the authors have nothing to say about the specifics of economic analysis. They simply show how existing economic calculations are altered by their work on sensitivity estimation.

I'm not really all that interested in questions of where a paper should be published, and I have very little interest in the economic dimension anyway. But for what it is worth, the extent and limits of the economic stuff is set out in the paper as follows:
Thus, we do not claim here to provide a comprehensive probabilistic assessment of the economic harm associated with climate change. Rather, the economic assessment is intended to demonstrate how the details of the choice of prior may have a substantial downstream impact on users of climate science information. Our results are qualitatively insensitive to the particular choice of the DICE model as the basis for the economic cost. Alternative economic damage functions which also show substantial relative rises in harm across moderate to high temperature rises (say 3–10oC) would support a qualitatively similar analysis and conclusions.

Cheers -- sylas
 
  • #11
mheslep said:
I'm curious: assuming this paper is to be published in Geophys. Res., Science, or the like, why is the economic aspect appropriate in that venue? Or more importantly, what is the competence of the peer reviewers on the economic aspect of this paper? I would think the paper should become two: one on CO2 sensitivity in Geophys. Res., and one on the economic impact in, say, Econometrica.

The trouble is that I can't see any way of doing climate predictions without using an economic model to forecast future GHG emissions. The IPCC handles it by running their climate models using various economic models. They use 4 different economic scenarios as I recall. You can read the details here:
http://www.grida.no/publications/other/ipcc_sr/?src=/climate/ipcc/emission/

Now that I think about it, I think that MIT report I mentioned above is not using the IPCC scenarios.
 
  • #12
joelupchurch said:
The trouble is that I can't see any way of doing climate predictions without using an economic model to forecast future GHG emissions. [...]

Quite right... however this is not a concern with the paper being considered here. Annan and Hargreaves are not doing climate predictions at all... they are doing sensitivity estimates; a different thing.

Now one of the applications of estimates of sensitivity, or of pdfs giving a distribution of likelihoods for it, is climate prediction; and those predictions are then used in turn for estimates of economic costs.

Annan and Hargreaves don't actually do such estimates themselves, or give any critique of such methods. They focus nearly all their work on how to obtain estimates of sensitivity as likelihood distributions.

They do also look at the way sensitivity estimates are used in policy decisions, by taking an off-the-shelf cost estimate of some kind, which presumably considers some distribution of emissions and some way of translating that to atmospheric changes. They then show how the calculated cost varies when you vary the sensitivity estimates. They note that other cost estimation functions would show similar behaviour.

Summary: Annan and Hargreaves (2009) don't worry about comparing or developing cost estimates or climate predictions. They rather show the importance of a choice of priors for sensitivity estimations. They also show the scale of influence for any cost estimate, with reference to one example cost function as a demonstration.

Cheers -- sylas
 
  • #13
From Sylas' link

we establish at the outset that the notion of probability discussed here is the standard Bayesian paradigm of probability as the subjective degree of belief of the researcher in a proposition.

I wonder if this is science, as in the scientific method, or a most sophisticated variation of the argumentum ad populum?
 
  • #14
Andre said:
I wonder if this is science, as in the scientific method, or a most sophisticated variation of the argumentum ad populum?

The former. There is no "ad populum" used here, sophisticated or otherwise. Read the paper for more.

Felicitations -- sylas
 

What is "A low likelihood for high climate sensitivity"?

"A low likelihood for high climate sensitivity" refers to the concept that the Earth's climate may not be as sensitive to increases in greenhouse gas emissions as previously thought. This means that the planet may not experience drastic changes in temperature and weather patterns even with significant increases in greenhouse gases.

What does low climate sensitivity mean for the future of our planet?

If climate sensitivity is at the moderate end of the published estimates, the most extreme projected impacts of climate change become less likely. However, a moderate sensitivity of around 3 degrees per doubling of CO2 still implies substantial warming, so this result does not remove the case for reducing greenhouse gas emissions.

What evidence supports the idea of low climate sensitivity?

The paper discussed in this thread argues that the long tail of very high sensitivity values in many published estimates is largely an artifact of using a uniform prior, and that with more defensible priors the upper 95% bound falls close to 4 degrees. Independent lines of evidence -- the temperature trend over the last century, the cooling after volcanic eruptions, and paleoclimate data such as the Last Glacial Maximum -- also point to moderate central values of around 3 degrees per doubling of CO2.

What are the potential drawbacks of low climate sensitivity?

While a low likelihood for high climate sensitivity may seem like good news, it could also lead to complacency in addressing climate change. If we assume that the effects will be less severe, we may not take the necessary actions to reduce our greenhouse gas emissions and mitigate the impacts of climate change.

What further research is needed to better understand climate sensitivity?

Scientists are continuing to study the Earth's climate and develop more accurate models to better understand climate sensitivity. More research is also needed to examine potential feedback mechanisms that could affect the planet's sensitivity to greenhouse gases. Additionally, studying past periods of high levels of carbon dioxide can provide valuable insights into the Earth's climate system.
