Is the Faint Young Sun Problem Solved by Increased Greenhouse Gases?

Global warming is widely accepted as a scientific fact, with rising global surface temperatures and ocean temperatures, along with accelerating sea level rise due to thermal expansion and melting ice. The oceans have absorbed over 80% of the heat from global warming, which is a significant factor in climate change. Historical data shows that glaciers and ice caps have lost mass, contributing to sea level rise, while permafrost warming has been observed in various regions. The discussion also touches on the complexities of natural climate cycles and the impact of greenhouse gases, with some skepticism about the extent of human influence on climate change. Overall, the evidence strongly supports the reality of global warming and its implications for the planet.
  • #61
Andre said:
Maybe what also counts is the failure to retract a paper that by all standards obviously should have been retracted: http://www.agu.org/pubs/crossref/1999/1999GL900070.shtml better known as http://www1.ipcc.ch/pdf/climate-changes-2001/synthesis-spm/synthesis-spm-en.pdf . After the rejection of http://www.agu.org/pubs/crossref/2005/2004GL021750.shtml (M&M).

The allegations of M&M have been evaluated by two commissions/panels: an ad hoc commission under Wegman and the NAS panel under North. Both confirmed the critique of M&M, despite all attempts to obscure that. As shown before, this can be seen from the Senate hearings:



Maybe also an indication of the state of the science is the vigorous attempts to resurrect the hockey stick afterwards and the attempts to discredit the Wegman report. See this thread.

I think that it is accepted that some of the methods were inappropriate. I think that it is more important that the results are correct, except for an estimation of the errors.
 
Last edited by a moderator:
  • #62
Bored Wombat said:
I think that it is accepted that some of the methods were inappropriate. I think that it is more important that the results are correct, except for an estimation of the errors.

Yes; there was nothing here even remotely close to requiring a withdrawal. In fact, it remains one of the first papers reporting work of this kind, and the conclusions have been confirmed by many subsequent independent investigations.

The only real issue was the way in which principal component analysis was used in the first attempt at this kind of multi-proxy reconstruction in the 1998 paper. There are better ways to do PCA. There is no hint of fraud or error; merely a case where the original methods can be improved. As it turns out, repeating the analysis with the improved methods makes no significant difference to the results; but it is a better method, and Mann, Bradley and Hughes also use the improved PCA techniques in subsequent work.
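The centering question at issue can be illustrated with a small sketch (Python with NumPy, on purely synthetic data: random walks standing in for proxies). This shows the general PCA-convention point only, not a reproduction of the actual proxy analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "proxy" matrix: 100 years x 20 series of random walks.
# Purely illustrative -- not the actual multi-proxy network.
X = np.cumsum(rng.normal(size=(100, 20)), axis=0)

def first_pc(data, center_rows):
    """Leading principal component after centering each series on the
    mean of the rows selected by `center_rows`."""
    centered = data - data[center_rows].mean(axis=0)
    # The leading right-singular vector of the centered matrix is the
    # first PC loading pattern.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

pc_full = first_pc(X, slice(None))      # conventional: center on the full period
pc_short = first_pc(X, slice(80, 100))  # "short-centered": last 20 rows only

# The two conventions need not agree; |cosine similarity| = 1 would
# mean identical leading patterns.
similarity = abs(pc_full @ pc_short)
print(f"|cosine similarity| of leading PCs: {similarity:.3f}")
```

With full-period centering versus centering on a sub-period, the leading pattern can come out differently, which is why the choice of convention mattered in the methodological debate.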

This kind of criticism and response is perfectly normal in scientific work.

The idea that this is the kind of thing for which a paper is withdrawn is bizarre. It's a minor legitimate criticism; which makes no practical difference to the results, and which has resulted in improvements to the statistical methods in subsequent work, all of which continues to confirm the major conclusions of the original paper.

Cheers -- sylas
 
  • #63
Well thanks for the most excellent demonstration of my point, the defence of a confirmed flawed paper that may have been constructed because http://epw.senate.gov/hearing_statements.cfm?id=266543

Let's quote David Deming a bit:

In 1769, Joseph Priestley warned that scientists overly attached to a favorite hypothesis would not hesitate to "warp the whole course of nature." In 1999, Michael Mann and his colleagues published a reconstruction of past temperature in which the MWP simply vanished. This unique estimate became known as the "hockey stick," because of the shape of the temperature graph.

Normally in science, when you have a novel result that appears to overturn previous work, you have to demonstrate why the earlier work was wrong. But the work of Mann and his colleagues was initially accepted uncritically, even though it contradicted the results of more than 100 previous studies. Other researchers have since reaffirmed that the Medieval Warm Period was both warm and global in its extent.
 
Last edited:
  • #64
Andre said:
Well thanks for the most excellent demonstration of my point, the defence of a confirmed flawed paper that may have been constructed because http://epw.senate.gov/hearing_statements.cfm?id=266543

Let's quote David Deming a bit:

This is not a valid reference for the forum. It's a senate hearing; and a good sign of just who is really politicizing things in this topic.

In fact, the quote you emphasize is particularly absurd. It was given in a statement to a senate hearing by David Deming. He gives no source for the quote, saying only that it had been sent to him in an email. The email has never been provided; the person who allegedly made this statement has never been identified, and there is no context available to judge why it was said, by whom, or what exactly was supposed to be gotten rid of. As it turns out, of course, science continues to study the medieval warm period just fine.

Deming says that "[the work of] Mann and his colleagues was initially accepted uncritically, even though it contradicted the results of more than 100 previous studies".

This is a strange statement in all kinds of ways.

First -- so what if the work contradicts previous work? Isn't this permitted in science?

Second -- actually, the work didn't rule out the medieval warm period. The papers by Mann and colleagues are actually very circumspect on the MWP, and certainly don't claim to rule it out. For example, in Mann, Bradley and Hughes (1998) which is presumably what Deming refers to, the only mention is the following:
Given the high level of skill possible in large-scale reconstruction back to 1400 with the present network, it is reasonable to hope that it may soon be possible to faithfully reconstruct mean global temperatures back over the entire millennium, resolving for example the enigmatic medieval period.

Third -- Deming misrepresents the state of past science. The scope and extent of the medieval warm period has never been a settled thing, as he seems to suggest. It has always been enigmatic: Mann et al are more accurate than Deming on this. They refer to: Hughes, M. K. & Diaz, H. F. Was there a ‘Medieval Warm Period’ and if so, where and when? Clim. Change 26, 109–142 (1994). Deming's inference that the MWP was well understood is flatly incorrect.

Fourth -- the work was not "accepted uncritically". It was subject to quite detailed critical examination.

Fifth -- what counts more is SUBSEQUENT work. Multiple independent studies have continued to confirm the basic result of Mann, Bradley and Hughes, and the evidence is that the MWP existed, but it was primarily a NH phenomenon, and not as warm as the end of the twentieth century.

Sixth -- the alleged email Deming claims to have received but has never actually revealed doesn't make sense in the context of his own work. Deming's own paper to which he refers was strictly regional (USA) and there's no conflict at all with having regional climate extremes.

These senate hearings were under Senator Inhofe; as far as science is concerned they are irrelevant, and the quote by Deming doesn't come close to refuting the conclusions of the NAS study of the whole affair, or its confirmation in a host of ongoing research.

The REAL test of new ideas in science is subsequent research by the scientific community... which is what we OUGHT to be focusing on in this forum. Senator Inhofe's little stage play is meaningless. There are no indications of error or fraud or anything of the sort in the Mann et al paper that would suggest withdrawal was appropriate. The criticisms that can be made are reasonable, and have been addressed, and don't actually change anything.

What would make more sense for THIS science based forum is not endless focus on the 1998 paper, but a focus on all the ongoing work since by other scientists to try to replicate or falsify the work with the usual scientific method... repeating the analysis independently, with new data, new methods, new insights.

Cheers -- sylas
 
  • #65
http://www.sciencedirect.com/science?_ob=ArticleListURL&_method=list&_ArticleListID=1144196635&_sort=r&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=81ba2bc433606a5e75401a3ed1932a6c

But you are implying that David Deming, under oath, is lying. Anyway, it is not a secret who the person was who said "we have to get rid of the global warming". Deming told it to several others on some occasion.
 
  • #66
Andre said:
http://www.sciencedirect.com/science?_ob=ArticleListURL&_method=list&_ArticleListID=1144196635&_sort=r&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=81ba2bc433606a5e75401a3ed1932a6c

But you are implying that David Deming, under oath, is lying.

Actually no I am not. There's no implication of the kind. I am saying his statement is absurd. And I explained why. He's a rather odd person, that's for sure; not that this has any bearing on anything... I have no reason to presume he's honest, or dishonest; but his hearsay of an unknown email with no context and no source should have no bearing on anything either.

ESPECIALLY because we are in the science forum, and we have much much better ways to proceed; like independent scientific replication of the original work, to confirm or falsify it. There's a lot of that available, which is far more appropriate to the forum and far more useful for sorting out what matters.

By the way. Can you please give something a little bit informative when you give links? Some clue as to what you are actually linking to and what point it is trying to make? If you are trying to say the medieval warm period is still considered in science -- then that is what I have ALSO said in my post... and is further demonstration of just how idiotic it would be for anyone to say "we have to get rid of the medieval warm period".

Ironically, you could have made the same point (which is also a part of my point) simply by citing the most recent paper by Mann, Bradley and Hughes, and colleagues.

  • http://www.sciencemag.org/cgi/content/abstract/326/5957/1256
    by Michael E. Mann, Zhihua Zhang, Scott Rutherford, Raymond S. Bradley, Malcolm K. Hughes, Drew Shindell, Caspar Ammann, Greg Faluvegi, and Fenbiao Ni
    in Science 326(5957), 27 November 2009, pp 1256-1260

I guess whoever sent Deming that absurd email forgot to pass it on to Mann, Bradley and Hughes, eh?

Cheers -- sylas
 
  • #67
sylas said:
and is further demonstration of just how idiotic it would be for anyone to say "we have to get rid of the medieval warm period".

Maybe Try this

Sorry, no peer-reviewed article around; just the reports of the 1998 AGU fall meeting.
 
  • #68
Andre said:
But you are implying that David Deming, under oath, is lying. Anyway, it is not a secret who the person was who said "we have to get rid of the global warming". Deming told it to several others on some occasion.

His honesty, or lack of it, has little bearing on the science of global warming.
 
  • #69
Even I think it is a fact, but some continents like Europe are experiencing falls in temperature that they have never had before. Is global warming reversing itself?
 
  • #70
Andre said:
Maybe Try this

Sorry, no peer-reviewed article around; just the reports of the 1998 AGU fall meeting.

You are linking to a google search, for heaven's sake!

Can you actually link to something specific and relevant, and give an informative label to your links so that we have some idea what it is that you are linking to and what your point is intended to be? That would help.

One of your search terms is "Overpeck". It has been widely rumoured, on what basis I do not know, that Jonathan Overpeck was the source of the email sent to Deming. As far as I know, Deming has never explicitly confirmed this or shown the email, but there you go.

In any case, Jonathan Overpeck has responded to these rumours. Here's a report carried in the Arizona Daily Star (6 Dec 2009). (http://www.azstarnet.com/sn/printDS/320270 )

The comment that Overpeck may or may not have made — no one has produced e-mail evidence — was a statement to an Oklahoma researcher back in the 1990s that, "We need to get rid of the Medieval Warm Period."

Overpeck and some of his colleagues have said the Medieval Warm Period wasn't as warm as cracked up to be, or that it was warm only in parts of the world, such as Great Britain or northern Europe, and not globally.

[... snip comments by another person...]

This comment has been repeatedly reported — but without Overpeck's name attached — by longtime warming skeptic David Deming, a geophysicist at the University of Oklahoma. In an article published last March, Deming said that back in 1995, "one of the lead authors" of a just-finished U.N. report on climate change "told me that we had to alter the historical temperature record by 'getting rid' of the Medieval Warming Period." In 2006 testimony before a U.S. Senate committee, Deming said that in the 1990s, "… I received an astonishing e-mail from a major researcher in the area of climate change. He said, 'We have to get rid of the Medieval Warming Period.' "

For two years now, many bloggers have theorized that Deming was speaking of Overpeck, who before arriving at UA in 1999 was a leading National Oceanic and Atmospheric Administration paleoclimatologist. Reached at his Norman, Okla., home last week, Deming declined to comment.

Overpeck said last week that he had searched through his e-mails dating back a decade, and could find none like Deming referred to. Overpeck pointed out that he has written papers dating to the late 1990s saying that various records, including tree rings, stretching back 1,200 years, confirm earlier assertions that the Medieval period was warmer than today in the North Atlantic and northern Europe — but not globally.

"My papers are the record of fact, and in this case, I obviously did not try to get rid of the MWP," Overpeck said. "Instead, I have tried hard to be clear what it likely was and was not."

This news report confirms what I have said to you. It's nonsense to say anyone wants to get rid of the Medieval Warm Period. It continues to be discussed in the literature just fine by Overpeck, Mann, Bradley, Hughes, and heaps of other people. What on Earth Deming is talking about we don't know. He hasn't given a source or context for this alleged email.

Overpeck also refers to Deming in terms of complete puzzlement in the private emails that were stolen from the CRU. He apparently doesn't even know him well enough to spell the name right, and had to use google to find out about him.

So did I. And my reaction was much the same... Deming is quite an oddball. Who knows what he is thinking. He does not give the email or the context, so why are you taking it seriously?

Everyone else involved in this (Mann and his colleagues, Overpeck, and everyone else as far as I can tell) continue to refer to the medieval warm period just fine; so it's just silly to think any of them want to "get rid of the MWP". The questions researchers continue to investigate are about its magnitude and scope... which are surely valid questions.

Cheers -- sylas
 
  • #71
Eric McClean said:
Even I think it is a fact, but some continents like Europe are experiencing falls in temperature that they have never had before. Is global warming reversing itself?

If you are referring to the short term, then keep in mind that global warming does not mean there will no longer be winter weather. Even a year is considered a brief period for the climate. Generally 10 years is the shortest period, but technically a 30-year average is needed to establish the climate for a particular region.
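The point of the 30-year averaging convention can be sketched numerically. The numbers below (a 0.007 C/yr trend, 0.15 C of interannual noise) are illustrative values chosen for the example, not measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2010)
# Toy annual temperature anomalies: a small warming trend
# (0.007 C/yr, illustrative) plus year-to-year "weather" noise.
anomalies = 0.007 * (years - years[0]) + rng.normal(scale=0.15, size=years.size)

def climate_normal(values, window=30):
    """Trailing moving average over `window` years -- the style of
    averaging period used to define a climate normal."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")

normals = climate_normal(anomalies)
# The 30-year average fluctuates far less than the raw annual values.
print(f"raw std:      {anomalies.std():.3f}")
print(f"smoothed std: {normals.std():.3f}")
```

The long window suppresses the year-to-year noise while keeping the slow trend, which is exactly why a cold spell in one region for a few years says little about the global climate.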
 
  • #72
Absence of falsifiability

The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report. According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

One should not argue that a model built by scientists is a scientific model. To do so is to employ the logical fallacy of arguing from authority.
 
  • #73
Terry Oldberg said:
The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report.

Not quite. That is part of working out some of the details; but the major basis for identifying carbon dioxide as the major factor for warming is the thermodynamics of radiative transfer in the atmosphere, which allows one to calculate the associated forcing. This is not based on climate models.

No other forcing is known that is as large as this one. This is not based on models either; it's rather the case that the evidence indicates all the proposed factors simply have smaller, or negative, forcings. The major basis for this conclusion is empirical studies and measurements, not models.
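For the CO2 part specifically, the forcing computed from radiative transfer is commonly summarized by the simplified expression of Myhre et al. (1998), dF = 5.35 ln(C/C0) W/m^2. A minimal sketch; the 278 ppm pre-industrial baseline and 387 ppm present-day concentration are assumed round numbers for illustration:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998):
    dF = 5.35 * ln(C / C0) in W/m^2, relative to a pre-industrial
    baseline C0 (278 ppm used here as an illustrative value)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing at roughly present-day concentration (~387 ppm):
print(f"{co2_forcing(387):.2f} W/m^2")        # ~1.8 W/m^2
# Forcing for a doubling of CO2:
print(f"{co2_forcing(2 * 278):.2f} W/m^2")    # ~3.7 W/m^2
```

This gives roughly 1.8 W/m^2 for the industrial-era CO2 increase and about 3.7 W/m^2 per doubling, in line with the figures quoted in the IPCC reports.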

According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

This is a link to an archive of all Kevin Trenberth's contributions to the Climate feedback blog at nature.com. This is a good set of articles to learn more about how the science works.

It is not clear what article you are referring to, but I think the most relevant may be this one: http://blogs.nature.com/climatefeedback/2007/06/predictions_of_climate.html, from June 04, 2007. Here is the start and end of that article. The whole article is good as well.
I have often seen references to predictions of future climate by the Intergovernmental Panel on Climate Change (IPCC), presumably through the IPCC assessments (the various chapters in the recently completed Working Group I Fourth Assessment report can be accessed through this listing). In fact, since the last report it is also often stated that the science is settled or done and now is the time for action.

In fact there are no predictions by IPCC at all. And there never have been. The IPCC instead proffers “what if” projections of future climate that correspond to certain emissions scenarios. There are a number of assumptions that go into these emissions scenarios. They are intended to cover a range of possible self consistent “story lines” that then provide decision makers with information about which paths might be more desirable. But they do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. There is no estimate, even probabilistically, as to the likelihood of any emissions scenario and no best guess.

[...]

So if the science is settled, then what are we planning for and adapting to? A consensus has emerged that “warming of the climate system is unequivocal” to quote the 2007 IPCC Fourth Assessment Working Group I Summary for Policy Makers (pdf) and the science is convincing that humans are the cause. Hence mitigation of the problem: stopping or slowing greenhouse gas emissions into the atmosphere is essential. The science is clear in this respect.

However, the science is not done because we do not have reliable or regional predictions of climate. But we need them. Indeed it is an imperative! So the science is just beginning. Beginning, that is, to face up to the challenge of building a climate information system that tracks the current climate and the agents of change, that initializes models and makes predictions, and that provides useful climate information on many time scales regionally and tailored to many sectoral needs.

We will adapt to climate change. The question is whether it will be planned or not? How disruptive and how much loss of life will there be because we did not adequately plan for the climate changes that are already occurring?

I recommend people look at the whole thing, and the other articles in the blog as well. Kevin Trenberth is an expert in energy balance in particular, and is a fair-minded reporter of what is known and unknown in the science of climate. In particular, he points out that one important aspect of actually giving reasonable predictions of climate -- rather than general trends and understanding of the major forcings -- is being able to model the short term variations, like the Pacific Decadal Oscillation and other such factors. The science does indeed give falsifiable conclusions about the importance of carbon dioxide and the fact that it drives a substantial part of the current measured global warming phenomenon. But there are still many unknowns before definite predictions could be given.

The IPCC reports give broad ranges of likely outcomes for different emissions scenarios, and explicitly notes the uncertainties of the details in climate patterns as the planet warms.

Cheers -- sylas
 
  • #74


Terry Oldberg said:
The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report. According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

One should not argue that a model built by scientists is a scientific model. To do so is to employ the logical fallacy of arguing from authority.

This is all wrong. For one thing, "the basis for belief" is the fundamental physics of radiative transfer, and the "broad brush science" was laid down late in the 1800's and early in the 20th century. If you can show that CO2 is not a greenhouse gas and does not possess the radiative properties to absorb and emit infrared radiation, then you can "falsify" AGW as we know it (although "AGW" is a very ill-defined and broadly encompassing term, so we should establish what exactly we're trying to falsify). In fact, many of the more important parts of anthropogenic climate change, including stratospheric cooling, tropospheric warming, polar amplification, oceans heating slower than land, etc. (some of which are unique to greenhouse perturbation, others of which happen in a warmer climate of any cause) have their basis in theoretical or very simple radiative-convective models, and are not sensitive to the various assumptions which end up producing differing results across models.

Trenberth is probably referring to the fact that models make projections, not predictions. The difference may seem trivial to a non-specialist, but these things have very different implications. It's the difference between saying "If traffic is flowing like business-as-usual today, then I will arrive at my next destination in 30 +/- 4 minutes" versus "we will have no car accidents, no unusual number of red lights, no traffic jams, etc., and therefore we will arrive at our destination in 30 +/- 4 minutes." I hope this is clear. Aside from that (note I haven't actually researched the Trenberth quote), models do indeed give falsifiable projections, although doing so requires careful consideration of the timescale involved, the model and observational uncertainties, the "noise" in your record, etc.
 
  • #75
Bill Illis said:
Has anyone ever put Myhre's forcing estimates into the Stefan-Boltzmann equations?

If the total forcing increase from GHGs is 1.7 W/m2, the Stefan-Boltzmann equations predict very little temperature change from an increase this small.

Surface TempK Today = (390 W/m2/5.67E-08)^.25 = 287.98K = 15.0C

Surface TempK Pre-Ind = (388.3 W/m2/5.67E-08)^.25 = 287.66K = 14.7C

So either Myhre's estimates are not really the traditional watts/metre^2 measure we use normally or the Stefan-Boltzmann equations aren't even being used.
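The quoted arithmetic checks out for the zero-feedback, surface-flux case; a quick sketch inverting the Stefan-Boltzmann law with the quoted fluxes (390 and 388.3 W/m^2):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def sb_temperature(flux):
    """Invert the Stefan-Boltzmann law: T = (F / sigma)**0.25."""
    return (flux / SIGMA) ** 0.25

t_now = sb_temperature(390.0)   # about 287.98 K
t_pre = sb_temperature(388.3)   # about 287.67 K
dT = t_now - t_pre
print(f"no-feedback warming for 1.7 W/m^2 at the surface: {dT:.2f} K")
```

The roughly 0.3 K figure is only the no-feedback response to the flux change applied directly at the surface; the larger changes discussed in the literature come from the forcing at the top of the atmosphere combined with water vapor and other feedbacks, so this calculation alone does not settle the question.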

Bill,

You might find this interesting in view of the recent emails disclosed and what was left out.




Temperature Rise of the Earth from Global Warming derived from the Stefan-Boltzmann Law of Physics, ignored by the U.N.

Notes on global warming, one way or the other:

The Stefan-Boltzmann Law concerns the radiation striking the Earth and other bodies. It also covers the radiation of energy back into space from the earth.

It states:
Power = (Surface area of the Earth) times (the Stefan-Boltzmann constant) times (the net emissivity of the Earth) times (the temperature of the body) raised to the fourth power.

So let's use this formula with and without man made global warming, GW. The real problem is to eliminate the net emissivity variable which is where GW believers fuzz up the math, the science, and say this law of physics doesn't apply. Really? It does and that can be proved.

The surface area of a sphere like the solid Earth, SA, is almost constant. So is the Stefan-Boltzmann constant, SB, verified by Wien and Planck.

P1 = SA * E * SB * T1^4 without GW
P2 = SA * E *SB *T2^4 with GW from man

Just below, all the constants are combined into one number that is a new constant, K. Its value is about 1: (SA*E1*SB)/(SA*E2*SB). Let's assume the emissivity of the Earth is constant. If it wasn't, then the temperature fluctuations over time on this planet would be larger than they are, year to year or within a year. So K is really equal to one on a constant-climate Earth, for now.

So, dividing these two formulas yields:

P2/P1 = K * (T2/T1)^4 = 1 * (T2/T1)^4

Or for a warming of 0.1 degrees at our average temperature on Earth we get,

P2/P1 = K * [(288.1° Kelvin) / (288° Kelvin )]^4

P2/P1 is a power ratio that can be expressed as a fraction or percentage. T2/T1 can also be expressed as a percentage, but it varies as the fourth power.

All we need to do first is calculate the percentage of change of K from a slight 0.1 degree of temperature rise to get a feel for the change in E per 0.1 degree Kelvin or Centigrade.

(288.1º / 288º )^4 = 1.00138 = 1.0014

That's how much 0.1 degree of GW will change the constant K. It will change by 1.0014 or go up by a whopping 0.14%




Now let's look at the other side of the formula, the power ratio. Here we have the same problem. How do you measure the radiation from the activities from man alone? Well, let's go outside the environmentalist box and run a few numbers.

Here are the two facts we need:
3.9×10^22 Joules, the estimated energy contained in the world's TOTAL fossil fuel *reserves* as of 2003.
5.5×10^24 Joules, the total energy from the Sun that strikes the face of the Earth each year. This is the value of P1 normally hitting the earth.
P2 is really P1 plus the extra (heat) power from GW, Pgw, or P2 = P1 +Pgw. However, we are interested in only the GW portion from Pgw caused by man, not the total increase from natural heating by the Sun.

Now remember, this is the total energy from all fossil fuel reserves not yet burned, but let's burn all the fossil fuels up in one year in a super duper gas guzzler engine and the coal in a gazillion new Chinese and Indian power plants.

(3.9x 10^22 Joules burned up in one year) / (5.5 x 10^ 24 Joules from the sun per year) =

(0.709 X 10^-2) = 0.00709 or 0.709%
from only man's activities as defined above.

So if we burned up all the fossil fuels remaining on the Earth in one year, what would the resulting temperature rise be? K was only changed by 0.14% from our fourth power of T calculation.

0.14% raises the temperature of the Earth by 0.1 degree. So (0.1) * (0.709) / (0.14) is about 0.5 degrees Kelvin, Centigrade, or Celsius of warming from burning all the fossil fuel reserves in and on the Earth in one year!

What a big threat. The temperature will go up 0.5 degrees after we burn up a 100 to 200 year supply of all fossil fuels in one year. So the real temperature rise from the activities of man over time will be 0.5 degrees spread out over a hundred years, or 0.005 degrees per year, assuming a straight line plot of usage. If I use 200 years, it will be even lower.

In my calculations, I assumed my Earth had a constant emissivity because that is a sticky problem for environmentalists. The U.N. hates and doesn't use the S-B law, and besides, the true emissivity is hard to determine. Can we actually back-calculate the value of the emissivity of the Earth, or at least its range? Yes we can.

I assumed that E2/E1 was equal to one in my outside the box calculation.
Now E for the Earth is about 0.64. E can vary but how much does it vary because of MAN MADE GW? All we have to do is look at K = (SA*E2*SB)/(SA*E1*SB) = E2/E1.
So for the earth, E1 = 0.64. So any change in temperature has to be directly affecting E2, the new emissivity from the extra GW.

"The emissivities of terrestrial surfaces are all in the range of 0.96 to 0.99"******
"Clouds, however, which cover about half of the Earth's surface, have an average emissivity of about 0.5"******
"Taking all this properly into account results in an effective Earth emissivity of about 0.64"*****
"E ice is 0.98"
"E water is 0.67"
"E black stuff is 0.96"
"E aluminum foil is 0.09!"******
"E gold, polished foil is 0.03 (reflects infrared better than Al)"

Now the Earth is not a polished gold surface nor a perfect reflector. It is more like a mix of sand, dirt, clouds, water, ice, and the biggest greenhouse gas of all, water vapor. So we can increase the Earth's emissivity to a totally outrageous painted-black Earth, near the absurd E value of one, by using the factor 1.0 / 0.64. Applying this new directly proportional and ridiculous factor to the 0.005 degrees per year which I derived from the power side of the equation yields a new maximum increase of 0.0078 degrees per year, [(1 * 0.005) / 0.64]. What's this ridiculous temperature rise in 100 years? 0.78 degrees. You'll see this number as 0.74 later.

So let's check this calculation with KNOWN published GW facts. In the last 100 years, the Earth has only warmed less than a degree or 0.5 degrees.

IPCC_Fourth_Assessment_Report
"Warming in the last 100 years has caused about a 0.74 °C increase in global average temperature. This is up from the 0.6 °C increase in the 100 years prior to the Third Assessment Report."
So the real UN temperature rise from a new recalculation is being used to prove GW was only in error by a puny 0.14 degrees.

Oh darn. My bonfire guzzler temperature rise number of 0.5 degrees for a hundred years is off from the real rise of global temperatures of the "new" 0.74° C rise in the new "reevaluation" in the "new" UN report. Who's collecting this data and doing these calculations?

However, if the Earth was painted almost so-called black (E=0.999), then
(100 years) * [(1 * 0.005° C) / 0.64] is 0.78 degrees. So the UN number of 0.74 appears to be real and has just proved the Earth is painted black. Their effective emissivity value for E2 must be nearer to 0.99, but none of this matches the real color of the Earth, does it?

I am crushed. The infallible UN report has proved we were doomed over the last 100 years and are already dead. We were living on a black water world that absorbed all the energy since the Civil War in the USA.

So if the power ratio for man-made GW goes up by a factor of 0.00709, what is the temperature rise in a back calculation in our Stefan-Boltzmann derived formula as related to the rise or fall in E2? What is the real E2 of the Earth?
The total power ratio would be 1 plus 0.00709.

1.00709 * (E2/E1) * (SA*SB)/(SA*SB) = 1.00709 * E2/E1 = (T2/288)^4
For E1 = 0.64 for the real earth,
1.5736 * E2 = (T2/288)^4
The UN says the rise from man made GW is now recalculated to be 0.74 degrees. Fine.
So that's T2 = 288.74°
E2 = [(288.74/288)^4]/1.5736 = 0.642
So Mother Nature has buffered the effects of any global warming by using the biggest greenhouse gas of all, water vapor, which can change its state of matter to do that buffering, unlike the trace gas that stays a gas, CO2. E2 really is almost constant, as I initially assumed above.

Now let's look at the painted black Earth of the UN where E2 = 0.99
1.00709 * (E2/E1) * (SA*SB)/(SA*SB) = 1.00709 * E2/E1 = (T2/288)^4
For E2 = 0.99,
1.5736 * 0.99 = (T2/288)^4
1.5579^0.25 * 288 = T2 = 321.8
That's a whopping rise of 33.75 degrees C or K for a UN doomsday emissivity of 0.99 for the good black earth.
So 321.8°K yields an average temperature of the Earth of 119.6 degrees F. Really?
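Taking the post's own relation 1.00709 * E2/E1 = (T2/288)^4 with E1 = 0.64 at face value, the arithmetic above can be replayed in a few lines. This is only a sketch checking the numbers as stated in the posts, not a climate model:

```python
# Replay of the arithmetic in the posts above, using the posts' own relation
#   1.00709 * E2 / E1 = (T2 / 288)^4,  with E1 = 0.64.
# A check of the stated numbers only; not a physical climate model.

T1 = 288.0       # baseline mean surface temperature, K (as used in the posts)
E1 = 0.64        # baseline emissivity assumed in the posts
RATIO = 1.00709  # the posts' power ratio for burning all fossil reserves in a year

# Back-calculate E2 from the quoted 0.74 C rise:
T2 = T1 + 0.74
E2 = (T2 / T1) ** 4 / (RATIO / E1)
print(f"E2 implied by a 0.74 C rise: {E2:.3f}")   # ~0.642

# Forward-calculate T2 for the "painted black" case E2 = 0.99:
T2_black = ((RATIO / E1) * 0.99) ** 0.25 * T1
rise = T2_black - T1
fahrenheit = (T2_black - 273.15) * 9 / 5 + 32
print(f"T2 for E2=0.99: {T2_black:.1f} K, a rise of {rise:.1f} K ({fahrenheit:.1f} F)")
```

Both of the post's headline numbers (E2 close to 0.642, and a roughly 34 K rise for the E2 = 0.99 case) fall out of this relation as stated.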

Clearly the emissivity of the Earth changed very little to 0.642 and is almost constant.
Okay, it's a 0.31% rise in emissivity that the Earth's climate and man would need to produce the rise of 0.74 degrees C claimed in the UN report.
Remember I used the infallible UN's new temperature rise number, 0.74, and the factor for burning up all the fossil fuels on Earth in one year, 1.00709.

The levels of CO2 have risen 50% in the last hundred years but the emissivity has not changed much at all and neither has the Earth warmed up to the UN's 119.6 degrees F average temperature. A 50% rise in the emissivity would be E2 = 0.64 * 1.50 = 0.96. Oops! CO2 didn't make the emissivity rise to 0.96, did it?
My average temperature at my house has never ever peaked to 119.6 degrees F (E2 = 0.99).

So CO2 levels are changing wildly. What didn't change much? What is the greenhouse gas that can buffer the effects of the Sun's output and those of man? What is the only condensable gas that is a greenhouse gas above -30 C? What gas is present in high enough concentrations that minor fluctuations will not affect the average emissivity or the Earth's climate much? The answer is water vapor, the biggest greenhouse buffering gas of them all.

The burning of all the fossil fuel reserves on the Earth in one year is like throwing a tanker truck of sulfuric acid into the huge volume of the ocean and claiming you lowered the pH of the ocean an alarming amount after mixing it in. How about a pH change of one part per googolplex?
When things get out of control, Mother Nature either condenses water vapor, releasing its heat of vaporization; freezes water to ice, releasing its heat of fusion; heats up the top layer of the ocean to volatilize water and create clouds, absorbing the heat of vaporization; or melts ice to water, absorbing the heat of fusion. CO2 can't do any of this and is a trace gas.

Mother Nature created a beautiful water-molecule feedback system to maintain her various creations of DNA macromolecules and to buffer the effects of the variable output of her creation, our Sun. It's a beautiful system that follows the Stefan-Boltzmann law, proven by the likes of Max Planck, Wien, and others mentioned in the 1911 Nobel Prize in Physics lecture available online. Stefan and Boltzmann never knew how big and fundamental their law really was. It was Wien who hammered in the first nail to prove half the Stefan-Boltzmann law of physics. It was Planck and Planck's law that hammered in the last nail, proving the Stefan-Boltzmann law covers the wide range of radiation that must be part of any global heating or cooling.

No wonder the UN never used the Stefan-Boltzmann Law in their report to prove man made global warming.

cheers,

stefan.
 
  • #76
stefanslaw said:
The Stefan-Boltzmann Law concerns the radiation striking the Earth and other bodies. It also covers the radiation of energy back into space from the earth.

I have explained how this law is used in [post=2497769]msg #28[/post] of the thread.

stefanslaw said:
Now let's look at the other side of the formula, the power ratio. Here we have the same problem. How do you measure the radiation from the activities from man alone? Well, let's go outside the environmentalist box and run a few numbers.

Here are the two facts we need:
3.9×10^22 Joules, the estimated energy contained in the world's TOTAL fossil fuel *reserves* as of 2003.
5.5×10^24 Joules, the total energy from the Sun that strikes the face of the Earth each year. This is the value of P1 normally hitting the earth.
P2 is really P1 plus the extra (heat) power from GW, Pgw, or P2 = P1 + Pgw. However, we are interested in only the GW portion from Pgw caused by man, not the total increase from natural heating by the Sun.

This is not relevant. Human impact on the climate is not from the energy released, but primarily by the atmospheric greenhouse effect from changes in atmospheric composition.

In my calculations, I assumed my Earth had a constant emissivity because that is a sticky problem for environmentalists. The U.N. hates and doesn't use the S-B law, and besides, the true emissivity is hard to determine. Can we actually back-calculate the value of the emissivity of the Earth, or at least its range? Yes we can.

Note that the science reported by the IPCC does in fact use radiation laws correctly, including the Stefan-Boltzmann law. This is a law for a blackbody, which has no frequency dependence on emissivity. In general, Planck radiation laws are used. And they really are used.

I assumed that E2/E1 was equal to one in my outside the box calculation.

You also assumed the Earth radiates like a simple blackbody surface. However, the Earth has an atmosphere, which results in a difference between temperature at the surface, and temperature of thermal radiation emitted out into space. The greenhouse effect basically changes this difference. You can calculate that from radiation physics with a few details of the atmosphere; but you have to use frequency dependent Planck radiation laws, and consider equations for radiation through a transparent medium -- the atmosphere.

The major greenhouse gas in the atmosphere is water vapour, but the amount of water in the atmosphere is determined largely by temperatures. This makes it a feedback. The second most important greenhouse gas is carbon dioxide. As you add CO2, this increases the difference between surface temperatures and the temperature of radiation into space... effectively warming the surface. This in turn increases the capacity of the atmosphere to hold water, which amplifies the warming through a further greenhouse effect: the water vapour feedback.

Cheers -- sylas
 
  • #77
stefanslaw said:
However, if the Earth was painted almost so called black (E=0.999), then,
(100 years) * [(1 * 0.005° C) / 0.64] = 0.78 degrees. So the UN number of 0.74 appears to be real and has just proved the Earth is painted black. Their effective emissivity value for E2 must be nearer to 0.99, but none of this matches the real color of the Earth, does it?

Now let's look at the painted black Earth of the UN where E2 = 0.99
1.00709 * (E2/E1) * (SA*SB)/(SA*SB) = 1.00709 * E2/E1 = (T2/288)^4
For E2 = 0.99,
1.5736 * 0.99 = (T2/288)^4
1.5579^0.25 * 288 = T2 = 321.8
That's a whopping rise of 33.75 degrees C or K for a UN doomsday emissivity of 0.99 for the good black earth.
So 321.8°K yields an average temperature of the Earth of 119.6 degrees F. Really?

First, as Sylas explained it's not the heat from the combustion of fossil fuels that drives global warming. Instead the warming is due to the heat trapping properties of CO2 as it persists in the atmosphere.

Also, emissivity is not a measure of the color of the Earth's surface (that would be albedo). Instead, emissivity is a measure of how well the atmosphere transmits infrared radiation. It's basically the ratio of the infrared flux at the top of the atmosphere to that at the surface. The "thicker" the atmosphere, the lower the emissivity and the warmer the surface.

So, as emissivity rises the Earth surface temperature will fall in response.

Finally, by thickness, I do not mean to imply that emissivity is proportional to density.
CO2 is, after all, a trace gas, but when it comes to infrared radiation, it behaves like a dye.
It becomes especially important at higher elevations in the atmosphere where the amount of water vapor is low.
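The effective emissivity described above can be illustrated with round textbook numbers. The ~239 W/m^2 top-of-atmosphere infrared flux and 288 K mean surface temperature used below are commonly cited illustrative values, assumed here rather than taken from this thread:

```python
# Effective emissivity in the sense described above: the ratio of infrared
# flux leaving the top of the atmosphere to the flux emitted at the surface.
# Round textbook values are assumed for illustration.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
T_SURFACE = 288.0    # commonly cited mean surface temperature, K
OLR = 239.0          # commonly cited outgoing longwave flux at top of atmosphere, W/m^2

surface_flux = SIGMA * T_SURFACE ** 4          # ~390 W/m^2 emitted at the surface
effective_emissivity = OLR / surface_flux      # ~0.61

print(f"surface flux: {surface_flux:.0f} W/m^2")
print(f"effective emissivity: {effective_emissivity:.2f}")
```

A stronger greenhouse effect lowers this ratio (less of the surface flux escapes), which is exactly why a lower effective emissivity goes with a warmer surface.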
 
  • #78
No one is up to the challenge?

I'm asking for examples of significant corrections (alters the result) and retractions of mainstream climate science papers (papers that did not stand as a challenge to conventional ideas of climate).

The presence of these corrections and retractions would have been a good indicator that healthy professional criticism (don't confuse with layman/politicized/crackpot skepticism) is present and errors are being caught.
 
  • #79
Responses to posting #73

Sylas:
Thank you for taking the time to comment on my posting. I'll preface my remarks by explaining that I'm an engineer (mechanical, electrical, nuclear) with a background in research and in the successful development of statistically validated models of complex systems. My background in climatology is limited to what I've learned in the past couple of months. However, I have a strong background in fluid mechanics and heat transfer plus a bit of background in atomic physics.

A couple of months ago, I decided to look into the controversy over anthropogenic global warming as a kind of civic duty. As I started this work, I assumed that climatologists knew what they were talking about when they claimed that carbon dioxide emissions were warming the Earth to a significant degree.

One of my first acts was to Google the phrases "IPCC models" and "validation." This search turned up nothing resembling a validation exercise. However, it did turn up a Web
posting ( http://www.tech-know.eu/uploads/Spinning_the_Climate.pdf ) by a physical chemist named Vincent Gray. Gray explained that he had been a reviewer of each successive IPCC report. In that capacity, he said he had flagged misuse of the words "validation" and
"prediction" and that the IPCC had responded by changing these words to "evaluation" and
"projection" in some areas of its reports but not others. "Validation" and "prediction"
are statistically meaningful words. In particular, if a model makes predictions, it can be
statistically validated. "Evaluation" and "projection" are statistically meaningless; use of
these terms by the IPCC obscures the important issue of whether the IPCC models can be
validated.

According to Gray, he urged description by the IPCC of how the IPCC's models could be
statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the
IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability.

I read the IPCC's 2007 report and found no evidence that a validation exercise had been performed. In place of validation were various comparisons of projected to measured temperatures.
It appeared that the authors might themselves have confused "evaluation" with "validation" and "projection" with "prediction," thus arriving at an illogical conclusion for presentation to policy makers. Later, I stumbled across the posting by Kevin Trenberth which I referenced in my posting to the Physics Forum. In his posting, Trenberth seems to confirm what Gray has to say.

In my posting to the Physics Forum, my intent is to open the issue of falsifiability up
for discussion. I've looked into the matter and have found that a "projection" is a
different kind of entity than a "prediction"; the latter supports statistical validation but
the former does not. If true, this finding is of fundamental importance to the debate on policy.

I doubt that there are any policy makers, journalists or political activists with interests in anthropogenic global warming that currently are aware of this finding. Climatologists seem confused by the issue.

With my preface complete, I'll try to respond to issues which you seem to raise in
posting #73.

1. It sounds as though you may be quibbling about my claim that the IPCC models are the
"basis for belief" in a CO2-temperature relationship but I'm unsure of what this quibble is.

So far as I am aware, the IPCC models are the sole vehicle by which the IPCC produces its projections of temperatures, with and without regulation of carbon dioxide emissions, for
consideration by policy makers.

2. The builders of the IPCC models employ the method of reasoning that is called
"mechanistic reductionism." Under this method, the model builder attempts to project the
phenomenology onto well known and interacting mechanisms. In some cases, understanding of the mechanism is relatively secure. In others, the mechanism is unknown but has a large
potential effect on the temperature. Radiative transport, which you mention in your posting,
is one of the mechanisms for which understanding is relatively secure. According to the IPCC
itself, understanding of the mechanism by which cloud formation affects the albedo is not
secure, yet variations in the albedo may, and according to some studies do, have a large effect on the temperature. This effect may dwarf the effect of CO2.

Mechanistic reductionism sometimes works. It is, for example, the basis for engineering
design and often produces successful models in that context. However, when we employ
mechanistic reductionism in research on complex systems, it tends to fail. Whether the
attempt at applying mechanistic reductionism to modeling the climate is successful can be
determined only by testing the validity of the proposed models. The validity of a model
cannot be tested unless this model is structured to be falsifiable.

Meteorology is relatively rich in observed statistical events; thus, meteorologists are in a
good position to validate their models and they do so. Climatology is relatively poor in
observed statistical events. I wonder if this feature of climatology has led to disinterest
among climatologists in defining the nature of climatological events and interest in
obfuscating the issue.

3. You may be unclear on the distinction between a "projection" and a "prediction" so I'll
expand upon this topic. A "projection" is a mathematical function that maps the time to the
computed global average temperature. A "prediction" is a logical proposition that states the
outcome of a statistical event. A prediction has a variable which is called its "truth-value."
The truth-value of a prediction is true or false. A projection has no truth-value.

A projection supports comparison of the projected to the measured temperature and
computation of the error. A projection does not support the falsification of a model, for the
conclusion cannot be reached that the model is falsified by the evidence. As the IPCC
models cannot be falsified, they lie outside science under Karl Popper's widely accepted
definition of "science."

To make the distinction between a "projection" and a "prediction" more concrete, I'll supply a fictional example. In the example, the projected temperature on January 1, 2020 at 00:00 hours Greenwich mean time is 16.3724 Celsius. The measured temperature is 16.3424 Celsius. The two temperatures differ. Does this observation falsify the associated model or does it not? This question cannot be answered without reference to the associated and as yet unspecified statistical event.

Climatology is not about the instantaneous values of climatological variables. It is about
the averages of these variables over time. In defining the statistical event, the period of
time over which this average is taken must be specified. Let us assume this period has been
specified and that the average measured temperature over this period on January 1, 2020 at
00:00 hours GMT is 16.3817 Celsius. Is the model falsified by this evidence or is it not?

The answer depends upon the as yet unspecified definitions of the outcomes. If each outcome is a temperature, then the model is falsified, for the predicted temperature differs from the measured one. Suppose, however, that an outcome is an element in the set of RANGES of temperatures {...16.2-16.3, 16.3-16.4, 16.4-16.5...}. In this case, the predicted outcome is 16.3-16.4. The measured outcome is 16.3-16.4. As the predicted outcome matches the measured one, the model is not falsified by this evidence.
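The fictional example above can be made concrete in a few lines. The temperatures and the 0.1-degree bin width are the illustrative numbers from this post, not real data:

```python
import math

# Toy illustration of the point above: whether a model counts as "falsified"
# depends on how the outcome is defined. The temperatures and the 0.1-degree
# bin width are the fictional numbers from the post.

def outcome_bin(temp_c, width=0.1):
    """Map a temperature to the index of its range-outcome (e.g. the 16.3-16.4 bin)."""
    return math.floor(temp_c / width)

projected = 16.3724
measured = 16.3817

# Outcome defined as an exact temperature: trivially falsified, since the
# projected and measured values differ.
falsified_as_point = (projected != measured)

# Outcome defined as a 0.1-degree range: both values land in the 16.3-16.4
# bin, so the model is not falsified by this evidence.
falsified_as_range = (outcome_bin(projected) != outcome_bin(measured))

print(falsified_as_point)   # True
print(falsified_as_range)   # False
```

The same pair of numbers falsifies the model under one outcome definition and not under the other, which is exactly the ambiguity the post is pointing at.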

In order for a model to avoid falsification, the outcomes must be ranges of temperatures
rather than temperatures. There is a further complication. When we model complex systems, there is sure to be missing information about the outcome, given the observed state of the system at the time a prediction is made. It follows that a model can do no better than to predict the probabilities of the various outcomes and the uncertainties that are associated with these probabilities. It also follows that whether the model is or is not falsified must be determined by comparison of the predicted probabilities of the various outcomes to the observed relative frequencies of the same outcomes in a base of out-of-sample validation data. So far as I've been able to determine, there is no such data.

If my understanding of this situation is correct, then we are in the middle of a social phenomenon in which scientists are stating to policy makers they have a high level of confidence in models that are not scientific models. Lay persons, including politicians and journalists, continually confuse a model that is built by scientists with a scientific model thus reaching conclusions that are logically and empirically unwarranted.
Terry
 
  • #80


Terry Oldberg said:
Sylas:
Thank you for taking the time to comment on my posting. I'll preface my remarks by explaining that I'm an engineer (mechanical, electrical, nuclear) with a background in research and in the successful development of statistically validated models of complex systems.

Welcome to physicsforums, Terry.

Engineering is, of course, rather different to science. "Validation" is a normal part of work on designed or engineered constructions; but it does not have quite so central a position in empirical science.

For example... what would it mean to "validate" a climate model? We already know that they are not complete and only give a partial picture of climate. To quote a common phrase: climate models are always wrong but often useful.

According to Gray, he urged description by the IPCC of how the IPCC's models could be statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability.

Trenberth's article that you cited previously is a good guide on this. A climate model is not a scientific theory, but a tool that is based on many scientific theories. The hypotheses used in climate science are falsifiable in the usual sense of the word, scientifically. But a model is not a hypothesis that can be judged as scientific or not using Popper's criterion. Climate models can be used to help test some theories. They have limited skill, and their skill is improving. It is a good hypothesis in the usual sense of the word that a climate model does capture some aspect of climate to a certain accuracy; and people are testing ideas like that all the time. But a complete model, able to give validated projections? No such thing.

1. It sounds as though you may be quibbling about my claim that the IPCC models are the "basis for belief" in a CO2-temperature relationship but I'm unsure of what this quibble is.

So far as I am aware, the IPCC models are the sole vehicle by which the IPCC produces its projections of temperatures, with and without regulation of carbon dioxide emissions, for consideration by policy makers.

It's not a quibble; but a fundamental point about the nature of a scientific hypothesis. The scientific hypothesis is that CO2 has a strong effect on temperature. This is a scientific hypothesis in good standing, and it is now confirmed by multiple independent lines of evidence and theory to the point of being basic background to atmospheric physics. In particular, the associated forcing is known to a good level of accuracy; better than any of the other forcings involved.

However, that is not enough to make a prediction for temperatures. There's more involved than the CO2 forcing. Using some basic physics you can indeed get into the right ball park for the magnitude of CO2 effects on temperature. I've described how this works in the thread [thread=307685]Estimating the impact of CO2 on global mean temperature[/thread]; the thread has been inactive now for a couple of months but the first few posts spell out how this estimate proceeds -- and it doesn't use climate models. The basis of this kind of calculation is from work that is all reported in the IPCC, which is a pretty comprehensive summary of the relevant science.

To get an actual prediction for future temperatures, however, you would need to know all the other forcings as well, plus all the details of climate response. And -- as Trenberth points out -- we don't have that level of detail.

This does nothing to damage the strong support for the basic fact of the warming effect of carbon dioxide. It just means there's a lot more than this required to make predictions. We need to keep these two distinct aspects in mind.

What has been calculated are rough estimates of what is possible under different emission scenarios. For a given scenario, there is still a wide uncertainty in the consequences; but these are quantified. There are strong lower bounds on the consequences as far as global temperature rise is concerned; and this does indeed constitute a conventional falsifiable hypothesis. But because it's not possible to give fully validated models of the whole climate system, the best scientific information gives you a range of possible outcomes; and there are a heap of open research questions for sorting that out and refining our understanding further.

2. The builders of the IPCC models employ the method of reasoning that is called "mechanistic reductionism." Under this method, the model builder attempts to project the phenomenology onto well known and interacting mechanisms. In some cases, understanding of the mechanism is relatively secure. In others, the mechanism is unknown but has a large potential effect on the temperature. Radiative transport, which you mention in your posting, is one of the mechanisms for which understanding is relatively secure. According to the IPCC itself, understanding of the mechanism by which cloud formation affects the albedo is not secure yet variations in the albedo may, and according to some studies do, have a large effect on the temperature. This effect may dwarf the effect of CO2.

Your question about albedo is legitimate... and something that has to be sorted out using empirical studies. A model is not going to be good enough for this, because clouds have such a strong albedo contribution, and they are one of the hardest things to model well.

Of course, people ARE studying albedo. The evidence so far strongly suggests that albedo changes are not as great as the forcing from carbon dioxide.

The situation with albedo is a bit vexed, because albedo changes can arise as a feedback from rising temperatures. A purely empirical study of changing albedo levels is not sufficient for sorting that out... for this, I think Trenberth's call for better monitoring of energy flows overall may help allow the different theories for clouds to be tested. But in any case, the albedo measures themselves, from satellites and other indirect methods, all seem to indicate that the albedo effect is not as great as the direct CO2 forcing. For more on these studies, see [post=2497270]msg #17[/post] of thread "Another climate update".

The answer depends upon the as yet unspecified definitions of the outcomes. If each outcome is a temperature, then the model is falsified, for the predicted temperature differs from the measured one. Suppose, however, that an outcome is an element in the set of RANGES of temperatures {...16.2-16.3, 16.3-16.4, 16.4-16.5...}. In this case, the predicted outcome is 16.3-16.4. The measured outcome is 16.3-16.4. As the predicted outcome matches the measured one, the model is not falsified by this evidence.

In order for a model to avoid falsification, the outcomes must be ranges of temperatures rather than temperatures. [...]

It is perfectly normal in all areas of science to give results with quantified uncertainties, and to falsify them to a certain level of confidence with observations that are statistically implausible given the probability distributions of theory. As evidence of this accumulates, the model is falsified. This is not limited to climate; it occurs in all areas of science.

I think your account is a bit too limited to really explain the nature of a scientific model and hypothesis testing.

There is also the more fundamental point that even if climate models were shown to be severely incorrect because of (for example) a pervasive systemic error in the handling of cloud albedo common to all the models, this would still not falsify the warming effects of carbon dioxide. It would just falsify the rather more comprehensive numerical model of climate that takes a lot of other things into account.

The temperature effect of carbon dioxide is twofold. There is the forcing -- which is well known to be about 5.35 W/m2 per natural log of atmospheric concentrations. And there is the climate sensitivity to that forcing, which is much less well constrained and subject to ongoing scientific work, attempting to propose and falsify different scientific hypotheses. Climate models are not the only tool for this. A common hypothesis is that the temperature response is greater than 2 degrees per doubling of concentrations. There have been some lines of evidence proposed to falsify this, which we discuss from time to time. But these are an exception, and widely considered to be flawed themselves. The vast bulk of work attempting to constrain response has rather falsified the idea of smaller sensitivities.
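The 5.35 W/m2-per-natural-log figure quoted above gives the familiar forcing for a doubling of CO2. A minimal sketch (the function name and the 280/560 ppm example concentrations are mine; any pair with a 2:1 ratio gives the same answer):

```python
import math

# Simplified CO2 radiative forcing expression, as quoted in the post:
#   dF = 5.35 * ln(C / C0)   [W/m^2]
def co2_forcing(c_ppm, c0_ppm):
    return 5.35 * math.log(c_ppm / c0_ppm)

# A doubling of concentration, whatever the starting level:
doubling_forcing = co2_forcing(560.0, 280.0)   # same as 5.35 * ln(2)
print(f"{doubling_forcing:.2f} W/m^2")
```

This comes out to about 3.71 W/m^2 per doubling; the separate, less well constrained question is how many degrees of warming the climate produces in response to that forcing.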

As is normal in active science, this is all being investigated, and new lines of evidence and argument published and considered. Too simplistic a focus on the falsification criterion for science tends to miss much of the scientific process, in which people test ideas, run test after test, and work their way towards improved understanding.

Cheers -- sylas
 
  • #81
Terry Oldberg said:
When we model complex systems, there is sure to be missing information about the outcome, given the observed state of the system at the time a prediction is made. It follows that a model can do no better than to predict the probabilities of the various outcomes and the uncertainties that are associated with these probabilities. It also follows that whether the model is or is not falsified must be determined by comparison of the predicted probabilities of the various outcomes to the observed relative frequencies of the same outcomes in a base of out-of-sample validation data.

True. However, you may do well to read Chapter 8 on Climate Models and their Evaluation in the IPCC Physical Science Basis:

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8.html

Anyhow, validation isn't the correct term for climate models; try detection and attribution.
Climate science is basically a signal-to-noise problem.

Framing anthropogenic global warming as a true/false proposition is too simplistic.
We know that CO2 causes warming. However, what we don't know is how fast. The consensus science is that it's in the range of 2 to 4.5 C/CO2 doubling with a 95% confidence. Also, what we don't know very well is how precipitation patterns will be altered.
There is a wide spread among the models in this regard and precipitation patterns may turn out to be more important to us humans than temperature.

To make a testable prediction, the initial state of the system must be understood in the first place. The oceans comprise about 90% of the thermal inertia of the Earth's climate, so it is not possible to make a prediction without modeling the oceans. Up until 2003, there was limited data on the oceans, and the ARGO system wasn't considered complete until around 2006. So, we only have a few years of reasonably good data for an initial state, and it will probably be several years before the validation-type test results you are looking for are available. At that time, the range for CO2 warming will likely be narrowed. It could be 2 to 3C/CO2 doubling, or maybe 3.5 to 4.5C/doubling, or it may even turn out to be 4 to 5C/doubling. However, it is extremely unlikely that it will be found to be less than 1.5C/doubling. The lower end of sensitivity has been examined closely and there is too much data supporting the higher sensitivities. There is clearly a skewed distribution of probabilities.
 
  • #82
Response to posting #81

Xnn: Thanks for taking the time to respond.

I've read Chapter 8.

I wonder if you'd share your understanding of how numerical values are assigned to the probabilities of the various ranges of sensitivities. For example, how is the value of 0.95 assigned to the probability that the sensitivity lies in the interval between 2 and 4.5 C/CO2 doubling?

Terry
 
  • #83
Terry;

Basically, in order to constrain climate sensitivity, a stable period of time is needed that is long enough that we can assume equilibrium was reached. The last glacial maximum (LGM) is frequently used. Models are then run with all that is known about the LGM about 1000 times. The results are that sensitivities greater than about 4.5C or less than 2C generally don’t match the data very well. Similarly, it can be demonstrated that < 1.5C and > 6C can probably be ruled out with greater confidence.
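The procedure described above can be caricatured in a few lines. Every number below is an illustrative stand-in (the 5 C "observed" LGM cooling, the 1 C error bar, and the forcing expressed as 1.5 CO2-doubling equivalents are assumptions for the sketch, not values from the actual studies):

```python
import random

random.seed(0)  # reproducible toy run

# Toy Monte Carlo in the spirit of the procedure above: sample candidate
# sensitivities, keep those whose implied Last Glacial Maximum cooling
# matches the "data". All numbers are illustrative stand-ins.

LGM_COOLING_OBS = 5.0        # assumed observed LGM cooling, C
OBS_ERROR = 1.0              # assumed observational uncertainty, C
LGM_FORCING_DOUBLINGS = 1.5  # assumed total LGM forcing, in CO2-doubling units

kept = []
for _ in range(1000):
    sensitivity = random.uniform(0.5, 8.0)            # candidate, C per doubling
    implied_cooling = sensitivity * LGM_FORCING_DOUBLINGS
    if abs(implied_cooling - LGM_COOLING_OBS) <= OBS_ERROR:
        kept.append(sensitivity)

print(f"kept {len(kept)} of 1000 samples")
print(f"surviving range: {min(kept):.2f} to {max(kept):.2f} C/doubling")
```

With these toy numbers, only sensitivities between roughly 2.7 and 4.0 C/doubling survive the cut; that is the flavor of the published constraint, not its actual values.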

Here is a link to a paper on the subject:

http://www.pik-potsdam.de/~stefan/Publications/Journals/Schneider_etal_ClimDyn_2006.pdf

Notice, this paper found a range of 1.2–4.3C/doubling. I believe the IPCC range of 2-4.5 is based on a number of different models, but like I said, until we get a better handle on the oceans, there won't be much narrowing of the range, and about 3C/doubling looks to be most likely.
 
  • #84
Terry Oldberg said:
Xnn: Thanks for taking the time to respond.

I've read Chapter 8.

I wonder if you'd share your understanding of how numerical values are assigned to the probabilities of the various ranges of sensitivities. For example, how is the value of 0.95 assigned to the probability that the sensitivity lies in the interval between 2 and 4.5 C/CO2 doubling?

Xnn gives a nice link to an interesting study of the Last Glacial Maximum. (Thanks, Xnn! That's a useful one and I've taken a copy.) This uses climate models with a range of parameters to fit the LGM, and the likelihood comes from a Monte Carlo sampling of the parameter space. This is a good example of how climate models are often used in research. They allow for a kind of virtual experiment and constrain physically credible results, without actually giving specific projections or predictions.

However, given your questions about models, there's another study you may find interesting discussed in thread [thread=334005]A low likelihood for high climate sensitivity[/thread]. This shows how Bayesian analysis is applied, and in particular looks at the dependence on the priors used. The authors, Annan and Hargreaves, have done a fair amount of study on climate sensitivity. Two major references, both discussed in that thread and directly relevant to your question, are:
  • Annan, J. D., and J. C. Hargreaves (2006), http://www.agu.org/pubs/crossref/2006/2005GL025259.shtml, in Geophys. Res. Lett., 33, L06704, doi:10.1029/2005GL025259.
  • Annan, J.D., and Hargreaves, J.C. (2009) On the generation and interpretation of probabilistic estimates of climate sensitivity, Climatic Change, online-first Oct 10 2009, doi: 10.1007/s10584-009-9715-y. (http://www.jamstec.go.jp/frcgc/research/d5/jdannan/probrevised.pdf ).

For details on how the IPCC report summarizes sensitivity estimates, chapter 9 of the 4AR is best.

Cheers -- sylas
 
Last edited by a moderator:
  • #85
The ice ages do not provide any evidence for the CO2 sensitivity. Global temperatures are thought to decline by about 5.0C, and the changes in CO2 can only explain about 1.8C of that temperature change at 3.0C per doubling.

The ice ages do not match the 100,000 year orbital cycle or the high-latitude summer solar insolation Milankovitch Cycles either.

What Albedo estimate are they using in these climate simulations? All those glaciers and sea ice and snow are reflecting much more sunlight than is currently the case. I have read a dozen of these papers and I have never seen a single Albedo estimate provided yet.

I like to download data and check the numbers against the assertions made. The only way to get the ice ages to work is for Albedo to increase to about 0.333 (this is as high as one can get), and then it has to become its own self-sustaining climate forcing that even the Milankovitch Cycles can only break about a third of the times they should.

http://img51.imageshack.us/img51/2127/last3iceages.png

...

http://img109.imageshack.us/img109/9195/milkanvsiceages.png

...

http://img27.imageshack.us/img27/3616/dustandiceages.png
 
Last edited by a moderator:
  • #86
Bill Illis said:
The ice ages do not provide any evidence for the CO2 sensitivity.

Do you have any basis for this claim in the scientific literature, other than your own personal skepticism of the various papers that use the last glacial maximum to infer bounds on sensitivity? It's usually expected that we discuss the normal progress of science, by referring to ideas that have support in the literature.

Global temperatures are thought to decline by about 5.0C, and the changes in CO2 can only explain about 1.8C of that temperature change at 3.0C per doubling.

Minimum CO2 levels were about 180ppm, and the current value (ignoring anthropogenic increases since the industrial revolution) is about 280. The number of doublings is log2(180/280) = -0.64, which gives about a -1.9C difference; so I get a little bit more cooling than -1.8, but we are in the same ballpark. To this you also add a little extra for the drop in methane and possibly nitrous oxide, giving a slightly stronger net cooling from reduced greenhouse effects at the LGM.
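The doubling arithmetic above can be checked directly (a sketch; the 3 C per doubling figure is the sensitivity value under discussion in the thread, not a derived result):

```python
import math

# CO2-only LGM cooling, assuming 3 C per doubling of CO2.
sensitivity = 3.0                            # C per CO2 doubling (thread's value)
co2_lgm, co2_preindustrial = 180.0, 280.0    # ppm
doublings = math.log2(co2_lgm / co2_preindustrial)   # about -0.64 doublings
cooling = sensitivity * doublings                    # about -1.9 C
```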

Albedo effects are generally thought to be comparable to greenhouse forcing at the LGM. An albedo of 0.33, as opposed to the current 0.3, would give a forcing of 0.03 * 342 ≈ 10 W/m2. That's enormous; about 4 times as much as the carbon dioxide forcing of around 2.5. I don't believe your numbers; do you have a reference, or is this your own estimate?

Conventionally, a forcing of -3 W/m2 would correspond to an albedo of just over 0.31. This is the estimate in the paper by Annan and Hargreaves which I cited for you previously.
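The albedo arithmetic above amounts to the following (assuming, as in the post, a mean top-of-atmosphere insolation of 342 W/m2):

```python
# Albedo changes converted to radiative forcing at the top of the atmosphere.
S_OVER_4 = 342.0                  # mean TOA insolation, W/m2

forcing_033 = 0.03 * S_OVER_4     # albedo 0.30 -> 0.33: about 10.3 W/m2
dalbedo_3W = 3.0 / S_OVER_4       # a 3 W/m2 forcing needs only about 0.009,
                                  # i.e. an albedo near 0.31
```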

Your final diagram states "There is almost no period of increased dust during a glacial expansion phase." What is your basis for this claim, and what confidence can you assign it? It conflicts with what I have seen in the literature. Dust-based forcings are considered uncertain; but there is believed to be an increase... described in Xnn's reference.

The ice ages do not match the 100,000 year orbital cycle or the high-latitude summer solar insolation Milankovitch Cycles either.

I would say it matches both, but at different times within the Quaternary.

What Albedo estimate are they using in these climate simulations? All those glaciers and sea ice and snow are reflecting much more sunlight than is currently the case. I have read a dozen of these papers and I have never seen a single Albedo estimate provided yet.

The papers I have cited suggest an albedo forcing of about -3 W/m2 at the LGM, which corresponds to an albedo of about 0.31, and possibly a little bit more with dust effects; but nowhere near 0.333.

Cheers -- sylas
 
Last edited:
  • #87
sylas said:
It's usually expected that we discuss the normal progress of science, by referring to ideas that have support in the literature.

Actually more like -1.9. Minimum CO2 levels were about 180ppm, and the current value (ignoring anthropogenic increases since the industrial revolution) is about 280. The number of doublings is log2(180/280) = -0.64, which gives about -1.9C difference.

Albedo effects are generally thought to be comparable to greenhouse forcing at the LGM. An albedo of 0.33, as opposed to the current 0.3, would give a forcing of 0.03 * 342 ≈ 10 W/m2. That's enormous; about 4 times as much as the carbon dioxide forcing of around 2.5. I don't believe your numbers; do you have a reference, or is this your own estimate?

Conventionally, a forcing of -3 W/m2 would correspond to an albedo of just over 0.31. This is the estimate in the paper by Annan and Hargreaves which I cited for you previously.

Your final diagram states "There is almost no period of increased dust during a glacial expansion phase." What is your basis for this claim, and what confidence can you assign it? It conflicts with what I have seen in the literature. Dust-based forcings are considered uncertain; but there is believed to be an increase... described in Xnn's reference.

The papers I have cited suggest an albedo forcing of about -3 W/m2 at the LGM, which corresponds to an albedo of about 0.31, and possibly a little bit more with dust effects; but nowhere near 0.333.

Cheers -- sylas

I'm just using the data from the Antarctic ice cores.

The CO2 minimum is 184.4 ppm at 22,000 years ago, and at 3.0C per doubling of CO2 [using CO2 as a proxy for all the GHGs], the temperature change is -1.8C.

I am also using the simplest form of the Climate Framework so that one can separate out the Albedo and Solar Irradiance changes from the impacts of the GHG forcing (I don't have a Climate Model).

Earth Surface Temperature = Solar Energy Effect + Greenhouse Effect

Which is equal to:

288K = 15C = 255K + 33K

And the Solar Energy Effect equals:

Temp Solar Effect = [Solar Irradiance * (1-Albedo) /4 /Stefan-Boltzmann Constant] ^0.25

255K = [1,366 * (1 -0.298) / 4 / 5.67E-08]^0.25

And the Greenhouse Effect equals:

Temp Greenhouse Effect = [33K +/- CO2 Sensitivity@CO2 Doublings]
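That framework can be written out as a short script (a sketch of the poster's own simple model, with the same numbers: solar irradiance 1366 W/m2, albedo 0.298, and a fixed 33 K greenhouse offset):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def solar_effect_temp(irradiance, albedo):
    """Effective temperature from absorbed solar flux (no greenhouse)."""
    absorbed = irradiance * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

t_solar = solar_effect_temp(1366.0, 0.298)   # about 255 K
t_surface = t_solar + 33.0                   # about 288 K = 15 C
```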

Now if one is going to use a temperature sensitivity of 0.75C/watt/metre^2 of forcing as the temperature change at the surface, then one could end up with a lower Albedo estimate for the last glacial maximum but that contradicts the estimates one gets from the Stefan-Boltzmann equations for the impact of solar forcing. The Stefan-Boltzmann equations do not predict a temperature change of 0.75C/watt/metre^2.

I am also using this framework to go farther back in time when the Faint Young Sun becomes important and Solar Irradiance was lower so I need to use a simpler framework. It still provides the same temperature change.

Albedo is an important concept that should not be buried inside a climate model. No matter what Albedo estimate one uses, 3.0C per doubling of CO2/GHGs only explains a small part of the temperature change and the Milankovitch Cycles only partly match up.

On the dust concentrations issue, I just plotted the numbers against the timelines. The dust concentrations could be explained by the dust and loess created by glaciers as they advance. When they stop advancing or melt back, all that material is left behind, and when it dries out and the wind blows (and there is a lot of wind around glacial fronts) there is increased dust.

The other explanation, of dry conditions and increased deserts in mid-latitudes, is certainly part of the picture; it's just that the numbers don't match up entirely with that explanation.

There is one other explanation: when the Antarctic glaciers stop accumulating snow and ice (there is little snowfall, and the layers accumulate years of material in very thin layers, as at glacial maximums), there is just increased dust recorded in the layers. In years when more snow is falling, there is little dust accumulation in the thicker layers.
 
Last edited:
  • #88
Bill Illis said:
I'm just using the data from the Antarctic ice cores.

The CO2 minimum is 184.4 ppm at 22,000 years ago, and at 3.0C per doubling of CO2 [using CO2 as a proxy for all the GHGs], the temperature change is -1.8C.

OK... except that you can't use CO2 as a proxy for the others. There's a small additional contribution. CO2 is the major greenhouse contribution, but CH4 and N2O add on to that.

The forcing for CO2 should be about 5.35*Ln(184.4/280)... using your numbers... which gives about -2.23 W/m2. To this you add about -0.3 for CH4 and the same for N2O. So it gets up to -2.8 W/m2 for the greenhouse forcing, which (at 3 K/2xCO2) would be close to -2.3 K total temperature contribution.
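As a quick check of those numbers (the 5.35*ln ratio expression is the standard simplified CO2 forcing formula; the -0.3 W/m2 each for CH4 and N2O are the rough values quoted above):

```python
import math

f_co2 = 5.35 * math.log(184.4 / 280.0)   # about -2.23 W/m2
f_ghg = f_co2 - 0.3 - 0.3                # add CH4 and N2O: about -2.8 W/m2
dT = f_ghg * (3.0 / 3.7)                 # at 3 K per 3.7 W/m2: about -2.3 K
```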

You can find the estimates for the other gases in chapter 6 of the IPCC 4AR WG1: the chapter on paleoclimate, page 448.

I am also using the simplest form of the Climate Framework so that one can separate out the Albedo and Solar Irradiance changes from the impacts of the GHG forcing (I don't have a Climate Model).

That's fine; you can get into the ballpark that way.

Earth Surface Temperature = Solar Energy Effect + Greenhouse Effect
Which is equal to:

288K = 15C = 255K + 33K

And the Solar Energy Effect equals:

Temp Solar Effect = [Solar Irradiance * (1-Albedo) /4 /Stefan-Boltzmann Constant] ^0.25

255K = [1,366 * (1 -0.298) / 4 / 5.67E-08]^0.25

And the Greenhouse Effect equals:

Temp Greenhouse Effect = [33K +/- CO2 Sensitivity@CO2 Doublings]

Now if one is going to use a temperature sensitivity of 0.75C/watt/metre^2 of forcing as the temperature change at the surface, then one could end up with a lower Albedo estimate for the last glacial maximum but that contradicts the estimates one gets from the Stefan-Boltzmann equations for the impact of solar forcing. The Stefan-Boltzmann equations do not predict a temperature change of 0.75C/watt/metre^2.

That's improper use of Stefan-Boltzmann in all kinds of ways. Most crucially, you can't just treat the Earth as a static radiator. The 3 C/doubling you are using already assumes a substantial feedback; and the same applies here with albedo.

The more useful quantity is the forcing. With insolation at the top of the atmosphere at close to 342 W/m2, each 0.01 change in albedo gives you a forcing of about 3.4 W/m2.

A simple no-feedback response estimate can be given as T/4Q, where Q is the emission of about 240 W/m2 and T is the surface temperature of about 288K. (See [post=2497769]msg #28[/post] for more detail.) This gives you about 0.3 K per W/m2, which is less than 0.75, as you say. But using a sensitivity of 3 K per 2xCO2, which is what you proposed previously, and noting that 2xCO2 is a forcing of about 3.7 W/m2, the real response is 3/3.7 ≈ 0.8 K per W/m2.
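A minimal check of the two response figures being contrasted above:

```python
# No-feedback (Stefan-Boltzmann) response versus the response implied by
# an assumed sensitivity of 3 K per CO2 doubling (forcing 3.7 W/m2).
T_SURFACE, Q_EMISSION = 288.0, 240.0           # K, W/m2

no_feedback = T_SURFACE / (4.0 * Q_EMISSION)   # about 0.3 K per W/m2
with_feedback = 3.0 / 3.7                      # about 0.8 K per W/m2
```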

So a 0.01 change in albedo is a temperature contribution of 3.4*0.8 ≈ 2.7 degrees of cooling; about the same as the greenhouse contribution. This is pretty standard in analyses of the LGM in the literature. Greenhouse and ice-albedo effects are of a similar magnitude.

Albedo is an important concept that should not be buried inside a climate model. No matter what Albedo estimate one uses, 3.0C per doubling of CO2/GHGs only explains a small part of the temperature change and the Milankovitch Cycles only partly match up.

Your analysis makes a simple error by taking feedback into account for greenhouse forcing but not for the albedo forcing. You need feedback to affect both.

On the dust concentrations issue, I just plotted the numbers against the timelines. The dust concentrations could be explained by the dust and loess created by glaciers as they advance. When they stop advancing or melt back, all that material is left behind, and when it dries out and the wind blows (and there is a lot of wind around glacial fronts) there is increased dust. The other explanation, of dry conditions and increased deserts in mid-latitudes, is certainly part of the picture; it's just that the numbers don't match up entirely with that explanation.

Well, from your graph it appears that there is substantial dust at the LGM, so you can't ignore it. This is pretty standard in the literature.

Your framework will give you a rough ballpark of what is obtained in the published work done more thoroughly by working researchers in paleoclimate, once you treat the climate sensitivity consistently across the various forcings.

Cheers -- sylas
 
  • #89
sylas said:
A simple no-feedback response estimate can be given as T/4Q, where Q is the emission of about 240 W/m2 and T is the surface temperature of about 288K. (See [post=2497769]msg #28[/post] for more detail.) This gives you about 0.3 K per W/m2, which is less than 0.75, as you say. But using a sensitivity of 3 K per 2xCO2, which is what you proposed previously, and noting that 2xCO2 is a forcing of about 3.7 W/m2, the real response is 3/3.7 ≈ 0.8 K per W/m2.


Cheers -- sylas


Thanks sylas, I guess I will have to give up this effort.


But I still have a problem with the inconsistency in all these numbers.

240 watts/metre^2 of solar energy results in 255.0K or 1.06K/watt/metre^2

3.7 watts/metre^2 of 2XCO2 forcing results in 3.0K or 0.8K/watt/metre^2

1.0 watt/metre^2 of extra energy in the SB equation results in 0.3K/watt/metre^2

288K at the surface is the equivalent of 390 watts/metre^2 in the SB equation (or 0.74K/watt/metre^2)

adding 1.0 extra watt/metre^2 to 390 watts/metre^2 only results in an extra 0.18K (0.18K/watt/metre^2)


I think there has been too much averaging in all these estimates and in the climate models and the incremental differentials are not being used (the equations should be logarithmic).

This is the way I look at it and I will delete this post (and probably move on to other issues) if people have a problem with it.

http://img187.imageshack.us/img187/6840/sbEarth'surfacetemp.png

You can extend this chart all the way out to 63,250,000 watts/metre^2 for the surface of the Sun and the temperature of 5,779K will be correct.

The increment for each extra watt/metre^2 at the surface is only 0.18K. The Sun needs to add about 50,000 watts/metre^2 to add 1.0K to its surface temperature.
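The marginal (per-watt) blackbody responses quoted through this post all follow from dT/dQ = T/(4Q); a small sketch using the same Stefan-Boltzmann numbers:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def sb_temp(flux):
    """Blackbody temperature radiating the given flux."""
    return (flux / SIGMA) ** 0.25

def marginal_response(flux):
    """dT/dQ for a blackbody: T / (4Q), in K per W/m2."""
    return sb_temp(flux) / (4.0 * flux)

r_390 = marginal_response(390.0)     # about 0.18 K per W/m2 (288 K surface)
r_sun = marginal_response(6.325e7)   # Sun's photosphere, about 5779 K
watts_per_K_sun = 1.0 / r_sun        # roughly 4.4e4 W/m2 per extra kelvin
```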

http://img189.imageshack.us/img189/2608/sbtempcperwatt.png
 
Last edited by a moderator:
