Is the Faint Young Sun Problem Solved by Increased Greenhouse Gases?

AI Thread Summary
Global warming is widely accepted as a scientific fact, with rising global surface temperatures and ocean temperatures, along with accelerating sea level rise due to thermal expansion and melting ice. The oceans have absorbed over 80% of the heat from global warming, which is a significant factor in climate change. Historical data shows that glaciers and ice caps have lost mass, contributing to sea level rise, while permafrost warming has been observed in various regions. The discussion also touches on the complexities of natural climate cycles and the impact of greenhouse gases, with some skepticism about the extent of human influence on climate change. Overall, the evidence strongly supports the reality of global warming and its implications for the planet.
  • #101
Xnn said:
If it weren't for all of us humans, the world would be icing up.

I think that is going much too far. The cooling contribution of natural forcings since the middle of the twentieth century has been tiny. If you go back further, natural forcings certainly contributed to the heating in the first part of the century. If you look on the scale of millennia, the Holocene is usually believed to have a long time yet to run, even if humanity were out of the picture entirely.

But we risk topic drift here.

Cheers -- sylas
 
  • #102
There is a relevant paper on the subject of an early Anthropocene.

http://earth.geology.yale.edu/~avf5/teaching/Files_pdf/Ruddiman2003.pdf
 
  • #103
Xnn said:
There is a relevant paper on the subject of an early Anthropocene.

http://earth.geology.yale.edu/~avf5/teaching/Files_pdf/Ruddiman2003.pdf

Yes; I am familiar with this notion. The major proponent is William Ruddiman, who is famous for his "early anthropocene", meaning that he believes the human impact on climate began with agriculture, thousands of years ago, and it has prevented the onset of the next ice age.

This is a minority view at present, and the weight of evidence is running against it. The debate goes on, and Ruddiman's stature ensures that it continues to be taken seriously.

Basically, Ruddiman proposed that the Holocene would be coming to an end by now, and a new ice age would have begun, if not for the climatic effects of widespread agriculture.

The alternative view, proposed by Loutre and Berger and others, is that the Holocene should be expected to be a long interglacial due to low eccentricity, and that, absent any human impact, the next ice age is still some 30 thousand years or more in the future. A similarly long interglacial occurred in stage 11, some 400 thousand years ago, when eccentricity was also low.

Pre-industrial human impact on climate is mostly regional, rather than global, according to other research.

The nice thing about the IPCC reports is that they give a pretty comprehensive survey of the literature on competing ideas like this. Chapter 6, on paleoclimate, is the relevant part. The more common view at present is for a long interglacial, discussed in section 6.4.1.8 (When Will the Current Interglacial End?). Ruddiman's early anthropocene idea, and criticisms, are discussed in section 6.5.1.2 (Why Did Holocene Atmospheric Greenhouse Gas Concentrations Vary Before the Industrial Period?).

The evidence seems to be running against Ruddiman's proposal, though it is still open. The paper you have cited was followed shortly by a "comment" paper by other researchers indicating that Ruddiman's proposal was flawed.

References (starting from the paper cited by Xnn and looking at the subsequent exchange of contrasting views.)
  • Berger, A. and Loutre, M.F. (2002) "An Exceptionally Long Interglacial Ahead?" in Science, v297 (23 Aug 2002), pp 1287-1288.
  • Ruddiman, W.F. (2003) "The Anthropocene Greenhouse Era began Thousands of Years Ago", in Climatic Change v 61 pp 261–293.
  • Claussen, M. et al. (2005) "Did Humankind Prevent a Holocene Glaciation?" (comment on Ruddiman 2003) in Climatic Change v 69, pp 409-417. (The authors' answer is "no".)
  • Ruddiman, W.F. (2007), "The early anthropogenic hypothesis: Challenges and responses", in Rev. Geophys., 45, RG4001, doi:10.1029/2006RG000207. (Acknowledges widespread criticisms and proposes answers to them.)

There are more papers by both Claussen and Ruddiman and their respective colleagues that continue to debate this matter. Ruddiman appears to acknowledge that he has not so far managed to convince the rest of the paleoclimate community, and also that there is still not a conclusive case. By and large, the long interglacial hypothesis seems to have more support, and pre-industrial human impact on climate is not considered to have prevented an ice age.

Cheers -- sylas
 
  • #104
Sylas;

Appreciate your feedback and understand that Ruddiman's original hypothesis should not be completely accepted. However, my impression is that it is accepted that there was a pre-industrial human contribution of roughly 10 ppm CO2 and 100 ppb CH4 to the atmosphere. In addition, we know that over the last 5000 years orbital changes have led to a gradual cooling of the Arctic that is expected to continue for several thousand years. So, absent human activities, we could have expected an expansion of glacial coverage in the northern hemisphere. This doesn't mean that there should have been a rapid expansion of ice conditions, but rather a continuation of what is known as the little ice age. That is, there would have been a gradual icing of the earth, especially in the northern hemisphere.

Unfortunately, I haven't been able to locate copies of the papers criticizing Ruddiman's work to study. However, I notice that an integrated analysis of solar insolation such as that performed by Huybers (figure 2E) appears to distinguish between recent solar forcing and that of 420 kyr ago.

http://www.sciencemag.org/cgi/reprint/313/5786/508.pdf
 
  • #105
sylas said: "Engineering is, of course, rather different to science. 'Validation' is a normal part of working for designed or engineered constructions; but it does not have quite so central a position in empirical science.

For example... what would it mean to 'validate' a climate model? We already know that they are not complete and only give a partial picture of climate. To quote a common phrase: climate models are always wrong but often useful."

Sorry, I can't accept that. If you want to say that it is against my religion, fine. But in this case my religion is what I learned from professors and colleagues that it takes to do statistics right. The difference can be summed up in two books: Lies, Damned Lies and Statistics by Michael Wheeler, and The Visual Display of Quantitative Information by Edward Tufte. Go read both books, then go look at the original hockey stick again and decide which author describes it...

Terry Oldberg said: "According to Gray, he urged description by the IPCC of how the IPCC's models could be statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability."

Amen Terry! For me this sums up perfectly why I count myself an anthropogenic global warming skeptic even though I think there are good reasons for reducing CO2 emissions.

Although much of my work has been in the area of programming languages and algorithms, I have a MS in Statistics. I recently made a post on a completely different topic in which I said: "Beware of EDA (exploratory data analysis) without validation. There are times when you must split the sample available to you in half, one part for EDA, the other for validation. Anything else is junk science, or at least junk statistical analysis." (Since I am quoting myself here, I felt I could make some minor edits without adding brackets. ;-)

So the problems I have with climate science as a whole are the incestuous sharing and editing of data, which makes anything like an independent test almost impossible, and the attitude that, even when falsified, climate models should be kept, the parameters juggled, new data added, and so on. No one should be surprised that climate models have no predictive power given the culture in which they have been created.

Will there be a real climate science at some point in the future? Probably. But I can't see it evolving from the current climate experts. They are not real scientists, although they do claim that on TV.

Am I being too harsh? I don't think I am being harsh enough. As a statistician who finds EDA to be fun, I often ask for the data behind papers in areas such as cosmology, superconductivity, and the solar neutrino problem, to mention a few. (I have also asked for such data in fields where I am known, but I am ignoring that.) In every area except climate science, the only problem I have is that the scientists are so glad to have someone interested who is a statistician and programmer who can help with validation, that my validation (or falsification) is not independent.* But in climate science the story is much different.

Yes, I have looked at the data which is now publicly available, and I am regularly shocked at how poor it is. What use is data from sites where measurements were only taken when it wasn't too cold, snowing, or raining? Or where normal and unavailable are represented by the same symbol (0). Climate researchers have often further processed these data to substitute proxies where original data is missing, but now any validation is impossible. At best, attempts to duplicate results will end up either accepting the researcher's assumptions, or with a much different data set. This is the "one tree in Siberia" problem.** If you have access to the original data you can run tests to determine the possible degree of measurement error and the amount of (unbiased) random error present. But when the only data available have been merged, outliers removed, and otherwise "cleaned up," you have to either accept it on religious grounds or reject it as unverifiable.

The other way, of course, to test models is to use them for prediction. Yes, I have seen models which did predict colder temperatures in 2008 and 2009--but they are based on sunspots, and cosmic rays. They are definitely not part of any IPCC consensus.

Finally, Bill Illis has linked to some (decent) NOAA data showing CO2 levels over 6000 ppm, or 20 times current levels, millions of years ago. The simple application of the Stefan-Boltzmann law would call for about 16 degrees C of radiative forcing, which was clearly not the case. I certainly understand why the simple thermodynamics doesn't work. There are windows in the CO2 absorption spectrum which remain open no matter how many doublings of CO2 partial pressures occur. It doesn't take much data analysis either to realize that the answer is that water vapor and clouds have a complex response to temperature. But the shouting down by "climate scientists" of weathermen who use statistics to develop and validate complex models of how water vapor actually works is pretty shameful. (They probably watch the predictions of those models every night on TV to see what to wear in the morning, while shouting the models down during the day.)

I'd love to share some of those weather models with you, but there is an entirely different problem involved. The weather data itself is available from NOAA to anybody, in more detail than most desktop computers can handle. But there is a tremendous competition, part of it commercial, part of it military, to come up with better long-term weather prediction models. (And when long-term is long enough, it becomes climate.) Most of this work is done within a few miles of Bedford, MA, either at commercial weather forecasting companies, or by government agencies headquartered at Hanscom AFB.

I worked at MITRE Bedford for over a decade, and as I said at the beginning, my work involvement in these projects usually involved such things as programming language support, distributed systems development, or (complexity) analysis of algorithms. So the data and models were not mine to distribute. However, the development effort in house was approximately three times as much effort on validating models as on creating new ones. When the models were turned over to the Air Force, the additional validation costs were huge. The additional work created by running multiple models side-by-side for a year or so is appalling. Back then it was basically one model run per supercomputer, with multiple runs of the same model necessary every day. Part of the pain is that the data used in the run are usually twelve to twenty-four hours stale by the time the run finishes--and it often took two supercomputers to produce one result every twelve hours. If you read the forecasts from the National Hurricane Center you will find that today (well, this year) they usually run four different models against the data from each hurricane every few hours.

The big prediction problem, incidentally, is still the rain/snow line in three dimensions. It is easy to see how snow vs. rain on the ground results in a different albedo once the storm is over, but the same line is very important in heat transport within summer thunderstorms. And validation there is just a bit harder. I've seen pictures of aircraft that came back with (fortunately only slightly) bent wings. And with lots of dents from flying into hail.


*I remember one case where I suggested non-parametric statistics as a way to tease a signal out of the background. The Wilcoxon rank sum test pulled out a very clear and statistically significant result--which after a few more days of data turned out to be from the heating system in the building. Sigh! At least I helped him track down what would have been an embarrassing systematic error, even if it did mean he had to start data collection all over again.

** Yes, I know that the tree ring data from Siberia was not based on data from just one tree. The problem is that the processed data wipes out the normal random variations that can be used to test for significance.
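The non-parametric approach described in the first footnote can be sketched in a few lines. This is a toy illustration with hypothetical data and the normal approximation for the rank-sum statistic, not the original analysis; ties are ignored, which is fine for continuous measurements:

```python
import math
import random

def rank_sum_z(a, b):
    """Wilcoxon rank-sum test via the normal approximation.

    Returns the z-score for the rank sum of sample a; |z| > 1.96 is
    significant at roughly the 5% level (two-sided).
    """
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    # Sum of the ranks (1-based) that belong to sample a.
    w = sum(rank for rank, (_, label) in enumerate(combined, start=1) if label == 0)
    n_a, n_b = len(a), len(b)
    mean = n_a * (n_a + n_b + 1) / 2.0
    var = n_a * n_b * (n_a + n_b + 1) / 12.0
    return (w - mean) / math.sqrt(var)

# Hypothetical example: a shift buried in noise of the same variance.
random.seed(42)
background = [random.gauss(0.0, 1.0) for _ in range(80)]
readings = [random.gauss(1.0, 1.0) for _ in range(80)]  # "signal" shifts the mean

z_signal = rank_sum_z(readings, background)
print(f"z = {z_signal:.2f}")  # large positive z: readings rank above background
```

The point of the footnote, of course, is that a clear statistical signal is not the same as the signal you were looking for; here the "signal" is whatever shifted the readings, which in the anecdote turned out to be the building's heating system.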
 
  • #106
Xnn said:
Appreciate your feedback and understand that Ruddiman's original hypothesis should not be completely accepted. However, my impression is that it is accepted that there was a pre-industrial human contribution of roughly 10 ppm CO2 and 100 ppb CH4 to the atmosphere. In addition, we know that over the last 5000 years orbital changes have led to a gradual cooling of the Arctic that is expected to continue for several thousand years. So, absent human activities, we could have expected an expansion of glacial coverage in the northern hemisphere. This doesn't mean that there should have been a rapid expansion of ice conditions, but rather a continuation of what is known as the little ice age. That is, there would have been a gradual icing of the earth, especially in the northern hemisphere.

Fair enough. 10 ppm CO2 is a forcing of about 0.2 W/m2, and 100 ppb CH4 is a forcing of about 0.06 W/m2, using approximation formulae for estimating forcings from a change in greenhouse gas concentrations. (The formulae are given in the IPCC TAR: http://www.grida.no/publications/other/ipcc_tar/?src=/climate/ipcc_tar/wg1/222.htm.) If climate sensitivity is sufficiently high, this could make a significant difference. Climate sensitivity is about 0.8 +/- 0.4 degrees per W/m2, so this anthropogenic pre-industrial forcing could drive as much as 0.3 degrees... not much more, I think.
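Those two forcing figures can be reproduced from the TAR's simplified expressions. A sketch, where the 280 ppm CO2 and 600 ppb CH4 pre-industrial baselines are assumed, and the small CH4/N2O overlap correction in the full TAR formula is omitted:

```python
import math

def co2_forcing(c, c0=280.0):
    # Simplified IPCC TAR expression: dF = 5.35 * ln(C/C0)  [W/m^2]
    return 5.35 * math.log(c / c0)

def ch4_forcing(m, m0):
    # Simplified square-root expression, ignoring the small CH4/N2O
    # overlap term in the full TAR formula  [W/m^2]
    return 0.036 * (math.sqrt(m) - math.sqrt(m0))

f_co2 = co2_forcing(290.0)           # +10 ppm CO2 over a 280 ppm baseline
f_ch4 = ch4_forcing(700.0, 600.0)    # +100 ppb CH4 over an assumed 600 ppb baseline

# Upper end of the quoted sensitivity range, 0.8 +/- 0.4 K per W/m^2
dT_max = 1.2 * (f_co2 + f_ch4)
print(f"CO2: {f_co2:.2f} W/m2, CH4: {f_ch4:.2f} W/m2, dT up to {dT_max:.2f} K")
```

This lands at roughly 0.2 and 0.07 W/m2, and an upper-end temperature response of about 0.3 degrees, consistent with the figures above.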

I was reacting to the phrase "the world would be icing up". If you mean a little bit of additional glaciation, then yes; but it could easily be taken as something much more than this: a new ice age. I think that is unlikely. The Holocene would more likely continue mild, with small variations and perhaps a very slow cooling trend of the order of fractions of a degree per century.

An example of Holocene cooling trend estimation is
  • Kullman, L (1993) http://www.jstor.org/pss/2997659, in Global Ecology and Biogeography Letters 2, pp 181-188.

I've only read the abstract. It derives a cooling trend of 0.12 degrees per millennium, a bit less than one hundredth of the current warming trend. The abstract indicates this value is consistent with the work of Berger on a long interglacial, cited previously. This trend is over some 8000 years, and corresponds to a drop in temperatures of about a degree, which seems about right. It would mean current temperatures are not quite as high as at the Holocene thermal maximum as yet, which makes sense to me. So an offset of 0.3 degrees from that helps.
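The trend arithmetic above is simple to check. A sketch, where the modern warming rate of ~1.3 degrees per century is an assumed round figure, not taken from the paper:

```python
holocene_trend = 0.12e-3   # Holocene cooling, K per year (0.12 K per millennium)
span_years = 8000          # length of the trend in the abstract

total_cooling = holocene_trend * span_years   # roughly 1 K over the Holocene

modern_trend = 0.013       # assumed modern warming, K per year (~1.3 K/century)
ratio = holocene_trend / modern_trend         # a bit less than one hundredth
print(f"total cooling ~{total_cooling:.2f} K, ratio to modern trend {ratio:.3f}")
```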

We came out of the little ice age mainly because of natural variations, I think; it was not part of a longer trend but more of a dip... and mostly regional rather than global.

Unfortunately, I haven't been able to locate copies of the papers criticizing Ruddiman's work to study. However, I notice that an integrated analysis of solar insolation such as that performed by Huybers (figure 2E) appears to distinguish between recent solar forcing and that of 420 kyr ago.

http://www.sciencemag.org/cgi/reprint/313/5786/508.pdf

Found a freely available preprint of that:
I'll have a look at it. It appears to look at the Early Pleistocene, rather than the current Late Pleistocene. The final sentence of the paper is:
However, the 100-ky glacial cycles of the late Pleistocene have a more complicated relationship with the forcing, and their explanation will require a better understanding of ice sheet–climate interactions.​
420,000 years ago is still part of the late Pleistocene. The paper seems to divide early and late at about 1 million years ago, according to figure 2E, which is all labeled as part of the late Pleistocene.

There's another source you might find interesting. Ruddiman has a guest post available at the realclimate blog, which is a deliberate attempt to communicate issues in climate science to a wider general audience. He discusses the contrasts between his ideas and those of Berger and indicates what it would take (in his view) to distinguish them. It is a nice informal and open ended discussion.

Cheers -- sylas
 
  • #107
Hi eachus. I don't mind if we disagree on general matters such as the quality of climate science generally, or the role of "validation" in science, and so on. I'm content with my perspective as stated previously, particularly in [post=2507536]msg #80[/post] which you have quoted, and don't see much value in expanding on it.

So I'll just comment on two simpler points that can be resolved more easily.

eachus said:
The other way, of course, to test models is to use them for prediction. Yes, I have seen models which did predict colder temperatures in 2008 and 2009--but they are based on sunspots, and cosmic rays. They are definitely not part of any IPCC consensus.

If you have an actual citation for such predictions, then this could be the basis for an interesting new thread. It would be better as a new thread; it's a bit off topic here. To my knowledge the proposals for sunspots and cosmic rays are nowhere near able to give clear predictions like that, and certainly don't explain the large short term variations. The colder temperatures in 2008 are quite uncontroversially recognized as part of the short term ENSO cycle. 2009 is heating up again, and appears to be the fifth hottest year on record. So if anyone was predicting colder temperatures, they've been falsified.

Finally, Bill Illis has linked to some (decent) NOAA data showing CO2 levels over 6000 ppm, or 20 times current levels millions of years ago. The simple application of the Stefan-Boltzman law would call for about 16 degrees C of ratiative forcing, which was clearly not the case.

I think you are mistaken. Bill's links go back some 20 million years, and CO2 levels have remained below 600ppm over that period. Is 6000 a typo?

In any case, the forcing for 600ppm over pre-industrial levels would be around 4 W/m2, and given current best estimates for sensitivity this would give a temperature rise of around 3 degrees, plus or minus 1.5, as it is close to double the pre-industrial level.

But even assuming 6000 there still seems to be an error. Using 3*Log2(C/C0) for a climate sensitivity of 3 degrees per doubling, this works out to a bit over 13 degrees gain over the pre-industrial level of 280ppm. Note that this is presuming positive feedbacks to get a climate sensitivity of about 3. If you simply ignore feedbacks and use radiative transfers alone, which seems to be what you mean by "Stefan-Boltzmann", the temperature effect is less than half this.
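Both of these estimates follow from the same pair of simplified expressions used earlier in the thread: 5.35*ln(C/C0) for forcing, and 3 degrees per CO2 doubling for the feedback-inclusive warming. A sketch of the arithmetic:

```python
import math

def co2_forcing(c, c0=280.0):
    # Simplified IPCC expression: dF = 5.35 * ln(C/C0)  [W/m^2]
    return 5.35 * math.log(c / c0)

def warming(c, c0=280.0, per_doubling=3.0):
    # Equilibrium warming assuming ~3 K per CO2 doubling (feedbacks included)
    return per_doubling * math.log2(c / c0)

f_600 = co2_forcing(600.0)    # ~4 W/m^2 for 600 ppm, as stated above
dt_600 = warming(600.0)       # ~3.3 K, within the quoted 3 +/- 1.5 range
dt_6000 = warming(6000.0)     # "a bit over 13 degrees" for 6000 ppm
print(f"{f_600:.1f} W/m2, {dt_600:.1f} K, {dt_6000:.1f} K")
```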

I don't know how you got 16, but I think there is an error somewhere in your calculation as well.

Also... the highest CO2 levels in the Cenozoic were during the "Paleocene-Eocene Thermal Maximum" (PETM) about 55 million years ago. This was a short period of greatly elevated temperatures. CO2 levels beyond a million years or so rely on estimates from carbon models and proxies, as we don't have ice cores to give direct readings. Estimates are as high as 3000ppm. You don't get higher than that until you go back more than 400 million years.

Cheers -- sylas
 
  • #108
Andre said:
The allegations of M&M have been evaluated by two commissions/panels: an ad hoc commission (Wegman) and the NAS panel of North. Both confirmed the critique of M&M, despite all attempts to cover that.
This is not correct, and I have often seen these committees mixed up. Wegman was indeed chair of the National Academy of Sciences (NAS) Committee on Applied and Theoretical Statistics, but he produced the non-peer-reviewed 'Wegman Report' for the US House Committee on Energy & Commerce.

A peer-reviewed 'Committee on Surface Temperature Reconstructions for the Past 2,000 Years' was assembled by the National Research Council (NRC) of the National Academies (Board on Atmospheric Sciences and Climate) under Gerald North, and their report acknowledged that there were statistical shortcomings in the MBH analysis, but concluded that they were small in effect.
http://books.nap.edu/openbook.php?record_id=11676&page=R1

Additionally, the American Statistical Association (ASA), in a session at the Joint Statistical Meetings, 2006 - which included Wegman - came to the same conclusion as the NRC.
http://www.amstat-online.org/sections/envr/ssenews/ENVR_9_1.pdf
 
  • #109
JohnMurphy said:
This is not correct and I have often seen these Committees mixed up.

And yet another attempt to discredit Wegman.

see here
 
  • #110
sylas said:
I think you are mistaken. Bill's links go back some 20 million years, and CO2 levels have remained below 600ppm over that period. Is 6000 a typo?

Cheers -- sylas

The data I linked to, sylas, goes back 570 million years, with the highest CO2 level being 7,069 ppm at 520 million years ago (GeoCarbIII from Berner). There are estimates going farther back, with the last solid calculated ones being 12,000 ppm at the end of the last Snowball Earth period 635 million years ago (this wouldn't have been high enough to end the Snowball, so it had to be the super-continents Rodinia/Pannotia breaking up and moving off the south pole) and 4,200 ppm at 770 million years ago. It is inferred that CO2/GHG levels were increasingly higher as we go back farther in time, or the Earth would have been a frozen iceball in all earlier time periods due to the Faint Young Sun.
 
  • #111
Bill Illis said:
The data I linked to, sylas, goes back 570 million years, with the highest CO2 level being 7,069 ppm at 520 million years ago (GeoCarbIII from Berner). There are estimates going farther back, with the last solid calculated ones being 12,000 ppm at the end of the last Snowball Earth period 635 million years ago (this wouldn't have been high enough to end the Snowball, so it had to be the super-continents Rodinia/Pannotia breaking up and moving off the south pole) and 4,200 ppm at 770 million years ago. It is inferred that CO2/GHG levels were increasingly higher as we go back farther in time, or the Earth would have been a frozen iceball in all earlier time periods due to the Faint Young Sun.

Thanks for the correction. I missed that.

These estimates are based on models of the geological carbon cycle (GeoCarbIII), and they do involve very large concentrations many hundreds of millions of years ago, which I did refer to in my post, though I failed to find them in your links. After hunting around, I found the location within the NCDC ftp site: ftp://ftp.ncdc.noaa.gov/pub/data/paleo/climate_forcing/trace_gases/phanerozoic_co2.txt. Values are for RCO2, which is the ratio to present levels (pre-industrial). The dataset reference suggests RCO2 = 1 corresponds roughly to 300ppm.

Of course, going back hundreds of millions of years means we also need to consider changes in the solar output. As you say, the Sun was dimmer but the elevated CO2 levels still meant Earth was warmer than today.

The major reference for this data set proposes the following formulae:
  • Temperature impact of RCO2 is 4*Ln(RCO2).
  • Temperature impact of faint Sun t million years ago is 7.4*t/570

At 520 Mya, with RCO2 = 26.2, we have a 13 degree gain from the greenhouse effect, and a 6.75 degree loss from the faint Sun. These are very rough estimates, but they are in the ball park for a Cambrian that is about 6 degrees warmer than now, which is about right.
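As a check on the arithmetic, the two GeoCarb-style scalings quoted above can be evaluated directly:

```python
import math

def greenhouse_gain(rco2):
    # Temperature impact of RCO2 (ratio to ~300 ppm present): 4 * ln(RCO2)  [K]
    return 4.0 * math.log(rco2)

def faint_sun_loss(t_mya):
    # Temperature impact of the faint Sun t Myr ago: 7.4 * t / 570  [K]
    return 7.4 * t_mya / 570.0

gain = greenhouse_gain(26.2)    # ~13 K from elevated Cambrian CO2
loss = faint_sun_loss(520.0)    # ~6.75 K from the dimmer Cambrian Sun
net = gain - loss               # ~6 K warmer than present, as stated
print(f"gain {gain:.1f} K, loss {loss:.2f} K, net {net:.1f} K")
```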

Going back to the snowball/slushball is very interesting, and I know you have done some work on this. Alas, we are going off topic.

Thanks for setting me straight on this.

Cheers -- sylas
 
  • #112
sylas said:
  • Temperature impact of faint Sun t million years ago is 7.4*t/570


Cheers -- sylas

I think it is better to think of it as the Solar Irradiance reaching the Earth was 30% lower 4.55 billion years ago and it has increased in very close to a straight line over time.

So, Solar Irradiance 520 million years ago was 1366 * (1 - 0.3*520/4550) = 1319 W/m2, which corresponds to an effective temperature of about 252.8 K,

or a change of about -2.2 K at 520 million years ago from the lower solar irradiance.

Go back to 4.55 billion years ago and the Te was 233K.

This comes from D. Gough 1981 and Kasting 1988, and is outlined a little better in a more recent paper by Kasting.

http://geosc.psu.edu/~kasting/PersonalPage/Pdf/annurev_03.pdf
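These numbers follow from the Stefan-Boltzmann expression for the planetary effective temperature. A sketch, assuming a present-day solar constant of 1366 W/m2, a Bond albedo of 0.3, and the linear luminosity ramp (30% dimmer 4.55 billion years ago):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_NOW = 1366.0           # assumed present-day solar constant, W/m^2
ALBEDO = 0.3             # assumed Bond albedo, held constant

def solar_constant(t_mya):
    # Linear ramp: 30% dimmer 4550 Myr ago, rising to S_NOW today
    return S_NOW * (1.0 - 0.3 * t_mya / 4550.0)

def effective_temp(s):
    # Te = (S * (1 - A) / (4 * sigma))^(1/4)
    return (s * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25

te_now = effective_temp(solar_constant(0.0))      # ~255 K today
te_520 = effective_temp(solar_constant(520.0))    # ~252.6 K, about 2.2 K lower
te_4550 = effective_temp(solar_constant(4550.0))  # ~233 K for the faint young Sun
print(f"{te_now:.1f} K, {te_520:.1f} K, {te_4550:.1f} K")
```

Note these are effective (radiating) temperatures with no greenhouse effect and no feedbacks; the surface temperature question is taken up in the posts that follow.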
 
  • #113
Bill Illis said:
I think it is better to think of it as the Solar Irradiance reaching the Earth was 30% lower 4.55 billion years ago and it has increased in very close to a straight line over time.

So, Solar Irradiance 520 million years ago was 1366 * (1 - 0.3*520/4550) = 1319 W/m2, which corresponds to an effective temperature of about 252.8 K,

or a change of about -2.2 K at 520 million years ago from the lower solar irradiance.

Go back to 4.55 billion years ago and the Te was 233K.

This comes from D. Gough 1981 and Kasting 1988, and is outlined a little better in a more recent paper by Kasting.

http://geosc.psu.edu/~kasting/PersonalPage/Pdf/annurev_03.pdf


Ahhh! Thanks for the link. I had a copy of this about 2 years ago and lost it; my memory wasn't good enough to pull it up in a search.
 
  • #114
Bill Illis said:
I think it is better to think of it as the Solar Irradiance reaching the Earth was 30% lower 4.55 billion years ago and it has increased in very close to a straight line over time.

So, Solar Irradiance 520 million years ago was 1366 * (1 - 0.3*520/4550) = 1319 W/m2, which corresponds to an effective temperature of about 252.8 K,

or a change of about -2.2 K at 520 million years ago from the lower solar irradiance.

Go back to 4.55 billion years ago and the Te was 233K.

This comes from D. Gough 1981 and Kasting 1988, and is outlined a little better in a more recent paper by Kasting.

http://geosc.psu.edu/~kasting/PersonalPage/Pdf/annurev_03.pdf

Thanks for the link. The calculation you present is better founded on the power law relating temperature to energy; but over the time scales of GeoCarb, this is not actually going to make much difference. A linear approximation works okay.

The real problem is that this calculation ignores all feedback effects, which are actually pretty crucial to the final temperature. Going back 520 Mya, the fractional decrease in insolation is 0.3*520/4550 = 0.0343. Insolation, after allowing for albedo, is now about 240 W/m2. The reduction is 0.0343*240 = 8.3 W/m2. Assuming a climate sensitivity of about 0.75 K per W/m2, you get a temperature change at the surface of 6.2 K.
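A sketch of that feedback-inclusive estimate, step by step:

```python
frac = 0.3 * 520.0 / 4550.0      # fractional insolation decrease at 520 Mya, ~3.4%
absorbed_now = 240.0             # post-albedo insolation today, W/m^2

d_forcing = frac * absorbed_now  # ~8.3 W/m^2 reduction in absorbed sunlight
sensitivity = 0.75               # K per W/m^2, feedbacks included
d_temp = sensitivity * d_forcing # ~6.2 K surface cooling from the dimmer Sun
print(f"forcing {d_forcing:.1f} W/m2, surface cooling {d_temp:.1f} K")
```

The contrast with the bare Stefan-Boltzmann estimate of about 2.2 K is the whole point: the feedbacks roughly triple the surface response.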

This is about what we also get from the numbers in the reference for the GeoCarb III dataset.

The two papers are consistent with each other, and the factors used for temperature difference over time due to the dimmer Sun take into account both the numbers you have presented for how solar irradiance changes, and also the climate feedbacks that affect how the planet responds to that change, which is not simply given by Stefan-Boltzmann.

Your latest reference spells this out explicitly. On page 442 (my bolding):
If one reduces the value of S by 30% in (1), holding A and ΔTg constant for simplicity, one finds that Te drops to 233 K and Ts = 266 K, well below the freezing point of water. If the calculation is repeated with a climate model that includes the positive feedback loop involving water vapor, the problem becomes even more severe. The dashed curves in Figure 4 show Te and Ts calculated using a one-dimensional, radiative-convective climate model, assuming constant CO2 concentrations and fixed relative humidity (Kasting, Toon & Pollack 1988). The results are remarkably similar to those predicted earlier by Sagan & Mullen: Ts drops below the freezing point of water prior to ~2 Ga. Combined with the snow/ice-albedo feedback loop, this temperature drop would almost certainly lead to a globally glaciated Earth. However, geologic evidence tells us that liquid water and life were both present as far back as 3.5 Ga and maybe longer. The oldest zircons, zirconium silicate minerals that must have formed in liquid water, are dated at more than 4.3 Ga and may indicate the presence of an ocean at that time (Catling & Kasting 2002, Mojzsis, Harrison & Pidgeon 2001, Wilde et al. 2001).

How can the faint young Sun problem be solved? A large decrease in cloudiness would do it (Rossow et al. 1982), but this seems unlikely for reasons mentioned in Section 3.1. Instead, the answer probably lies in increased concentrations of greenhouse gases. Both CO2 and CH4 are plausible candidates. ...​

The numbers used in the GeoCARB III reference, which give a temperature difference of about 6 degrees 520 Mya, correspond to estimates that take water vapour and other factors into account, just as Kasting and Catling describe here.

Cheers -- sylas
 
