#### FireBones

I was doing some research today and realized a somewhat silly fact. It appears that Earth's atmosphere actually lowers the total amount of radiation Earth's surface absorbs, even though it raises the mean temperature.

This should not sound impossible: the atmosphere provides a second source of thermal energy for the surface [by re-radiation], but it also removes a great deal of thermal energy [by reflecting some radiation and absorbing some before it ever reaches the surface, and also by allowing evaporation and convection].

Here is why I'm thinking one can say the net effect is actually to cause less energy to reach earth.

The amount of energy actually carried off by evaporation and convection is rather small (compared to radiative heat loss). For the most part, the medium-mediated heat transfer caused by the atmosphere serves to spread heat around [making the night side of the Earth much warmer than it would be without such an atmosphere] rather than to offload the heat completely.

Ignoring that heat loss, the earth essentially has one and only one way to lose its heat: radiation. The temperature of the surface is an indication of how much heat energy the earth is receiving because, for reasonably stable temperatures, the amount of radiation the earth absorbs must be approximately equal to the amount it gives off.

This leads to the [not too profound] statement that the temperature of the earth is indicative of how much heat it receives. The more heat it receives, the hotter it gets before the rate of radiating away energy equals the rate of absorption of thermal energy.

Naively, this would suggest that the Earth's atmosphere causes it to receive more total heat than it would with no atmosphere: compare the average temperature of the Earth to the average temperature of the Moon, and note that the Earth, on average, is warmer than the Moon.

The issue, though, is that what we are interested in is the RATE OF ENERGY TRANSFER, which is based on the fourth power of the temperature, not the temperature itself. The earth may have a higher mean temperature, but the mean value of T^4 is almost certainly higher on the Moon due to its massive temperature fluctuations.

This would suggest that the atmosphere actually does not increase the amount of energy the surface of the earth receives via radiation but rather its effect of redistributing energy conspires with our poor choice of metric to make it appear that it does.


#### sylas

The standard reference for energy flows is
• Trenberth, K.E., Fasullo, J.T., and Kiehl, J. (2009) "Earth's Global Energy Budget", Bulletin of the AMS, Vol 90, pp 311-323. http://ams.allenpress.com/archive/1520-0477/90/3/pdf/i1520-0477-90-3-311.pdf [Broken]

The Earth's surface, with temperatures around 18 C or so, radiates something approaching 400 W/m2 upwards. That's substantially more than we get from the Sun directly, and on top of that there is additional energy flowing upwards from convection and the latent heat of evaporation. This has to balance with energy coming down. It turns out that the largest single source of energy flowing down to the surface is atmospheric backradiation. This is because the atmosphere contains gases -- mostly water vapor, also CO2 and various others -- which can interact with thermal radiation, and hence the atmosphere emits this radiation as well, both upward and downward.

The atmospheric backradiation has been known for a long time. It was first explained in the mid nineteenth century, figured prominently in the first energy balance work around 1915, and then was measured directly in 1954. It's an important feature of Earth's climate system, quite apart from any concerns about changes or warming (which are not topics physicsforums currently wants to engage).

The 1954 direct measurements of backradiation (night and day) differ only slightly from more modern values, as seen in Trenberth et al (2009). The measured rate of energy flux at the Moon's surface is significantly less than at Earth's. The two important factors for surface temperature and surface radiation flux (via the Stefan-Boltzmann relation) are albedo (how much solar energy is absorbed) and the atmospheric effect (how efficiently radiation from the surface gets out into space). The Moon has a lower albedo (absorbs more energy), but because there is no atmosphere, it radiates directly to space.

The Earth's atmosphere does indeed mean heat doesn't get away from the surface as easily, and this is what makes the Earth livable. Without this effect, Earth's surface temperature would be about 33 degrees Celsius cooler, on average.

The mentors will let us know if this discussion strays into improper territory, but these details are actually independent of "climate change". It's rather about the straight thermodynamics of how having an atmosphere makes a difference for the conditions at the surface.

Cheers -- sylas


#### FireBones

The standard reference for energy flows is

The Earth's atmosphere does indeed mean heat doesn't get away from the surface as easily, and this is what makes the Earth livable. Without this effect, Earth's surface temperature would be about 33 degrees Celsius cooler, on average.

Cheers -- sylas
Thanks for the references, but I think you may be missing my point a bit.

You are correct in claiming the atmosphere causes the temperature on the earth to be higher "on average," but the question is whether that is due to its receiving more total energy [The sun plus radiation downward from the atmosphere] compared to what it would do without an atmosphere [the sun only] OR is it due to the fact that a planet with moderated temperature will ALWAYS be hotter "on average" than a planet that radiates the same amount of energy outward but has a skewed temperature.

Consider the Moon. Its mean temperature is, say, 250 K, and it radiates, say, X watts in total. Now, if the Moon (magically) had moderated temperatures instead of the highly variable ones it actually has, its temperature would have to be significantly higher to radiate that same amount of energy out into space.

So, my question is whether it is really fair to say that the atmosphere adds to the total amount of energy the Earth receives, or whether it would be fairer to say that the atmosphere raises the average temperature by making the Earth a less efficient radiator. Note that I do not mean this as "because it is harder for the energy to make it out of the atmosphere." The atmosphere does not appreciably affect the amount the Earth's surface radiates (it only affects how much of that radiation escapes the Earth-atmosphere system). I'm saying the atmosphere makes the Earth a less efficient radiator because it moderates the temperature, so the surface sends out less radiation than it would if the day side were, say, 40 degrees hotter and the night side 40 degrees colder.

And you are correct, this has nothing to do with climate change, so I would hope it stays here as a valid topic [obviously, climate change, unlike radiative heat outlay, is based on temperature, not temperature to the fourth power].

#### FireBones

I followed the useful link that Sylas gave, and it answered my question.

On the other hand, it has engendered another one: how much of the temperature increase is due to back-radiation and how much of it is due to moderation of temperatures?

Luckily, I can figure that out myself.

Thanks, Sylas!

#### FireBones

I modeled this yesterday [or, rather, this morning] and verified the conclusion I had suggested earlier: most of the warming of Earth's surface due to the atmosphere comes not from the increased energy due to back-radiation but from temperature moderation lowering the effectiveness of the Earth as a blackbody radiator.

Just to reiterate, by "lowering the effectiveness of Earth as a blackbody radiator" I do not mean "because some of the outgoing radiation is trapped." Radiation escaping earth's surface counts as radiation regardless of whether that radiation is subsequently absorbed by the atmosphere.

Rather, I'm referring to the fact that the effectiveness of a blackbody radiator is not linearly dependent on temperature. A radiator made of a hot side (400 K) and a cold side (200 K) radiates much more energy than a radiator at the middle temperature (300 K).

Just to give an impression how much more energy, compare 2^4 + 4^4 [16 + 256 = 272] to 3^4 + 3^4 [81+81 = 162]...so the hot-and-cold radiator radiates 68% more energy than the warm one, though they have the same "average" temperature.
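The same comparison can be checked with the actual temperatures and the Stefan-Boltzmann law. A minimal sketch, assuming two idealized blackbody patches of equal area:

```python
# Two equal-area blackbody patches at 200 K and 400 K versus two patches
# at the common mean of 300 K: same average temperature, very different
# average radiated flux (Stefan-Boltzmann law).

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def mean_flux(temps_k):
    """Average radiated flux in W/m^2 over equal-area patches."""
    return sum(SIGMA * t ** 4 for t in temps_k) / len(temps_k)

extreme = mean_flux([200.0, 400.0])  # hot-and-cold radiator
uniform = mean_flux([300.0, 300.0])  # moderated radiator, same mean T

print(extreme / uniform)  # ~1.68: the hot-and-cold radiator emits ~68% more
```

The sigma constant cancels in the ratio, which is why the toy 2/3/4 arithmetic above gives the same 68% answer.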

You can estimate this effect by comparing:

A. The average surface temperature of the moon [~ -36 degrees Celsius]
B. The blackbody temperature of the Earth [5.5 degrees Celsius]
C. The blackbody temperature of the green-house affected Earth [16 degrees Celsius]

The gap between B and C is roughly the increase in temperature due to the fact that the Earth receives more radiation [due to back radiation mitigated by convection/latent heat flux]. The gap between A and B is approximately the difference due to moderation of temperatures.

(Actually, since regolith has a higher albedo than Earth's surface, the gap between A and B slightly overstates this difference. The difference in Bond albedo is about 7%, which suggests a difference in Kelvin temperature of about 1.8 percent, or about 5 degrees as a correction.)
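Since effective temperature scales as (1 - albedo)^(1/4), the size of an albedo correction like this can be sketched directly. The two albedo values below are illustrative stand-ins for two bodies whose Bond albedos differ by 7 percentage points, not measured figures:

```python
def temp_ratio(albedo_a, albedo_b):
    """Ratio of effective temperatures for two bodies differing only in albedo."""
    return ((1.0 - albedo_a) / (1.0 - albedo_b)) ** 0.25

# Hypothetical 7-point gap in Bond albedo (values chosen for illustration)
ratio = temp_ratio(0.12, 0.19)
shift_percent = (ratio - 1.0) * 100.0  # ~2%, i.e. roughly 5 K on a ~270 K baseline
print(shift_percent)
```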

You can also see the same thing by comparing the surface temperature of the Moon to the temperature of the regolith underneath. The mean temperature of the Moon's regolith one meter down is significantly hotter than the mean temperature of the surface [by 35-40 degrees Celsius], showing the reverse effect: the Moon's surface [varying widely through the lunar day] is colder on average because its variable temperature makes it a more efficient radiator.

#### sylas

Sorry I took so long to get back to this. I have been on the road. The basic thermodynamics of planets, and comparisons between planets, is something in which I have a particular interest. In this post I'm still looking only at very basic broad details of the thermodynamics of an atmosphere and surface, without worrying about changes over time.

One thing in particular was new to me in this. Your information on the difference between surface and subsurface means on the Moon was not something of which I had been aware, and I have spent a bit of time to check out more about the details of how it is measured and what it implies. Thanks very much for that!

I've given a couple of corrections to your analysis in what follows, by way of returning a favour... I hope it may be of some help. This gets a bit long; but I've found it interesting and worth recording.

I modeled this yesterday [or, rather, this morning] and verified the conclusion I had suggested earlier: most of the warming of Earth's surface due to the atmosphere comes not from the increased energy due to back-radiation but from temperature moderation lowering the effectiveness of the Earth as a blackbody radiator.
The warming effect of Earth's atmosphere is mainly a greenhouse effect, which can be considered in a number of different but physically equivalent ways, backradiation being one of them and lowered radiating effectiveness another. These are two ways of looking at the same thing. So whatever you want to call it, it is directly associated with the backradiation; I don't think the distinction you draw above is valid. The surface radiates a lot of energy due to its temperature, and so it has to receive a matching energy input. The largest single contributing energy flux to balance the surface thermal emission is atmospheric backradiation. Direct measurement of backradiation is one simple way to identify the existence of a strong greenhouse effect, without which our planet would be completely frozen over.

The atmosphere also helps transport heat horizontally around the globe to moderate extremes, and this is a different effect. It is not nearly enough to account for the balmy surface we enjoy. I'll show some of the maths below, when we compare with the Moon.

Just to reiterate, by "lowering the effectiveness of Earth as a blackbody radiator" I do not mean "because some of the outgoing radiation is trapped." Radiation escaping earth's surface counts as radiation regardless of whether that radiation is subsequently absorbed by the atmosphere.
You are right that "trapping" is a misleading term, in the sense that the energy still has to get out. But the atmosphere does block a substantial amount of thermal radiation from the surface. Those bands still get radiated into space... but from the atmosphere, not from the surface; and this IS very important.

One essential aspect of the atmosphere is the lapse rate... the decrease in temperature with altitude. Without this, you'd not have any greenhouse effect at all. The role of certain gases is that they are opaque, so they absorb and emit thermal radiation. Rather than going straight from the surface into space (as it does on the Moon, or in the band of the spectrum where Earth's atmosphere has an "infrared window"), this radiation travels only a short distance before being absorbed. The frequencies that are absorbed are ALSO frequencies at which the atmosphere can emit (Kirchhoff's law). But note well (and this is crucial): the atmosphere is cooler, because it has a lapse rate. The lapse rate is independent of how radiation is absorbed or transmitted; it is based on the thermodynamics of how gases at different temperatures and pressures rise and fall... vertical convection. Convection drives the temperature profile towards a thermodynamically optimal fall of temperature with height: the lapse rate corresponds to the temperature change as rising air expands adiabatically.

The bottom line is, in those frequencies where the atmosphere interacts strongly with radiation, what gets out into space is being emitted not from the surface, but from high in the atmosphere, where it is much cooler. A cool body emits less than a warm one. So for a planet with an atmosphere with water or CO2 or other such gases, the "effective radiating temperature" will be cooler than the surface temperature. The effective radiating temperature is what matches the solar energy input; and the surface temperature is warmer than this, because of the greenhouse effect.

You can estimate this effect by comparing:

A. The average surface temperature of the moon [~ -36 degrees Celsius]
B. The blackbody temperature of the Earth [5.5 degrees Celsius]
C. The blackbody temperature of the green-house affected Earth [16 degrees Celsius]

The gap between B and C is roughly the increase in temperature due to the fact that the Earth receives more radiation [due to back radiation mitigated by convection/latent heat flux]. The gap between A and B is approximately the difference due to moderation of temperatures.
I don't have a good reference for (A), the mean surface temperature of the Moon. The -36 °C you mention sounds reasonable; this is about 237 K. Do you have a reference for this number?

Your number for (B) is incorrect. The effective blackbody temperature for the Earth is about 255 K, or -18 °C. The Moon has a blackbody temperature of about 270 K, or -3 °C. These numbers are pretty standard. For example, check out the Moon Fact Sheet at NASA. It compares Moon and Earth, and gives blackbody temperatures of 270.7 K and 254.3 K respectively. I'll show how to calculate these below.

I presume by (C), the blackbody temperature of the greenhouse-affected Earth, you mean an effective radiating temperature for the surface of the Earth. This is indeed around 15 or 16 °C. But the Earth as a planet radiates into space with an effective temperature of -18 °C, because so much radiation into space actually comes from high in the atmosphere, as I have explained above. The greenhouse effect accounts for an extra 33 degrees at Earth's surface. Earth's atmosphere can do this only because it has gases that are opaque to substantial bands of thermal radiation.

(1) Calculating the effective radiating temperature of a planet (as seen from space)

The difference between effective radiating temperatures of Moon and Earth is due to albedo. The Moon absorbs more of the Sun's energy, and hence it has a higher effective radiating temperature, by about 15 degrees.

Here's what you should do. The solar constant is roughly 1366 W/m2 out at the distance of the Earth/Moon from the Sun. The surface area of a sphere is 4 times the area of a circle with the same radius, so averaging the intercepted energy over the whole sphere gives 0.25 of the above, or 341.5 W/m2, for both Moon and Earth.

Now the albedo of the Earth is about 0.30, and for the Moon it is about 0.12. This means that the energy being absorbed by the Earth is about 239 W/m2, and for the Moon it is about 300 W/m2.

This energy Q balances with what is radiated, and so the "blackbody temperature" is given by the Stefan-Boltzmann law, using Q = σT^4. The blackbody temperatures, also called the effective radiating temperatures, are as follows:
• 255 K or -18 °C for the Earth
• 270 K or -3 °C for the Moon
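These two values can be reproduced directly from the fluxes above. A minimal sketch of the energy-balance calculation, using the solar constant and albedos quoted in the text:

```python
# Energy balance: absorbed flux Q = (S/4) * (1 - albedo) = sigma * T^4,
# solved for T.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR = 1366.0     # solar constant at Earth/Moon distance, W/m^2

def effective_temp(albedo):
    """Blackbody (effective radiating) temperature for a given Bond albedo."""
    absorbed = (SOLAR / 4.0) * (1.0 - albedo)  # sphere-averaged absorbed flux
    return (absorbed / SIGMA) ** 0.25          # invert Q = sigma * T^4

print(round(effective_temp(0.30)))  # Earth, albedo 0.30 -> ~255 K
print(round(effective_temp(0.12)))  # Moon,  albedo 0.12 -> ~270 K
```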

The "mean" temperature is less than the radiating temperature, as you have noted. This is a special case of Hölder's inequality. Because radiation goes as the fourth power of temperature, a varying temperature gives more radiated energy, and the effective radiating temperature, or blackbody temperature, is therefore greater than the mean temperature.

If the only effect of the atmosphere were to damp out variations in temperature and share heat between different parts of the globe, the most this could do would be to raise mean temperatures to -18 °C! In fact, very little of Earth's surface ever gets that cold, and the surface is on average about 33 degrees warmer than the planetary effective radiating temperature. Our balmy climate is due to the greenhouse effect.

(2) Mean temperature and effective radiating temperature

The Moon has very little transport of heat around the surface. This means it has much lower and higher extremes, with a mean temperature that is significantly less than the effective radiating temperature of -3 °C. There is no atmosphere and no greenhouse effect, so the surface temperature directly determines the effective temperature into space.

In any case, under the midday sun, the Moon's equator receives four times as much energy as the average over the whole sphere, and this means the temperature is greater by a factor of 4^0.25, or about 1.414.

The midday equator temperature on the Moon should thus be about 270 × 1.414, or about 382 K. This is 109 °C. Measurements are pretty close to that. Maximums are a little bit higher, because some parts of the Moon are darker and have even lower albedo. These regions heat up the most.
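As a quick check of that fourth-root scaling, a sketch using the Moon's effective temperature from above:

```python
# The subsolar point intercepts 4x the sphere-averaged flux, so its
# equilibrium temperature exceeds the effective temperature by 4**0.25.

T_EFF_MOON = 270.0                 # Moon's effective radiating temperature, K

midday_k = T_EFF_MOON * 4 ** 0.25  # ~382 K
midday_c = midday_k - 273.15       # ~109 C
print(round(midday_k), round(midday_c))
```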

At night, temperatures on the Moon plunge. Typical values are around -153 °C, or 120 K. These numbers can also be confirmed at http://education.ksc.nasa.gov/esmdspacegrant/LunarRegolithExcavatorCourse/Chapter5.htm#SurfaceTemperature [Broken] (at NASA education sites).

(Actually, since regolith has a higher albedo than earth, the gap between A and B slightly over-states this difference. The difference in Bond Albedo is about 7% so that suggests a difference in Kelvin temperature of about 1.8 percent, or about 5 degrees as a correction.)
The difference in the effective radiating temperature due to albedo is actually about 15 degrees, as shown in the calculations above. You can find this in a number of fairly standard references, such as the Moon Fact Sheet at NASA cited previously.

(3) Subsurface vs surface temperatures

You can also see the same thing by comparing the surface temperature of the Moon to the temperature of the regolith underneath. The mean temperature of the moon's regolith one meter in is significantly hotter than the mean temperature of the surface [by 35-40 degrees celsius], showing the reverse effect that the moon's surface [varying widely through the lunar day] is colder on average because this variable temperature makes it a more efficient radiator.
Actually, the cause of higher subsurface temperatures is unrelated to the atmospheric greenhouse effect. It is because of a temperature dependent conductivity.

The temperature on the Moon at shallow depths is stable because the regolith, or surface material, is such a good insulator. The surface temperature is governed by blackbody radiation, as calculated above. I was initially very surprised by your mention of the 35-40 K difference with subsurface temperatures, and spent some time checking the literature on this. I would have expected the subsurface temperature to be very close to the mean surface temperature, because heat from below the surface is mostly lost by conduction, not radiation.

However, it turns out that the conduction of the upper one or two centimeters of regolith is dependent on temperature, conducting slightly better at higher temperatures, although it is still a good insulator in all cases. This means that during the day, heat from the surface penetrates into the upper couple of centimeters more effectively than the heat escapes up to the surface at night, and this means that the mean temperature at 2cm depth is greater than the mean at the surface. As you go deeper, the temperature variation reduces rapidly because the conduction is so slow, and hence also the conductivity becomes stable, and the mean temperatures increase very slowly with further depth, due simply to the trickle of internal heat from the center to the surface.

Here's a good picture of what goes on, from Vasavada et al (1999) "Near-Surface Temperatures on Mercury and the Moon and the Stability of Polar Ice Deposits", in Icarus vol 141, pp 179–193 (http://dx.doi.org/10.1006/icar.1999.6175 [Broken]; unfortunately full text needs a subscription.)

This shows the minimum, maximum and mean temperatures on the Moon with depth, based on a computational model taking into account the thermodynamic properties of surface materials, and showing the kink in mean temperature right at the top 2 cm. The physical cause of this difference between surface and subsurface temperatures -- temperature-dependent conduction -- is physically a different thing from the atmospheric greenhouse effect that makes Earth's surface so much warmer than the effective radiating temperature.

Another good basic reference is "Lunar sourcebook: a user's guide to the moon" by Grant Heiken, David Vaniman, Bevan M. French, (Cambridge Uni Press 1991). This is widely used, and it says of the information from Apollo missions (on page 38):
3.6.2 Conclusions.
The upper 1 to 2 cm of lunar regolith must have extremely low thermal conductivities (1.5 x 10^-5 W/cm^2), with conductivity increasing 5 to 7 times at a depth of 2 cm. At the Apollo sites, mean temperatures 35 cm below the lunar surface are 40 to 45 K above those at the surface. [...] It is noteworthy that an insulating blanket of only about 30 cm of regolith is sufficient to dampen out the ~280 K lunar surface temperature fluctuation to a +/- 3 K variation. [...]
The physical reason is given on page 34:
Langseth and Keihm (1977) describe a large difference in mean temperature (i.e., the temperature averaged over a complete day-night cycle) just below the lunar surface. At the Apollo 15 site, the mean temperature at a depth of 35 cm is 45 K higher than that of the surface; at the Apollo 17 site, the difference is 40K. The increase in the mean temperature is related mostly to the temperature dependence of thermal conductivity of the topmost 1 to 2cm of lunar soil.​
The measurements by Apollo 15 and 17 were 252 K and 255 K respectively at 1 m depth, which implies surface means of roughly 207 and 215 K respectively. This is rather less than the value for a global mean you suggest, but that might just be because of the particular landing sites; I am not sure.

---

Thanks for a most interesting topic!

Cheers -- sylas


#### mgb_phys

Homework Helper
There's no paradox here - it's exactly what a greenhouse does.
Putting glass over your plants reduces the amount of light reaching them (reflection and absorption losses) but raises the temperature by trapping warm air.

#### FireBones

I've given a couple of corrections to your analysis in what follows, by way of returning a favour...
Thanks for taking the time to write this, but I really think you are missing my point regarding the nuances of the S-B law. I'm going to try to make that more clear in my replies. I apologize if some of them sound bitter...this medium is not a good one sometimes ;)

The warming effect of Earth's atmosphere is mainly a greenhouse effect
This is what I'm disputing...Most of the warming due to our atmosphere has nothing to do with the greenhouse effect.

which can be considered in a number of different but physically equivalent ways; backradiation being one of them, and lowering radiating effectiveness as another.
The "lowering radiating effectiveness" I refer to is not the one you are referring to. The lowering in the effectiveness of Earth's radiation I refer to would occur even if none of the atmospheric gases absorbed IR energy. This proves that the lowering of radiating effectiveness I refer to is not equivalent to backradiation.

Direct measurement of backradiation is one simple way to identify the existence of a strong greenhouse effect, without which our planet would be completely frozen over.
That's not true. If our atmosphere were transparent to radiation (going both ways) it would not be frozen over. The type of claim you are making mixes two different models: it half-transforms the molecules in the atmosphere (taking out the absorption aspect while leaving the reflection aspect) and uses that number. It is much more reasonable to either talk about:

A. "What the temperature would be if the earth had no atmosphere"
B. "What the temperature would be if the earth had an atmosphere, but the atmosphere lacked gases that interfered with radiation."

In neither of these cases would the Earth be frozen over. The Earth comes out frozen only when you create a physically impossible version, keeping the atmosphere as is while pretending its molecules violate certain laws of physics, without factoring in any other consequences of those violations.

The atmosphere also helps transport heat horizontally around the globe to moderate extremes, and this is a different effect. It is not nearly enough to account for the balmy surface we enjoy. I'll show some of the maths below, when we compare with the Moon.
Actually, the warming due to moderation is four times the warming due to the radiation-interfering effects of the atmosphere.

I don't have a good reference for (A), the mean surface temperature of the Moon. The -36 °C you mention sounds reasonable; this is about 237 K. Do you have a reference for this number?
Actually, the value I gave is wrong...it should be more like 220 K or 225 K. A good reference for this [and the same place I found the information on sub-surface temperatures of the Moon] is "Lunar Sourcebook: A User's Guide to the Moon" (available via Google Books).

Your number for (B) is incorrect. The effective blackbody temperature for the Earth is about 255K, or -18 oC.
My number is not incorrect if one is asking about the "effect of the atmosphere on Earth's temperature." The sources that cite "-18" or "-19" mix and match their numbers: they say they are calculating the blackbody temperature of the Earth without its atmosphere, but then they calculate the albedo of the Earth as though it had an atmosphere!

It's another case of mixing and matching aspects of models: pretending the Earth has the atmosphere it has today, but assuming its gases were chemically altered to exhibit reflection without exhibiting absorption. Or pretending there is no atmosphere while (at the same time) pretending clouds still exist. This is, of course, impossible.

A more honest determination of the black-body temperature for Earth's surface is to use the albedo of....Earth's surface.

You can break down the effect of the Earth's atmosphere (which is about 73 degrees C total) into three parts:

1. Heating effects due to radiative interference -> 33 degrees
2. Cooling effects due to radiative interference -> -22.5 degrees
3. The general warming that exists due to moderation of temperature with no reference to radiation. -> ~50.5 degrees

My beef is that people refer to the first in isolation from the second, comparing Earth to a contrived version where only the warming radiative effects of its atmosphere are considered, without reference to the cooling effects of its atmosphere and clouds (clouds that would not exist without the presence of an atmosphere, and in fact would not exist in a different atmosphere). My second beef is that the warming due to moderation (which makes the Earth a less efficient S-B radiator) is ignored completely, even though it dwarfs the combined radiative effect.

Just to reiterate the effect of a moderated temperature [having nothing to do with the gases absorbing light], I'll repost something:

FireBones said:
Rather, I'm referring to the fact that the effectiveness of a blackbody radiator is not linearly dependent on temperature. A radiator made of a hot side (400 K) and a cold side (200 K) radiates much more energy than a radiator at the middle temperature (300 K).

Just to give an impression how much more energy, compare 2^4 + 4^4 [16 + 256 = 272] to 3^4 + 3^4 [81+81 = 162]...so the hot-and-cold radiator radiates 68% more energy than the warm one, though they have the same "average" temperature.

#### FireBones

There's no paradox here - it's exactly what a greenhouse does.
Putting glass over your plants reduces the amount of light reaching them (reflection and absorption losses) but raises the temperature by trapping warm air.
I'm not sure what paradox you are talking about...no one here is suggesting there is a paradox regarding warming due to the atmosphere.

The "Greenhouse effect" has nothing to do with how a greenhouse works. It got that name because about 150 years ago scientists thought a greenhouse worked differently. In the early part of the 20th century, this was proven wrong and the name stuck.

A greenhouse causes things to become warm by preventing convection.

The "Greenhouse effect" causes the Earth to become warm by:
A. Back-radiating energy absorbed by atmospheric gases down to the surface.
B. Moderating the temperature of the Earth's surface so it does not radiate heat away as well as a planet with extreme temperatures.

My point is that "Effect B" is much more significant than "Effect A," but no one talks about it.

#### FireBones

Sylas,
Sorry to pile on here, but I realized another problem with the model that spits out the "-18" or "-19" degree value. Even if you are trying to model the non-physical fantasy-land of "what the Earth would be like with no greenhouse effect," you simply cannot use the modern-day albedo due to cloud cover...even if you are pretending the atmosphere keeps its water vapor (but assuming the laws of physics are changed to stop that water vapor from absorbing energy).

The issue is, even in this fantasy land where gases are allowed to reflect energy but not absorb it, clouds would not form in the same way [if at all] because the atmosphere would, of course, be much, much, much colder if it were not absorbing radiation. This would affect cloud formation (and might prevent it entirely). You certainly cannot use the cloud-cover albedo from one model in the other in such a facile way as is often done since the cloud-cover would be very much different (perhaps non-existent) in a world where atmospheric gases did not absorb energy.

#### mgb_phys

Homework Helper
I'm not sure what paradox you are talking about...no one here is suggesting there is a paradox regarding warming due to the atmosphere.
"It appears that Earth's atmosphere actually lowers the total amount of radiation Earth's surface absorbs, even though it raises the mean temperature."
--sounds like it was stating a paradox.

The "Greenhouse effect" has nothing to do with how a greenhouse works. It got that name because about 150 years ago scientists thought a greenhouse worked differently. In the early part of the 20th century, this was proven wrong and the name stuck.
Yes that's why I didn't say the greenhouse effect.
A passive solar water heater is closer to the atmospheric greenhouse effect (visible light in, IR blocked on the way out). A regular glass greenhouse is more like simply putting on more clothes.

Anyway, it was just a simple aside - the modelling described in the other answers gives a deeper explanation.

#### FireBones

"It appears that Earth's atmosphere actually lowers the total amount of radiation Earth's surface absorbs, even though it raises the mean temperature."
--sounds like it was stating a paradox.
Oh, no. When I wrote that post I was misremembering some data regarding the degree of back-radiation.

The atmosphere does increase the amount of radiation the surface absorbs...it's just that the increase in temperature attributable to this is less than the increase attributable to surface temperature moderation (which has nothing to do with absorption of IR).

#### sylas

Thanks for taking the time to write this, but I really think you are missing my point regarding the nuances of the S-B law. I'm going to try to make that more clear in my replies. I apologize if some of them sound bitter...this medium is not a good one sometimes ;)
No problem at all. I don't mix up robust disagreement with aggression or bitterness. Furthermore, I don't expect you just to fall over and accept my explanations. I anticipate that we'll soon conclude this discussion without reaching agreement, and that's fine by me, as long as we have both been able to explain our particular view.

Then, perchance, if you still find any of this worth pursuing yourself, keep on double checking it for yourself, from a range of sources. I would personally recommend looking at what is said in a range of textbooks that deal with basics of atmospheric and planetary thermodynamics. Comparisons of different planets are a good way to look at the various gross factors that impact surface temperatures. I use the Moon and Earth comparison quite a lot to explain the differences in albedo, thermal absorption of an atmosphere, surface heat capacity, lateral heat transfer, and so on. But that's entirely up to you; this has been a good exchange in any case.

This is what I'm disputing...Most of the warming due to our atmosphere has nothing to do with the greenhouse effect.
Then we do have a genuine point of real difference here, and it may be worth sorting it out, or at least clarifying how we get such different conclusions.

I have some questions about your account, which may help clarify where we stand. I've kept them pretty straightforward and hopefully friendly and collegial. I believe you are mistaken, but I'll be happy to have your answers or clarifications of what you are proposing. Similarly, I have tried to answer your questions.

If you can give references for your claims, this will clarify them much better. The usual expectation in this forum is that we back up claims with peer-reviewed scientific references or a suitable equivalent. I would consider any conventional textbook used in mainstream universities to be okay. The Lunar Sourcebook, which we've both used, for example, is fine.

The idea here, as set out in our guidelines, is to learn about the practice of modern physics, as used and published by working physicists. That is what I have been using.

The "lowering radiating effectiveness" I refer to is not the one you are referring to. The lowering in the effectiveness of Earth's radiation I refer to would occur even if none of the atmospheric gases absorbed IR energy. This proves that the lowering of radiating effectiveness I refer to is not equivalent to backradiation.
Have you got a reference for this effect?

Suppose that the entire Earth surface was all the same temperature, of 16 oC, which you described previously as a "blackbody temperature of the green-house affected Earth". That is 289K, and by the Stefan-Boltzmann law, a blackbody at 289K radiates just over 395 W/m2. Now this is indeed pretty close to what has been conventionally given as the amount of energy being radiated upwards from the surface. The modern value is usually given as 390 W/m2, from Trenberth et al (2009), corresponding to a blackbody at about 15 oC. In either case, this is significantly more than the total energy available from the Sun, which is at most 342 W/m2.
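As a quick sanity check on these Stefan-Boltzmann figures, here is a minimal sketch (the constant and the temperatures are the ones quoted above):

```python
# Stefan-Boltzmann sanity check for the surface-radiation figures above.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_flux(t_kelvin):
    """Flux radiated by a blackbody at the given temperature."""
    return SIGMA * t_kelvin ** 4

print(radiated_flux(289.0))  # 16 C surface -> just over 395 W/m^2
print(radiated_flux(288.0))  # 15 C surface -> about 390 W/m^2
```

Both values comfortably exceed the 342 W/m2 available from the Sun, which is the point at issue in Question 1.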

Note that we have both already agreed that any variation of temperatures above and below a mean of 16 oC makes the Earth a more effective radiator, and thus increases the energy being radiated to be above 395 W/m2.

Question 1. Do you disagree that the energy radiated up from the surface is significantly more than the energy radiated down from the Sun? Or if you agree with this, then where does the surface get the additional energy required?

That's not true. If our atmosphere were transparent to radiation (going both ways) it would not be frozen over. The type of claim you are making mixes two different models: it half-transforms the molecules in the atmosphere (taking out the absorption aspect while leaving the reflection aspect) and uses that number. It is much more reasonable to either talk about:
A. "What the temperature would be if the earth had no atmosphere"
B. "What the temperature would be if the earth had an atmosphere, but the atmosphere lacked gases that interfered with radiation."
I'll take those as questions for my analysis. There is almost no reflection from the clear sky atmosphere. There is significant reflection from cloud. Of course, more reflection has a cooling effect. I presume we are agreed on all of this? Here then are my answers.

(B) is fairly easy to answer, if we just assume there's no interference with thermal radiation and leave the albedo unchanged. I did the calculation in the last post, and the result is a mean surface temperature of -18 oC, or less. As we have both noted, any [strike]sharing[/strike] variations in temperatures around the globe would give a more effective radiator, and hence even lower mean temperatures to shed the same incoming solar energy. So by any account, this will freeze the planet very thoroughly.

(A) is more subtle, but in the end the same. If we assume no atmosphere, then we assume no cloud. So as well as having no greenhouse effect, we also have no reflection from cloud. Now Earth is mostly ocean, and ocean water has a very low albedo, of around 0.06. (Ref: http://nsidc.org/seaice/processes/albedo.html albedo page) With solar input of 341.5 W/m2 and albedo of 0.06, the total absorbed energy is 321 W/m2, and that has a blackbody temperature of about 1 oC, all of which flows unimpeded into space. As before, the mean temperature must be below the blackbody temperature, and so there will be a lot of frozen ocean; much more than at present with mean temperatures about 15 degrees higher. But ice has a very high albedo, of around 0.5 to 0.7. So this means the Earth's albedo will actually end up higher than the current albedo, and then exactly as in case (B) above, you end up with effective blackbody temperatures a long way below freezing, and a mean surface temperature less than -18 oC as before.

As before, removing any horizontal mixing effect of the atmosphere only drives mean temperatures lower still, to shed the absorbed energy. The -18 is a strict upper bound.

So my answer to (A) and (B) is that without atmospheric interactions with thermal radiation, the temperature of the Earth's surface would have an average temperature of something less than -18 oC, and it would freeze solid. If you can explain your method in similarly quantified terms, it will help to see where the differences lie.

Question 2. What calculations or physical theory would you apply to estimate the effects in either case (A) or (B) above?

As a reference for my conclusions, the consequence of atmospheric interaction with thermal radiation I am using was first identified by John Tyndall, in the mid nineteenth century. See: "Contributions to Molecular Physics in the Domain of Radiant Heat" (Tyndall, 1872) (17 Mbyte djvu file, 446 pages). Pages 421-424 contain a public lecture from 1863, in which he describes the freezing consequences that would occur without the capacity of the atmosphere to interact with thermal radiation. This remains basic in modern thermodynamics, and a part of any course in atmospheric physics.

Actually, the warming due to moderation is four times the warming due to the radiation-interfering effects of the atmosphere.
Question 3. I know what the word "moderation" means, but not as a technical term in thermodynamics for any specific effect. How do you obtain that factor of 4? Do you have a reference for this effect?

I think I have made a reasonable guess at this further on... but having in your own words would be better.

Actually, that value I gave is wrong...should be more like 220K or 225K. A good reference for this [and the same place I found the information on sub-surface temperatures of the moon] is "Lunar Sourcebook: A user guide to the Moon" (available via google books).
Ok, that's fine. It's more in line with what I would expect; especially given the even lower values at the landing sites for Apollo 15 and 17. But I could not find a single value for the calculated mean temperature over the whole surface in that reference.

My number is not incorrect if one is asking about the "effect of the atmosphere on Earth's temperature." Those sources that cite "-18" or "-19" mix and match their numbers. They say they are calculating the blackbody temperature of the earth without its atmosphere, but then they calculate the albedo of the Earth as though it had an atmosphere!
You described the number as (quote) "The blackbody temperature of the Earth". That is a well understood technical number; it is a radiating temperature for the whole planet, as measured from space, right now. This temperature for Earth is 255K, or -18 oC, and it is about 33 degrees cooler than the radiating temperatures at the surface, below our atmosphere. I gave references for it previously. It is a measurable quantity for our planet.

Of course, you could also try to calculate the temperature that would result if the whole atmosphere was stripped from the Earth somehow, and considering that the albedo would change. This is not the same thing, but I have shown a calculation above. It would freeze much of the ocean, since the radiating temperature would now be the same as the surface temperature, and this would in turn drive the albedo to very high values, dropping the mean surface temperature to well below the current planetary blackbody temperature of -18 oC.

You can break down the effect of the Earth's atmosphere (which is about 73 degrees C total) into three units:

1. Heating effects due to radiative interference -> 33 degrees
2. Cooling effects due to radiative interference -> -22.5 degrees
3. The general warming that exists due to moderation of temperature with no reference to radiation. -> ~50.5 degrees
References or methods of calculation would help here as well. However, I can work back from your numbers to figure out how they are obtained. Let me know if I have understood or not.

Point (1) I can understand. It is the difference between temperatures measured by the radiation going into the space, and temperatures at the surface below the atmosphere; taking temperatures which would give the appropriate amount of energy if uniform over the surface. It is, in fact, a quantification of the atmospheric greenhouse effect.

Point (2) is, I think, a measure of the effects of planetary albedo. Is that right? But remember, the atmosphere is not the only source of albedo! The bare surface albedo is greater than 0.06 (the low value for open ocean). By figures from Trenberth et al (2009), cited in msg #2, overall surface albedo is 23/184 or 0.125; similar to the Moon. Using 341.5 W/m2 as the solar input, we get the following table:
$$\begin{array}{l|l|l|l} \text{Albedo} & \text{Energy flux (W/m}^2\text{)} & \text{Blackbody temperature (C)} & \\ \hline 0 & 341.5 & 5.4 & \text{(Completely black planet)} \\ 0.06 & 321 & 1 & \text{(Planet all open ocean)} \\ 0.125 & 299 & -3.7 & \text{(Estimated albedo for Earth's surface only)} \\ 0.3 & 239 & -18 & \text{(Actual albedo for Earth)} \end{array}$$
So the cooling effect we can attribute to planetary albedo in total is about 23.5 degrees... similar to what you have given. The cooling effect of atmospheric albedo, on the other hand, is a difference with bare surface albedo, and corresponds to about 14 or 15 degrees. Note also that this is still not the same as just removing the atmosphere, because the consequent freezing of the ocean would raise the surface albedo considerably, to be in the end higher than what we have at present.
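The table above can be reproduced directly from the Stefan-Boltzmann law; here is a sketch using the 341.5 W/m2 solar input and the albedo values already given:

```python
# Blackbody temperature as a function of planetary albedo,
# reproducing the albedo table above.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 341.5     # globally averaged solar input, W/m^2

def blackbody_celsius(albedo):
    """Absorbed flux and equilibrium blackbody temperature for an albedo."""
    absorbed = SOLAR * (1.0 - albedo)        # W/m^2
    t_kelvin = (absorbed / SIGMA) ** 0.25
    return absorbed, t_kelvin - 273.15

for albedo, label in [(0.0, "completely black planet"),
                      (0.06, "planet all open ocean"),
                      (0.125, "Earth's surface only"),
                      (0.3, "actual albedo for Earth")]:
    flux, temp_c = blackbody_celsius(albedo)
    print(f"{albedo:5.3f}  {flux:6.1f} W/m^2  {temp_c:6.1f} C  ({label})")
```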

Your point (3), is, I think, the consequence of horizontal heat transport and thermal inertia to help equalize temperatures around the globe. Without these effects, each point on the surface would simply be a blackbody temperature for the solar input incident on that particular point.

But here again, this is not only an atmospheric effect. The atmosphere has a role, but not as much as the ocean. The heat transport of ocean currents is larger than atmospheric transports. Even under sea ice the ocean continues to move heat around the surface. Also, the Earth has a much shorter day than the Moon, and water has a very high capacity to absorb heat. This is a stark contrast to the Moon with a month long "day", and a regolith surface.

Your number of 50.5 is very similar to the difference between the blackbody temperature of the Moon (270K) and the mean surface temperature of the Moon (220K). But that figure cannot simply be carried over to the Earth, even an Earth without an atmosphere.

My beef is that people refer to the first in isolation to the second...comparing Earth to a contrived version where only the warming radiative effects of its atmosphere are considered without reference to the cooling effects of its atmosphere and clouds (clouds that would not exist without the presence of an atmosphere...and in fact would not exist in a different atmosphere). My second beef is that the warming that is due to moderation (which makes the Earth a less efficient S-B radiator) is ignored completely, even though it dwarfs the combined radiative effect.
Given your numbers, I suspect that by "moderation" you mean the effects of thermal inertia and horizontal heat transport to help equalize temperatures around the globe.

Consider this. The heat capacity of water is 4186 J/K/kg. One meter depth of water has 1000 kg per m2. At a temperature of, say, 25 oC, or 298K, water radiates approaching 450 W/m2; cooler temperatures will be less, of course. But radiating all night (12 hours), one meter depth of warm water will shed almost 20 MJ of energy. That's enough to drop temperatures nearly 5 degrees. This still neglects all the effects of ocean currents to share temperatures around. Actual diurnal temperature ranges for the ocean are less than this, and certainly less than the 260 degrees of diurnal temperature variation on the Moon! Without an atmosphere, the mixing depth of the surface ocean layers is quite small, around a meter or so. This is what we see now in still air. Without the backradiation, the ocean will radiate quite efficiently, but the very high heat capacity still strongly damps any variation from day to night. Having about 5 degrees in the diurnal temperature range for the ocean is a reasonable upper bound.
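The back-of-envelope numbers in the previous paragraph can be checked as follows (a sketch using the figures quoted there):

```python
# Overnight radiative cooling of a 1 m mixed layer of warm water,
# checking the back-of-envelope figures above.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
C_WATER = 4186.0         # specific heat of water, J/(kg K)
MASS = 1000.0            # kg of water per m^2 in a 1 m deep layer
NIGHT = 12 * 3600.0      # seconds in a 12-hour night

flux = SIGMA * 298.0 ** 4            # flux radiated at 25 C, ~447 W/m^2
energy = flux * NIGHT                # energy shed per m^2 overnight, ~19 MJ
delta_t = energy / (C_WATER * MASS)  # resulting temperature drop, ~4.6 K

print(f"{flux:.0f} W/m^2, {energy / 1e6:.1f} MJ/m^2, {delta_t:.1f} K drop")
```

This supports the claim that even with no atmosphere at all, open ocean cannot swing more than a few degrees overnight, in stark contrast to the Moon's diurnal range.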

Although it is certainly possible to look at temperature effects of thermal inertia, albedo, and horizontal heat transport, none of them are exclusive to the atmosphere only, and none of them invalidate the conventional calculation of the contribution of our atmosphere's greenhouse effect.

Finally, you really need to take into account that backradiation is a directly measured quantity. The earliest direct measurements were in 1954, and they confirm that, as far as influx of energy to the surface is concerned (which is required to balance the large amount of heat being radiated), atmospheric backradiation is the largest energy input to the surface, even larger than what is absorbed from the Sun. This can only happen because the atmosphere is able to interact with thermal radiation. Reference:

Cheers -- sylas


#### FireBones

No problem at all. I don't mix up robust disagreement with aggression or bitterness. Furthermore, I don't expect you just to fall over and accept my explanations. I anticipate that we'll soon conclude this discussion without reaching agreement, and that's fine by me, as long as we have both been able to explain our particular view.
Actually, we may well end up persuading each other more than that... it seems we are both willing to wrestle for the truth, and I once again very much appreciate your response!
In any event, it appears the difference in our views hinges on one particular thing that I think you have overlooked.

I further appreciate your taking the time to draw out the disagreements into clear questions...this is complicated stuff, and it is easy to talk past one another unless things are nailed down and kept track of.

I should point out one possible [minor] point of confusion. When I say "blackbody temperature" I mean "blackbody temperature adjusted for emissivity." In other words, "the temperature the planet would have to be if it were isothermal and made of material with its average emissivity." Obviously, this means the determination of emissivity is important.

Have you got a reference for this effect?

Suppose that the entire Earth surface was all the same temperature, of 16 oC, which you described previously as a "blackbody temperature of the green-house affected Earth". That is 289K, and by the Stefan-Boltzmann law, a blackbody at 289K radiates just over 395 W/m2. Now this is indeed pretty close to what has been conventionally given as the amount of energy being radiated upwards from the surface. The modern value is usually given as 390 W/m2, from Trenberth et al (2009), corresponding to a blackbody at about 15 oC. In either case, this is significantly more than the total energy available from the Sun, which is at most 342 W/m2.

Note that we have both already agreed that any variation of temperatures above and below a mean of 16 oC makes the Earth a more effective radiator, and thus increases the energy being radiated to be above 395 W/m2.
Okay, we have to stop there for a moment. I think you are looking at this from the wrong side of the telescope. The blackbody temperature is 15 oC or 16 oC (let's say 15 just for discussion), but that does not mean the actual mean is that (a point you mention later)... so it is not good to speak of "variation of temperatures above and below a mean of [the blackbody temperature]" because that is not the mean about which the temperature varies.

What I would say is: the True Mean Temperature (TMT, for this discussion) is less than the blackbody temperature of 15, and the reason the TMT is less than 15 is that the (small) variations in temperature around the TMT increase the efficiency of the Earth as a radiator, so that the Earth can radiate 390 W/m^2 at a lower TMT than it would need if it were truly all at one temperature. These variations are pretty small on Earth, so the increase in efficiency is pretty small, and I suspect the TMT is pretty close to the 15 oC blackbody value.

I might be splitting hairs here, but I want to make sure we are not talking about starting out with an idealized, 15-degrees-everywhere Earth and then increasing the radiation given off by shifting temperatures. Rather, the variation in temperature means that the Earth does not heat up (on average) to the full blackbody radiation temperature.
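This distinction (the true mean must sit somewhat below the isothermal blackbody temperature) can be illustrated with a toy two-patch planet of my own devising: assume half the surface runs dT above the mean and half dT below, while the surface as a whole still sheds 390 W/m^2.

```python
# Toy illustration: a surface split into two equal patches at TMT +/- dT
# must have a mean temperature below the isothermal blackbody value,
# because averaging T^4 favors the hot patch.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
TARGET = 390.0   # W/m^2 to be shed, as in the discussion above

def mean_temp_for_flux(dT):
    """Find the TMT such that the average of sigma*T^4 over the two
    patches (TMT + dT, TMT - dT) equals TARGET, by bisection."""
    lo, hi = 100.0, 400.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        flux = 0.5 * SIGMA * ((mid + dT) ** 4 + (mid - dT) ** 4)
        if flux > TARGET:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(mean_temp_for_flux(0.0))   # isothermal case: ~288 K
print(mean_temp_for_flux(20.0))  # +/- 20 K swings: mean drops ~2 K
```

Earth's actual variations are modest, so the depression of the TMT below the blackbody value is small, exactly as the paragraph above argues.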

Question 1. Do you disagree that the energy radiated up from the surface is significantly more than the energy radiated down from the Sun? Or if you agree with this, then where does the surface get the additional energy required?
The energy radiated up from the Earth is significantly greater than the energy radiated down from the Sun. I'm happy accepting, for example, the information on page 314 of http://ams.allenpress.com/archive/1520-0477/90/3/pdf/i1520-0477-90-3-311.pdf as approximately true for the sake of our discussion.

I'll take those as questions for my analysis. There is almost no reflection from the clear sky atmosphere. There is significant reflection from cloud. Of course, more reflection has a cooling effect. I presume we are agreed on all of this? Here then are my answers.

(B) is fairly easy to answer, if we just assume there's no interference with thermal radiation and leave the albedo unchanged. I did the calculation in the last post, and the result is a mean surface temperature of -18 oC, or less. As we have both noted, any sharing of temperature around the globe would give a more effective radiator, and hence even lower mean temperatures to shed the same incoming solar energy. So by any account, this will freeze the planet very thoroughly.

(A) is more subtle, but in the end the same. If we assume no atmosphere, then we assume no cloud. So as well as having no greenhouse effect, we also have no reflection from cloud. Now Earth is mostly ocean, and ocean water has a very low albedo, of around 0.06. (Ref: http://nsidc.org/seaice/processes/albedo.html albedo page) With solar input of 341.5 W/m2 and albedo of 0.06, the total absorbed energy is 321 W/m2, and that has a blackbody temperature of about 1 oC, all of which flows unimpeded into space. As before, the mean temperature must be below the blackbody temperature, and so there will be a lot of frozen ocean; much more than at present with mean temperatures about 15 degrees higher. But ice has a very high albedo, of around 0.5 to 0.7. So this means the Earth's albedo will actually end up higher than the current albedo, and then exactly as in case (B) above, you end up with effective blackbody temperatures a long way below freezing, and a mean surface temperature less than -18 oC as before.

As before, removing any horizontal mixing effect of the atmosphere only drives mean temperatures lower still, to shed the absorbed energy. The -18 is a strict upper bound.

So my answer to (A) and (B) is that without atmospheric interactions with thermal radiation, the temperature of the Earth's surface would have an average temperature of something less than -18 oC, and it would freeze solid. If you can explain your method in similarly quantified terms, it will help to see where the differences lie.
Alright, now we are getting into (at least part of the heart) of the matter...and seeing the complexity of the question.

Here is where I disagree with your analysis. If the Earth had no atmosphere [or had none of the gases that interacted with light], not only would we have no clouds, but we would have no water. With no water vapor to provide vapor pressure over ice, all water on the Earth would either evaporate or sublimate away.

Question 2. What calculations or physical theory would you apply to estimate the effects in either case (A) or (B) above?
If the Earth had no atmosphere [and hence no water], I would model it as a graybody radiator with low albedo and low emissivity. I would calculate the expected temperature on the sun side of the Earth using the S-B law [where each square meter is at the appropriate temperature to radiate away the light it is receiving]. I would model the night side of the Earth as approximating the values on the night side of the Moon.
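The sun-side model described here (each square meter at the equilibrium temperature for the sunlight it receives) can be sketched as follows. The solar constant of 1366 W/m^2, the Moon-like albedo of 0.11, and the emissivity of 0.95 are assumed illustrative values, not figures from the discussion:

```python
# Local radiative-equilibrium temperature on the dayside of an
# airless body: each patch radiates exactly what it absorbs.
# Solar constant, albedo, and emissivity are illustrative assumptions.
import math

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1366.0  # W/m^2 at 1 AU (assumed)
ALBEDO = 0.11            # Moon-like albedo (assumed)
EMISSIVITY = 0.95        # graybody emissivity (assumed)

def local_temp(zenith_deg):
    """Equilibrium temperature of a patch where sunlight arrives
    at the given zenith angle (0 = subsolar point)."""
    absorbed = SOLAR_CONSTANT * (1 - ALBEDO) * math.cos(math.radians(zenith_deg))
    return (absorbed / (EMISSIVITY * SIGMA)) ** 0.25

for z in (0, 30, 60, 85):
    print(f"zenith {z:2d} deg -> {local_temp(z):5.1f} K")
```

The subsolar value comes out near 390 K, in line with the daytime lunar surface temperatures discussed earlier in the thread.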

If instead we allowed for an atmosphere, but said it lacked water vapor, methane, ozone, etc., then it would come down to estimating the ability of Earth to moderate its temperature in the absence of oceans. I have not attempted that.

Question 3. I know what the word "moderation" means, but not as a technical term in thermodynamics for any specific effect. How do you obtain that factor of 4? Do you have a reference for this effect?
I compared the values I mentioned in my first post. However, I now realize that the certainty of this number is hard to determine because it is altogether unclear how a lack of radiative absorption in the troposphere would affect cloud formation. This is the problem with mixing and matching models. You cannot "turn off" radiative absorption and then use the current cloud-dependent albedo. Cloud formation trends are one of the least understood concepts in climatology.

To arrive at my figure, I removed clouds altogether and calculated the difference between the blackbody temperature at the current 390 W/m^2 and the blackbody temperature of an atmosphereless/cloudless Earth.

This puts the effect of radiative absorption at around 11 oC.

To determine the effect of moderation of temperature, I compared the blackbody temperature of a body at 1 AU to the mean temperature of a body with only negligible means of moderation. The Moon is a good example. The moon [at 220K] is about 50 degrees colder than it would be if it were isothermal.

So, the effect of moderation (that is, the warming that comes from being isothermal, or nearly so) is about 50K.
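For reference, the isothermal figure used in this comparison can be recovered from the Moon's numbers; this is a sketch in which the Bond albedo of 0.11 is an assumed representative value:

```python
# Isothermal ("blackbody") temperature of a Moon-like body at 1 AU,
# compared with the ~220 K mean quoted from the Lunar Sourcebook.
# The 0.11 albedo is an assumed representative value.
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1366.0 / 4.0   # solar constant averaged over the whole sphere
ALBEDO = 0.11          # assumed lunar Bond albedo

t_iso = (SOLAR * (1 - ALBEDO) / SIGMA) ** 0.25
print(f"isothermal: {t_iso:.0f} K, deficit vs 220 K mean: {t_iso - 220:.0f} K")
```

The result is roughly 270 K, so the roughly 50 K deficit against the 220 K mean matches the figure above.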

From reading the rest of your post, it appears that the key difference between your view and mine is "What happens to the water?"

In your view, Earth would be moderated in temperature anyway due to the water and its ability to moderate temperature all by itself.
In my view, Earth wouldn't have any water if it were not due to the atmosphere, so the warming due to moderation of temperature is attributable to the atmosphere.

In your view, the Earth has a high albedo any way you slice it, because you either have clouds in the air or you have ice on the ground.

In my view, this is once again only due to the atmosphere's existence to begin with.

One thing that I think is important to consider is that, when describing "the greenhouse effect" as a general phenomenon, one should be able to talk about mechanisms applicable in the absence of special considerations. The considerations you bring up depend not only on water vapor being present in great enough quantities to form clouds and oceans, but also on Earth's average temperature being near the freezing point of water. The subtle issue where removing the atmosphere lowers the temperature of the Earth just enough to cause ice is a peculiar situation unique to Earth. Most planets are not set up to have their albedo double if they drop 10K.

With that in mind (and due to the fact that our oceans would not exist without an atmosphere), I think when discussing "the effect of an atmosphere" it is better to speak in terms that would apply on any planet. It is a simple mathematical fact that a planet with extreme temperatures will have an average temperature much colder than a planet with moderate ones, and it is further a fact that atmospheres tend to moderate temperatures on any planet (regardless of the presence of water). The thicker the atmosphere, the more moderation. In this sense, an atmosphere can generally be said to warm a planet by a mechanism completely separate from what is typically called "The Greenhouse Effect," and in the case of Earth, the total warming due to that mechanism is the greater of the two.

