- #1
FireBones
- 103
- 0
I was doing some research today and realized a somewhat silly fact. It appears that Earth's atmosphere actually lowers the total amount of radiation Earth's surface absorbs, even though it raises the mean temperature.
This should not sound impossible: the atmosphere provides [by re-radiation] a second source of thermal energy for the surface, but it also prevents/removes a great deal of thermal energy [by reflecting some radiation and absorbing some before it ever reaches Earth... and also by allowing evaporation and convection].
Here is why I think one can say the net effect is actually to cause less energy to reach the Earth's surface.
The amount of energy actually carried away by evaporation and convection is rather small (compared to radiative heat loss). For the most part, the medium-mediated heat transfer the atmosphere provides serves to spread heat around [making the night side of the Earth much warmer than it would be without such an atmosphere] rather than to offload the heat completely.
Ignoring that heat loss, the Earth essentially has one and only one way to lose its heat: radiation. The temperature of the surface is an indication of how much heat energy the Earth is receiving because, for reasonably stable temperatures, the amount of radiation the Earth absorbs must be approximately equal to the amount it gives off.
This leads to the [not too profound] statement that the temperature of the Earth is indicative of how much heat it receives. The more heat it receives, the hotter it gets before the rate of radiating away energy equals the rate of absorption of thermal energy.
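To make that balance concrete [treating the surface as an ideal blackbody, purely for simplicity]: the Stefan-Boltzmann law gives the emitted flux as sigma*T^4, with sigma ≈ 5.67e-8 W/(m^2 K^4), so at a steady surface temperature T the absorbed flux per unit area must also be approximately sigma*T^4.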
Naively, this would suggest that the Earth's atmosphere causes it to receive more total heat than it would with no atmosphere: one can compare the average temperature of the Earth to the average temperature of the Moon and point out that the Earth, on average, is warmer.
The issue, though, is that what we are interested in is the RATE OF ENERGY TRANSFER, which (by the Stefan-Boltzmann law) goes as the fourth power of the temperature, not the temperature itself. The Earth may have a higher mean temperature, but the mean value of T^4 is almost certainly higher on the Moon because of its massive temperature swings between day and night.
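As a crude numerical sketch of that point [a deliberately simple two-value model with round illustrative temperatures that I'm assuming, roughly 390 K / 100 K for the lunar day/night extremes and a mild swing about a 288 K mean for Earth, not measured data]:

# Compare mean T with the mean emitted flux (proportional to T^4)
# for a "Moon-like" two-value temperature distribution and a fairly
# uniform "Earth-like" one. Temperatures are illustrative round numbers.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def mean_temp(temps):
    return sum(temps) / len(temps)

def mean_emitted_flux(temps):
    # average of sigma*T^4 over the sample temperatures
    return sum(SIGMA * T**4 for T in temps) / len(temps)

moon_like = [390.0, 100.0]    # assumed day/night surface extremes, K
earth_like = [298.0, 278.0]   # assumed mild swing about a ~288 K mean, K

for name, temps in [("Moon-like", moon_like), ("Earth-like", earth_like)]:
    print(f"{name}: <T> = {mean_temp(temps):.0f} K, "
          f"<sigma*T^4> = {mean_emitted_flux(temps):.0f} W/m^2")

With those numbers the Moon-like case comes out to <T> ≈ 245 K but <sigma*T^4> ≈ 659 W/m^2, while the Earth-like case gives <T> ≈ 288 K but only about 393 W/m^2. In other words, the colder-on-average body radiates [and so, in balance, absorbs] more per unit area.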
This would suggest that the atmosphere does not actually increase the amount of energy the surface of the Earth receives via radiation; rather, its redistribution of energy conspires with our poor choice of metric [mean temperature rather than mean T^4] to make it appear that it does.
Comments?