Radiative transfer to space affected by atmosphere?

Modeling radiation losses from a flat surface facing the sky at night requires consideration of atmospheric effects on heat flux. The basic heat transfer equation, Q=εσ(Ts^4-T∞^4), assumes T∞ is the effective temperature of space at 3 K, but atmospheric absorption complicates this. The IR transparency of the atmosphere varies with humidity, significantly impacting emissivity, which is wavelength-dependent. Researchers have used computational models like LOWTRAN, MODTRAN, and HITRAN to account for atmospheric absorption in their calculations. On clear nights, a common approximation for sky temperature is around 230 K, which should be factored into the model for accuracy.
Mapes
I'm trying to model radiation losses from a flat surface facing the sky at night. If we ignore radiative absorption/emission in the atmosphere, the heat flux is the well-known

$$Q = \epsilon \sigma \left(T_s^4 - T_\infty^4\right)$$

where $\epsilon$ is the emissivity, $\sigma$ is the Stefan–Boltzmann constant, $T_s$ is the surface temperature, and where I would think $T_\infty$ is the effective temperature of outer space, about 3 K.

How does the presence of the atmosphere affect this model? How have other researchers dealt with this complication?
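
(For scale, with illustrative numbers: taking, say, $\epsilon = 0.95$ and $T_s = 290$ K gives $Q = 0.95 \times 5.67\times10^{-8} \times (290^4 - 3^4) \approx 381$ W/m$^2$; the $3^4$ term is utterly negligible.)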
 
The IR transparency of the atmosphere depends on humidity, so it would be a significant complication. I suppose you could get a reasonable estimate based on the view of Earth looking down from the top: http://www.goes.noaa.gov/ECIR4.html

With a known surface temperature and a temperature measured down through the atmosphere, you can estimate the effect of sky transparency.
 
It's pretty easy: the emissivity is a function of wavelength. Because of conservation of energy, the emissivity equals the absorptivity (Kirchhoff's law). The heat transfer equation simply turns into an integral over wavelength.

The atmospheric absorption depends on pretty much everything; there are good computational models (LOWTRAN/MODTRAN/HITRAN) out there, some of which are in the public domain.
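
To make the wavelength integral concrete, here is a rough numerical sketch in Python — nothing like a real LOWTRAN/MODTRAN run. It assumes a gray surface and a cartoon atmosphere that is perfectly transparent in the 8–13 µm window and opaque (radiating at a near-surface air temperature) everywhere else; all values are illustrative assumptions.

```python
import numpy as np

# Physical constants (SI)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
kB = 1.380649e-23    # Boltzmann constant, J/K

def planck_lambda(lam, T):
    """Blackbody hemispherical spectral emissive power, W/m^2 per m of wavelength."""
    x = h * c / (lam * kB * T)
    with np.errstate(over="ignore"):  # exp overflow -> inf -> 0, correct for cold sources
        return (2 * np.pi * h * c**2 / lam**5) / np.expm1(x)

lam = np.linspace(1e-6, 100e-6, 20000)  # wavelength grid, 1-100 um

Ts = 290.0    # surface temperature, K (assumed)
Tair = 280.0  # effective radiating temperature of the opaque atmosphere, K (assumed)
Tspace = 3.0  # deep space, K
eps_s = 0.95  # surface emissivity, assumed gray

# Cartoon transmission: fully transparent in the 8-13 um window, opaque outside it.
tau = np.where((lam >= 8e-6) & (lam <= 13e-6), 1.0, 0.0)

# Downwelling sky radiation: space seen through the window, air emission elsewhere.
down = tau * planck_lambda(lam, Tspace) + (1 - tau) * planck_lambda(lam, Tair)

# Net flux leaving the surface, integrated over wavelength.
q_net = eps_s * np.trapz(planck_lambda(lam, Ts) - down, lam)
print(f"Net radiative loss: {q_net:.0f} W/m^2")  # roughly 150 W/m^2 with these inputs
```

A real calculation would replace the step-function `tau` with spectral transmission from one of the codes mentioned above.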
 
Mapes said:
I'm trying to model radiation losses from a flat surface facing the sky at night. If we ignore radiative absorption/emission in the atmosphere, the heat flux is the well-known

$$Q = \epsilon \sigma \left(T_s^4 - T_\infty^4\right)$$

where $\epsilon$ is the emissivity, $\sigma$ is the Stefan–Boltzmann constant, $T_s$ is the surface temperature, and where I would think $T_\infty$ is the effective temperature of outer space, about 3 K.

How does the presence of the atmosphere affect this model? How have other researchers dealt with this complication?

In most of the heat transfer calculations I've seen, the sky temperature on a clear night is taken as about 230 K.
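
For a sense of how much that choice matters, here is a quick sanity check in Python with an assumed surface temperature and emissivity:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def q_net(Ts, Tsky, eps=0.95):
    """Net radiative flux from a gray surface facing the sky, W/m^2."""
    return eps * SIGMA * (Ts**4 - Tsky**4)

Ts = 290.0  # assumed surface temperature, K
print(f"Tsky =   3 K: q = {q_net(Ts, 3.0):.0f} W/m^2")    # ~381 W/m^2
print(f"Tsky = 230 K: q = {q_net(Ts, 230.0):.0f} W/m^2")  # ~230 W/m^2
```

A 230 K sky cuts the net loss by roughly 40% relative to radiating to 3 K space, so the sky temperature is far from a minor correction.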
 