Radiative transfer to space affected by atmosphere?

SUMMARY

This discussion focuses on modeling radiation losses from a flat surface facing the sky at night, specifically addressing the impact of atmospheric conditions on the heat flux equation Q=εσ(Ts^4-T∞^4). The effective temperature of outer space is considered to be 3K, while atmospheric absorption and emissivity are influenced by humidity and wavelength. Researchers utilize computational models such as LOWTRAN, MODTRAN, and HITRAN to account for these complexities, with a typical sky temperature on clear nights approximated at 230K.

PREREQUISITES
  • Understanding of the Stefan-Boltzmann law and its application in heat transfer.
  • Familiarity with atmospheric physics, particularly regarding emissivity and absorption.
  • Knowledge of computational models for radiative transfer, specifically LOWTRAN, MODTRAN, and HITRAN.
  • Basic principles of thermodynamics related to heat flux and temperature gradients.
NEXT STEPS
  • Research the application of LOWTRAN for atmospheric radiative transfer modeling.
  • Explore MODTRAN's capabilities in simulating atmospheric effects on radiation.
  • Study the relationship between humidity and infrared transparency in the atmosphere.
  • Investigate the use of emissivity as a function of wavelength in heat transfer calculations.
USEFUL FOR

This discussion is beneficial for atmospheric scientists, thermal engineers, and researchers involved in environmental modeling and radiative transfer analysis.

Mapes
I'm trying to model radiation losses from a flat surface facing the sky at night. If we ignore radiative absorption/emission in the atmosphere, the heat flux is the well-known

Q=\epsilon\sigma(T_s^4-T_\infty^4)

where \epsilon is the emissivity, \sigma the Stefan-Boltzmann constant, and T_s the surface temperature, and where I would think T_\infty is the effective temperature of outer space, 3 K.

How does the presence of the atmosphere affect this model? How have other researchers dealt with this complication?
 
The IR transparency of the atmosphere depends on humidity, so it would be a significant complication. I suppose you could get a reasonable estimate from a top-down IR view of Earth, such as satellite imagery: http://www.goes.noaa.gov/ECIR4.html

With a known surface temperature and the brightness temperature measured through the atmosphere, you can estimate the effect of sky transparency.
 
It's pretty easy in principle: the emissivity is a function of wavelength, and because of conservation of energy the spectral emissivity equals the spectral absorptivity (Kirchhoff's law). The heat transfer equation simply turns into an integral over wavelength.

The atmospheric absorption depends on pretty much everything, but there are good computational models out there (LOWTRAN/MODTRAN/HITRAN), some of which are public domain.
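A minimal numerical sketch of that wavelength integral, assuming a made-up two-band spectral emissivity (the 8-13 µm "window" values here are purely illustrative, not real atmospheric data):

```python
import math

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_emissive_power(lam, temp):
    """Blackbody spectral emissive power E_b,lambda (W m^-2 per m of wavelength)."""
    return (2.0 * math.pi * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * temp))

def band_emissivity(lam):
    """Toy spectral emissivity: fairly transparent in the 8-13 um 'window',
    nearly opaque elsewhere. Illustrative values only."""
    return 0.2 if 8e-6 <= lam <= 13e-6 else 0.95

def spectral_flux(t_surface, t_sky, eps=band_emissivity, n=4000):
    """Net flux Q = integral of eps(lam) * [E_b(lam, T_s) - E_b(lam, T_sky)] d(lam),
    trapezoidal rule on a log-spaced grid from 0.5 to 500 um."""
    lams = [0.5e-6 * 1000.0 ** (i / (n - 1)) for i in range(n)]
    q = 0.0
    for a, b in zip(lams, lams[1:]):
        fa = eps(a) * (planck_emissive_power(a, t_surface) - planck_emissive_power(a, t_sky))
        fb = eps(b) * (planck_emissive_power(b, t_surface) - planck_emissive_power(b, t_sky))
        q += 0.5 * (fa + fb) * (b - a)
    return q
```

As a sanity check, setting eps to 1 everywhere recovers the gray-body result \sigma(T_s^4 - T_{sky}^4); a real calculation would replace band_emissivity with spectral data from something like MODTRAN or HITRAN.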
 
Mapes said:
How does the presence of the atmosphere affect this model? How have other researchers dealt with this complication?

On most of the heat transfer calculations I've seen the sky temperature on a clear night was taken as 230K.
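Plugging that 230 K figure into the formula from the first post shows how much the atmosphere matters; a quick sketch (the 290 K surface temperature and 0.9 emissivity are assumed for illustration):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(eps, t_s, t_sky):
    """Net radiative flux Q = eps * sigma * (T_s^4 - T_sky^4), W/m^2."""
    return eps * SIGMA * (t_s**4 - t_sky**4)

# Surface at 290 K, emissivity 0.9
q_vacuum = net_flux(0.9, 290.0, 3.0)    # sink at 3 K: ~361 W/m^2
q_clear = net_flux(0.9, 290.0, 230.0)   # clear-sky 230 K: ~218 W/m^2
```

With these numbers, the 230 K clear-sky value cuts the net loss by roughly 40% relative to the bare 3 K vacuum limit.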
 
