# Radiative transfer to space affected by atmosphere?

1. Mar 5, 2008

### Mapes

I'm trying to model radiation losses from a flat surface facing the sky at night. If we ignore radiative absorption/emission in the atmosphere, the heat flux is given by the well-known

$$Q=\epsilon\sigma(T_s^4-T_\infty^4)$$

where $\epsilon$ is the emissivity, $\sigma$ the Stefan-Boltzmann constant, $T_s$ the surface temperature, and where I would think $T_\infty$ is the effective temperature of outer space, about 3 K.

How does the presence of the atmosphere affect this model? How have other researchers dealt with this complication?
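
For concreteness, here's a minimal numerical sketch of the atmosphere-free case above (the values for $\epsilon$ and $T_s$ are just illustrative assumptions):

```python
# Net radiative flux from a surface facing a perfectly transparent sky.
# epsilon and T_s below are illustrative assumptions, not measurements.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(epsilon, T_s, T_inf=3.0):
    """Q = epsilon * sigma * (T_s^4 - T_inf^4), in W m^-2."""
    return epsilon * SIGMA * (T_s**4 - T_inf**4)

print(net_flux(epsilon=0.9, T_s=290.0))  # ~361 W/m^2 against 3 K space
```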

2. Mar 5, 2008

### Staff: Mentor

The IR transparency of the atmosphere depends on humidity, so it would be a significant complication. I suppose you could get a reasonable estimate based on the view of Earth down from the top: http://www.goes.noaa.gov/ECIR4.html

With a known surface temperature and a measured temperature through the atmosphere, you can estimate the effect of sky transparency.
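
One crude way to turn that into a number, as a sketch under strong assumptions (a gray atmosphere whose own emission in the band is neglected, so the satellite-measured brightness temperature $T_b$ satisfies $\sigma T_b^4 \approx \tau \sigma T_s^4$):

```python
# Rough effective IR transmissivity from a satellite brightness temperature.
# Assumes a gray, non-emitting atmosphere in the band -- a strong simplification.
def transmissivity(T_brightness, T_surface):
    """tau ~ (T_b / T_s)^4 when atmospheric emission is neglected."""
    return (T_brightness / T_surface) ** 4

print(transmissivity(T_brightness=270.0, T_surface=290.0))  # ~0.75
```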

3. Mar 6, 2008

### Andy Resnick

It's pretty easy: the emissivity is a function of wavelength. Because of conservation of energy, emissivity equals absorptivity (Kirchhoff's law). The heat transfer equation simply turns into an integral over wavelength.
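
Spelled out, that integral looks something like the following sketch, where $B_\lambda$ is the Planck spectral radiance, $\epsilon(\lambda)$ the spectral emissivity, the $\pi$ converts radiance to flux for a diffuse surface, and the sky is treated (as an assumption) as a blackbody at an effective temperature $T_{sky}$:

$$Q=\pi\int_0^\infty \epsilon(\lambda)\left[B_\lambda(T_s)-B_\lambda(T_{sky})\right]\,d\lambda$$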

The atmospheric absorption depends on pretty much everything; there are good computational models (LOWTRAN/MODTRAN/HITRAN) out there, some of which are public domain.

4. Mar 6, 2008

### GT1

In most of the heat transfer calculations I've seen, the sky temperature on a clear night was taken as 230 K.
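
As a rough usage example with that 230 K figure ($\epsilon$ and $T_s$ are illustrative assumptions), the effective sky temperature matters a lot compared to bare space:

```python
# Compare net flux against a 230 K clear-night sky vs. 3 K deep space.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(epsilon, T_s, T_sky):
    return epsilon * SIGMA * (T_s**4 - T_sky**4)

print(net_flux(0.9, 290.0, 230.0))  # ~218 W/m^2 with a 230 K sky
print(net_flux(0.9, 290.0, 3.0))    # ~361 W/m^2 against bare space
```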