At planetary energy densities, photons do not significantly interact with each other; their distribution evolves only through interaction with matter. The momentum of atmospheric photons is too small to allow any significant portion of their energy to go directly into translational kinetic energy of the molecules that absorb them. Instead, it goes into changing the internal quantum states of the molecules. A photon with frequency ν has energy hν, so for a photon to be absorbed or emitted, the molecule involved must have a transition between energy levels differing by that amount. Coupled vibrational and rotational states are the key players in IR absorption. An IR photon absorbed by a molecule knocks the molecule into a higher-energy quantum state. Those states have very long lifetimes, characterized by the spectroscopically measurable Einstein A coefficient. For example, for the CO2 transitions that are most significant in the thermal IR, the lifetimes tend to range from a few milliseconds to a few tenths of a second. In contrast, the typical time between collisions for, say, a nitrogen-dominated atmosphere at a pressure of 10⁴ Pa and temperature of 250 K is well under 10⁻⁷ s. Therefore, the energy of the photon will almost always be assimilated by collisions into the general energy pool of the matter and establish a new Maxwell–Boltzmann distribution at a slightly higher temperature. That is how radiation heats matter in the local thermodynamic equilibrium (LTE) limit.
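The quoted collision time can be checked with a standard kinetic-theory estimate. In this sketch, the N2 collision cross section (about 4 × 10⁻¹⁹ m²) is an assumed textbook value; the article itself quotes only the final order of magnitude.

```python
import math

# Kinetic-theory estimate of the mean time between collisions for N2
# at p = 1e4 Pa, T = 250 K. The cross section is an assumed textbook
# value, not a number given in the text.
k_B = 1.380649e-23        # Boltzmann constant, J/K
p, T = 1.0e4, 250.0       # pressure (Pa) and temperature (K)
m = 28.0 * 1.66054e-27    # mass of an N2 molecule, kg
sigma = 4.0e-19           # assumed N2 collision cross section, m^2

n = p / (k_B * T)                                  # number density, m^-3
v_mean = math.sqrt(8 * k_B * T / (math.pi * m))    # mean molecular speed, m/s
mean_free_path = 1 / (math.sqrt(2) * n * sigma)    # mean free path, m
tau_collision = mean_free_path / v_mean            # mean time between collisions, s

print(f"collision time ~ {tau_collision:.1e} s")   # on the order of 1e-9 s
```

The result is on the order of a nanosecond, comfortably "well under 10⁻⁷ s" and many orders of magnitude shorter than the millisecond-scale radiative lifetimes, which is why collisions win.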
According to the equipartition principle, molecular collisions maintain an equilibrium distribution of molecules in higher vibrational and rotational states. Many molecules occupy those higher-energy states, so even though the lifetime of the excited states is long, over a moderately small stretch of time a large number of molecules will decay by emitting photons. If that radiation escapes without being reabsorbed, the higher-energy states are depopulated and the system is thrown out of thermodynamic equilibrium. Molecular collisions repopulate the states and establish a new thermodynamic equilibrium at a slightly cooler temperature. That is how thermal emission of radiation cools matter in the LTE limit.

Now consider a column of atmosphere sliced into thin horizontal slabs, each of which has matter in LTE. Thermal IR does not significantly scatter off atmospheric molecules or the strongly absorbing materials such as those that make up Earth's water and ice clouds. In the absence of scattering, each direction is decoupled from the others, and the linearity of the electromagnetic interactions means that each frequency can also be considered in isolation. If a radiation flux distribution Iν in a given propagation direction θ impinges on a slab from below, a fraction aν will be absorbed, with aν ≪ 1 by assumption. The slab may be too thin to emit like a blackbody. Without loss of generality, though, one can write the emission in the form eνB(ν,T); here eν ≪ 1 is the emissivity of the slab (see figure 1). Both aν and eν are proportional to the number of absorber–emitter molecules in the slab.

The most fundamental relation underpinning radiative transfer in the LTE limit is Kirchhoff's law, which states that aν = eν. Gustav Kirchhoff first formulated the law as an empirical description of his pioneering experiments on the interaction of radiation with matter, which led directly to the concept of blackbody radiation. It can be derived as a consequence of the second law of thermodynamics by requiring, as Kirchhoff did, that radiative transfer act to relax matter in a closed system toward an isothermal state. If Kirchhoff's law were violated, isolated isothermal matter could spontaneously generate temperature inhomogeneities through interaction with the internal radiation field.
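The claim that "many molecules occupy those higher-energy states" can be made concrete with a Boltzmann-factor estimate. This sketch uses the CO2 ν2 bending-mode band center near 667 cm⁻¹, a standard spectroscopic value assumed here rather than quoted in this passage, and ignores degeneracy and rotational fine structure.

```python
import math

# Boltzmann-factor estimate of the fraction of CO2 molecules thermally
# excited into the nu2 bending vibrational state at T = 250 K.
# The band center (667 cm^-1) is an assumed standard spectroscopic value.
h = 6.62607015e-34     # Planck constant, J s
c = 2.99792458e10      # speed of light in cm/s, to pair with wavenumbers
k_B = 1.380649e-23     # Boltzmann constant, J/K

nu_tilde = 667.0       # CO2 bending-mode wavenumber, cm^-1 (assumed)
T = 250.0              # temperature, K
E = h * c * nu_tilde   # transition energy, J
fraction = math.exp(-E / (k_B * T))   # relative excited-state population

print(f"E/kT = {E / (k_B * T):.2f}, excited fraction ~ {fraction:.3f}")
```

Roughly a couple percent of the CO2 molecules are vibrationally excited at any instant, which is an enormous reservoir of potential emitters in any macroscopic air parcel.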
Given Kirchhoff's law, the change in the flux distribution across a slab is ΔIν = eν[−Iν + B(ν,T)], assuming eν ≪ 1. The radiation decays exponentially with rate eν, but it is resupplied by a source eνB. The stable equilibrium solution to the flux-change iteration is Iν = B(ν,T), which implies that within a sufficiently extensive isothermal region the solution is the Planck function appropriate to a blackbody. The recovery of blackbody radiation in that limit is one of the chief implications of Kirchhoff's law, and it applies separately for each frequency. In the limit of infinitesimal slabs, the iteration reduces to a linear first-order ordinary differential equation for Iν. Or, as illustrated in figure 1, one can sum the contributions from each layer, suitably attenuated by absorption in the intervening layers. The resulting radiative transfer equations entered 20th-century science through the work of Karl Schwarzschild (of black hole fame) and Edward Milne, who were interested in astrophysical applications; Siméon Poisson published a nearly identical formulation of radiative transfer[3] in 1835, but his equations languished for nearly 100 years without application.
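The relaxation toward the Planck function can be demonstrated by iterating the slab relation ΔIν = eν(−Iν + B) directly. In this minimal sketch the values of B, eν, and the incident flux are arbitrary illustrative numbers; it is not the layer-summation computation of figure 1.

```python
# Minimal sketch of the flux-change iteration dI = e*(-I + B) across many
# thin isothermal slabs, showing that the flux relaxes to the Planck value
# B regardless of the incident value. B and e are arbitrary illustrative
# numbers, not values from the text.
B = 1.0          # Planck function B(nu, T) at a fixed frequency, arbitrary units
e = 0.01         # per-slab emissivity e_nu << 1
I = 5.0          # incident flux, deliberately far from B

for _ in range(2000):        # pass the beam through 2000 thin slabs
    I = I + e * (-I + B)     # Kirchhoff: absorptivity equals emissivity

print(f"I after many slabs = {I:.4f}")   # -> approaches B = 1.0
```

The deviation from B shrinks by a factor (1 − eν) per slab, the discrete analogue of the exponential decay noted above, so any initial flux is forgotten after enough optically interacting layers.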