Temperature and Radiative Transfer - more things we know

Ray Pierrehumbert's recent brief but excellent exposition of radiative transfer theory in Physics Today and scienceofdoom's continuing effort to clearly explain atmospheric radiation and the "Greenhouse" effect inspired me to do a little playing around of my own with the underlying theory and equations, to get a better feel for some of the expected behavior and perhaps illuminate another aspect of the problem for a general audience.

As Pierrehumbert pointed out in Physics Today, the reason we have to treat thermal radiation separately from the matter in the atmosphere is that the atmosphere's near transparency partially decouples the two. When radiation interacts strongly with surrounding matter, it acquires the black-body spectrum associated with the local temperature of that matter, but when the interaction is weaker, the flux may be quite different from the black-body flux:

The local radiation field need not be in thermodynamic equilibrium with matter at the local temperature. Nonetheless, the equations predict that the radiation field comes into thermodynamic equilibrium in the limiting case in which it interacts very strongly with the matter. For such blackbody radiation, the distribution of energy flux over frequency is given by a universal expression known as the Planck function B(ν,T), where ν is the frequency and T is the temperature.

And in fact, observations of outgoing radiation from Earth show a decidedly non-black-body shape in the emission spectrum. Here's what that Planck function looks like for several different black-body temperatures:
planck-1000k-300k.png
(source: scienceofdoom)

whereas here is one of the earliest measurements of the actual spectrum leaving Earth:
goody-1989-clear-sky-spectrum2.png
(source: scienceofdoom, from "Atmospheric Radiation", Goody (1989) - this is the spectrum observed for clear skies over the Gulf of Mexico, April 23, 1969. After Conrath et al (1970))

and you can find similar images for many different observations since then. The observations show huge chunks taken out of the Planck function over certain wavelength ranges, particularly around 15 microns. Outgoing radiation from Earth is clearly not in thermal equilibrium with anything at a single temperature - rather, parts of the spectrum look like thermal equilibrium at around 290 K (about 17 degrees C), about what you'd expect at sea level, while other parts of the spectrum correspond to the Planck function for much lower temperatures, down to about 220 K (-53 degrees C).

Thermal radiation from the ground and ocean is pretty much in local equilibrium with those surfaces, i.e. a near perfect black-body spectrum, at say 290 K. So it is the intervening atmosphere that is reducing the flux intensity at particular wavelengths. It does this by both absorbing and emitting thermal radiation at particular wavelengths, according to the characteristics of the molecules in the air. Note especially that atmospheric absorption doesn't pull the radiation flux all the way down to zero - rather it brings it (for those strongly absorbing wavelengths) down to the level of the Planck function for the low temperature associated with high altitudes in the atmosphere.

The essential equation involved is Schwarzschild’s Equation, describing the change in upward (or downward) radiation intensity for a given wavelength through a small "slab" of atmosphere:
radiation-transfer-diagram.png
(source: scienceofdoom)

As Pierrehumbert shows in his article, the equation amounts to: ΔIν = eν[−Iν + B(ν, T)], where eν is the emissivity (and absorptivity) of the thin slab (Iν is the flux intensity, and ν here is frequency, an alternative to wavelength for specifying position in the spectrum). Note scienceofdoom also demonstrates this equation (eq. 12) but with a slightly confusing change of coordinates to an optical depth that increases in the downward direction.
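To make the slab update concrete, here's a minimal R sketch (arbitrary intensity units, with the emissivity and number of slabs chosen purely for illustration) of that relation applied repeatedly to a beam crossing many identical thin slabs:

```r
# One thin-slab Schwarzschild update: delta_I = e_nu * (B - I)
slab_update <- function(I, e_nu, B) I + e_nu * (B - I)

# March a beam of initial intensity 10 through 500 identical thin slabs
# (e_nu = 0.01 each, total optical depth ~ 5) of an isothermal medium whose
# Planck intensity is 2:
I <- 10
for (i in 1:500) I <- slab_update(I, e_nu = 0.01, B = 2)
I  # ~ 2.05: the beam has relaxed most of the way to the slabs' Planck intensity
```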

Setting eν = dτ, an increment in optical depth, makes this a differential equation for the intensity as a function of optical depth. For a region of constant temperature T, B(ν, T) is also constant, and the equation can be solved directly for Iν(τ):

(1) Iν(τ) = B(ν, T) + (I0 - B(ν, T)) e^(-τ)

I.e. the intensity exponentially approaches the Planck intensity (thermal equilibrium of the radiation with matter) with the exponential rate controlled by that optical depth.
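Equation (1) is easy to play with directly; here is the same example as the slab-by-slab march above, now in closed form (same arbitrary units):

```r
# Analytic solution (1) for an isothermal medium
I_of_tau <- function(tau, I0, B) B + (I0 - B) * exp(-tau)

round(I_of_tau(0:5, I0 = 10, B = 2), 3)
# 10.000  4.943  3.083  2.398  2.147  2.054  -- exponential approach to B = 2
```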

As the first image above suggests, you can determine the effective temperature for each wavelength just by inverting the Planck function. The exact form for B(ν, T) is:

(2) B(ν, T) = 2 h ν^3 / (c^2 (e^(hν/kT) - 1))

(note this involves the speed of light c, Planck's constant h, and Boltzmann's constant k, so the formula interestingly links the three physical realms of relativity, quantum mechanics, and thermodynamics). For a given value of ν this is a monotonically increasing function of temperature T, and the Planck function can be inverted easily enough, giving T as a function of intensity with the functional form y = 1/ln(1 + 1/x), where y is the temperature in units of hν/k and x is the intensity in units of 2hν^3/c^2:
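Here's a small R sketch of the Planck function (2) and its inversion to a brightness temperature (SI units, constants rounded):

```r
# Physical constants (SI, rounded)
h       <- 6.626e-34   # Planck's constant, J s
c_light <- 2.998e8     # speed of light, m/s (named to avoid masking R's c())
k       <- 1.381e-23   # Boltzmann's constant, J/K

# Planck function (2): spectral intensity per unit frequency
planck <- function(nu, T) 2 * h * nu^3 / (c_light^2 * (exp(h * nu / (k * T)) - 1))

# Inversion: T = (h*nu/k) / ln(1 + 2*h*nu^3/(c^2*I)), i.e. y = 1/ln(1 + 1/x)
brightness_temp <- function(nu, I) (h * nu / k) / log1p(2 * h * nu^3 / (c_light^2 * I))

nu <- c_light / 15e-6                  # frequency of the 15 micron band
brightness_temp(nu, planck(nu, 290))   # round trip recovers 290 K
```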

TempVsIntensity.png

Equation 1 can then be used to find the effective black-body temperature of radiation passing through an isothermal medium, as a function of optical depth. I.e. the radiation comes in as if at a certain temperature (say, 290 K) but after passing through a constant-temperature absorbing region (at, say, 220 K) the radiation flux adjusts over distance to come closer and closer to the temperature of that medium. How quickly the effective temperature adjusts depends on the wavelength relative to the temperature: at short wavelengths the inferred temperature varies only logarithmically with intensity, so the intensity has to decay through many factors of e before the effective temperature drops appreciably.

TempVsDistance.png
(R code)

The above graph assumes the incoming radiation has a flux corresponding to double the temperature of the medium it is passing through; the wavelength (λ) values are scaled such that 0.2 is close to the peak of the incoming black-body spectrum and 0.4 is close to the peak of the black-body spectrum of the absorbing medium. This shows that for short wavelengths a much larger optical depth is required for the radiation to get close to a level corresponding to the equilibrium temperature of the absorbing medium.
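The full R code for the figure is linked above; a stripped-down sketch of the same sort of calculation (everything dimensionless, with frequencies and temperatures scaled by the temperature of the incoming radiation, so the medium sits at 0.5) would look something like this:

```r
# Dimensionless version: x = h*nu/(k*T_in), temperatures in units of T_in
# (the temperature of the incoming radiation), intensities in arbitrary units.
b_dimless <- function(x, t) x^3 / (exp(x / t) - 1)   # Planck function
t_bright  <- function(x, I) x / log1p(x^3 / I)       # brightness temperature

t_med <- 0.5                  # absorbing medium at half the incoming temperature
tau   <- seq(0, 25, by = 0.1)
plot(NULL, xlim = range(tau), ylim = c(0.45, 1),
     xlab = "optical depth", ylab = "brightness temperature (units of incoming T)")
# scaled wavelength 0.2 ~ peak of the incoming spectrum, 0.4 ~ peak of the medium's
for (lam in c(0.2, 0.4, 0.8)) {
  x <- 1 / lam                # dimensionless frequency for this scaled wavelength
  I <- b_dimless(x, t_med) + (b_dimless(x, 1) - b_dimless(x, t_med)) * exp(-tau)
  lines(tau, t_bright(x, I))
}
```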

This is pretty much what happens when thermal radiation from the ground hits a cloud - the incoming radiation is essentially all absorbed, and the outgoing radiation is at the temperature of the cloud, not the ground. But for the clear atmosphere the situation is a little more complex: in some spectral regions the optical depth of the entire atmosphere is not very large. Even for wavelengths where it is reasonably large (1 or more), the temperature is far from constant over the absorbing/emitting region, so the final outgoing flux corresponds to an average temperature over some height through the atmosphere, not the temperature of any particular small "slab". I hope to take a look at the math involved there in a followup post.
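In the meantime, as a rough numerical preview (reusing the constants and helper functions from the Planck sketch above, and assuming a made-up linear temperature decline with optical depth rather than any realistic profile), marching Schwarzschild's equation up through such a non-isothermal column does give an outgoing brightness temperature intermediate between the surface and the top:

```r
# Illustrative only: integrate dI/dtau = -I + B(nu, T(tau)) upward through a
# column whose temperature falls linearly from 290 K at the bottom to 220 K at
# the top, with total optical depth 3 at this wavelength (made-up numbers).
nu     <- c_light / 15e-6                 # 15 micron band (strongly absorbing)
n      <- 3000
dtau   <- 3 / n
T_prof <- seq(290, 220, length.out = n)   # assumed linear temperature profile
I <- planck(nu, 290)                      # start from surface emission
for (i in 1:n) I <- I + dtau * (planck(nu, T_prof[i]) - I)
brightness_temp(nu, I)  # between 220 K and 290 K, weighted toward the colder upper layers
```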