The measurements had an offset error related to
the infrared radiation flux (a net loss, as the instruments involved emitted more IR than they received from clear skies).
In the thread on Confidence in Radiative Transfer Models, we argued that line-by-line radiative transfer codes and the best band models can accurately simulate clear-sky (no clouds or aerosols) infrared radiation fluxes at the surface, provided that the vertical profiles of atmospheric temperature and trace gas concentrations are specified accurately.
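As a rough illustration of what such a calculation does (this is a toy grey-atmosphere sketch, not a real line-by-line code), one can sum a Schwarzschild-style emission from each model layer, attenuated down to the surface. The temperature profile, absorber distribution, and total optical depth below are assumed values chosen only for illustration.

```python
import numpy as np

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Assumed illustrative profile: 100 layers from the surface to 15 km,
# with a 6.5 K/km lapse rate below a 288 K surface.
z = np.linspace(0.0, 15e3, 101)    # layer edges, m
zmid = 0.5 * (z[:-1] + z[1:])      # layer midpoints, m
T = 288.0 - 6.5e-3 * zmid          # layer temperatures, K

# Assumed grey absorber: total column optical depth 4, distributed
# exponentially with a 2 km scale height (illustrative numbers).
tau_total = 4.0
H = 2e3
dtau = tau_total * (np.exp(-z[:-1] / H) - np.exp(-z[1:] / H)) \
       / (1.0 - np.exp(-z[-1] / H))

# Optical depth from the surface up to each layer midpoint.
tau_below = np.cumsum(dtau) - 0.5 * dtau

# Each layer emits sigma*T^4 * (1 - e^-dtau), attenuated by the absorber
# between the layer and the surface; 1.66 is the usual diffusivity factor.
D = 1.66
emission = SIGMA * T**4 * (1.0 - np.exp(-D * dtau))
flux_down = np.sum(emission * np.exp(-D * tau_below))

print(f"Toy clear-sky downward IR flux at the surface: {flux_down:.0f} W/m^2")
```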
Once the heated layer becomes more than a few centimeters thick, the heat loss of the skin layer by downward conduction stops having any significant effect on the surface temperature: rock is such a good insulator that the conductive heat flux within it is tiny compared with the heat lost by infrared radiation out the top.
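A back-of-the-envelope comparison makes the point; the conductivity, layer thickness, and temperatures below are assumed, illustrative values.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

# Assumed illustrative values for a heated rock layer.
k_rock = 2.0       # thermal conductivity of rock, W m^-1 K^-1
dT = 3.0           # temperature drop across the heated layer, K
d = 0.3            # heated-layer thickness, m (well past a few centimeters)
T_surface = 300.0  # surface skin temperature, K

# Fourier's law: conductive heat flux down through the layer.
F_cond = k_rock * dT / d

# Stefan-Boltzmann law: infrared emission from the surface (emissivity ~ 1).
F_ir = SIGMA * T_surface**4

print(f"Conductive flux: {F_cond:6.1f} W/m^2")   # ~20 W/m^2
print(f"IR emission:     {F_ir:6.1f} W/m^2")     # ~459 W/m^2
```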
In equilibrium these would be balanced by upward transfer of infrared radiation emitted by the surface, by sensible heat flux (warm air carried upward), and by latent heat flux (evaporation, i.e. moisture carried upward).
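To put rough numbers on those three terms, the sketch below uses the Stefan-Boltzmann law for the IR term and standard bulk-aerodynamic formulas for the sensible and latent terms; the wind, humidity, temperatures, and transfer coefficients are all assumed, illustrative values.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

# Assumed illustrative surface and near-surface air conditions.
T_s, T_a = 290.0, 287.0   # surface and air temperature, K
q_s, q_a = 0.012, 0.009   # surface-saturation and air specific humidity, kg/kg
U = 5.0                   # wind speed, m/s
rho = 1.2                 # air density, kg/m^3
c_p = 1004.0              # specific heat of air, J kg^-1 K^-1
L_v = 2.5e6               # latent heat of vaporization, J/kg
C_H = C_E = 1.3e-3        # assumed bulk transfer coefficients

F_ir = SIGMA * T_s**4                             # upward IR emission (emissivity ~ 1)
F_sensible = rho * c_p * C_H * U * (T_s - T_a)    # warm air carried upward
F_latent = rho * L_v * C_E * U * (q_s - q_a)      # moisture carried upward

for name, f in [("IR emission", F_ir),
                ("sensible heat", F_sensible),
                ("latent heat", F_latent)]:
    print(f"{name:14s}: {f:6.1f} W/m^2")
```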
Suppose that most of the radiation that makes up the flux I_up is not visible, but infrared.
Since the intensity of
infrared radiation increases with increasing temperature, one can think of the Earth's temperature as being determined by the
infrared flux needed to balance the absorbed solar
flux.
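A standard worked example shows both points at once: balancing the absorbed solar flux against Stefan-Boltzmann emission gives the Earth's effective emission temperature, and Wien's law confirms that emission at that temperature peaks in the infrared (the solar constant and albedo below are the usual textbook values).

```python
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.898e-3   # Wien displacement constant, m K

S0 = 1361.0    # solar constant, W/m^2
albedo = 0.3   # planetary albedo

# Absorbed solar flux averaged over the sphere (the factor 4 is the ratio
# of the Earth's surface area to its intercepting cross-section).
F_abs = S0 * (1 - albedo) / 4

# Equilibrium: sigma * T^4 = F_abs  =>  T = (F_abs / sigma)^(1/4)
T_eff = (F_abs / SIGMA) ** 0.25

# Wien's law: wavelength of peak emission at that temperature.
lam_peak = WIEN_B / T_eff

print(f"Absorbed solar flux     : {F_abs:.0f} W/m^2")      # ~238 W/m^2
print(f"Effective temperature   : {T_eff:.0f} K")          # ~255 K
print(f"Peak emission wavelength: {lam_peak*1e6:.0f} um")  # ~11 um, infrared
```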