But then there are feedbacks within the stratosphere (water vapor), which would increase the stratospheric heating by upward radiation from below, as well as add some feedback to the downward flux at TRPP, which the upward flux at TRPP would have to respond to via warming below TRPP.
Furthermore, while the increased radiation down from the layer will eventually (in equilibrium) warm the layers below enough to cause a matching increase in radiation upward from below, it is not true that this radiation will all be absorbed by (and warm) the layer.
σT^4 is the upward blackbody radiation (from the Stefan-Boltzmann law) at the surface; "a" is the albedo (reflectivity), so (1 - a) is the fraction of incident solar radiation that is absorbed by the planet.
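The balance these terms imply can be sketched numerically: setting the absorbed solar flux (1 - a)·S/4 equal to the blackbody emission σT^4 gives the effective emission temperature. The solar constant S and albedo a below are illustrative round numbers, not values from the text.

```python
# Sketch of the planetary energy balance implied by the quoted terms:
# absorbed solar (1 - a) * S / 4 balances emitted blackbody flux sigma * T^4.
# S and a are assumed round values for illustration.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # solar constant, W/m^2 (assumed)
a = 0.3                 # planetary albedo (assumed)

absorbed = (1 - a) * S / 4          # averaged over the whole sphere, W/m^2
T_eff = (absorbed / SIGMA) ** 0.25  # effective emission temperature, K

print(f"absorbed = {absorbed:.1f} W/m^2, T_eff = {T_eff:.1f} K")
```

With these assumed inputs the absorbed flux comes out near 238 W/m^2 and the effective temperature near 255 K, which is why 255 K is the usual round figure for the emission temperature in these discussions.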
Given this, it is quite clear that any reduction in the efficiency of upward radiation (by, say, reflecting it right back down again) will have to be compensated for by an increased air/sea (skin) temperature difference, and hence a warmer subsurface temperature.
The increase/decrease of net upward LW flux going from one level to a higher level equals the net cooling/heating of that layer by LW radiation; in equilibrium this must be balanced by solar heating/cooling plus convective/conductive heating/cooling, and those are related to flux variation with height in the same way.
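The relation above can be illustrated with an assumed net-flux profile: the LW cooling of each layer is just the difference of the net upward flux across it. All the interface fluxes below are made-up illustrative values, not numbers from the text.

```python
# The quoted relation: LW radiative cooling of a layer equals the change in
# net upward LW flux across it, F_net(top) - F_net(bottom).
# Interface fluxes below are assumed illustrative values, W/m^2.

f_net = [66.0, 120.0, 180.0, 220.0, 235.0]  # net upward LW flux, surface to TOA

# Cooling of each layer between consecutive interfaces; in equilibrium each
# entry must be balanced by solar plus convective/conductive heating.
layer_cooling = [top - bottom for bottom, top in zip(f_net, f_net[1:])]
print(layer_cooling)  # -> [54.0, 60.0, 40.0, 15.0]
```

Note that the layer coolings sum to the difference between the TOA and surface net fluxes, which is the column-integrated form of the same statement.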
By absorbing radiation from below; but it would radiate both upward and downward, thus making the layer cooler; but if it is optically thick, it could make the lower part warmer.
If it is in an isothermal layer, it will radiate upward as much as downward; it will decrease the baseline TRPP net flux and increase the baseline TOA flux by the same amount, but it will decrease the baseline TOA flux by a greater amount if it is absorbing radiation with a higher brightness temperature from below (the baseline upward flux at TRPP), so it will increase the amount by which the baseline net flux at TRPP is greater than that at TOA.
In equilibrium these would be balanced by upward transfer of infrared radiation emitted by the surface, by sensible heat flux (warm air carried upward), and by latent heat flux (i.e. evaporation: moisture carried upward).
Dynamical upward transport by convection removes excess heat from the surface more efficiently than longwave radiation can in the presence of a humid, optically thick boundary layer, and deposits it in the upper troposphere, where it is more easily radiated to space, thereby affecting the planetary energy balance.
2) If the sole determinants of surface temperature are mass, gravity, and insolation, then what role is played by upward infrared radiation from the surface (UWIR) and downward infrared radiation from the sky (DWIR)?
So we have the surprising logical conclusion that GHGs should actually facilitate atmospheric cooling by permitting upward radiation which could not otherwise occur.
The GHGs cannot absorb any of the upward return of radiation from the ground because they are already at the maximum temperature permitted by the incoming solar radiation.
Instantaneously after adding the extra GHG, the temperature profile of the atmosphere has not yet changed, so the upward radiation, which is sourced by the ground and the lower atmosphere, remains the same within the bounds of the old photosphere.
The strongest upward motion in the model's TTL is generally driven by dynamics instead of radiation, occurring in those TTL cloudy regions that overlap with optically thick clouds in the upper troposphere (UT).
It has to warm if the net radiation there increases by 3.7 W/m², whether it is an increase in downward shortwave or an increase in upward longwave.
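The size of the warming needed to shed an extra 3.7 W/m² can be sketched with a zero-feedback estimate: linearizing σT^4 gives dF ≈ 4σT³·dT, so dT = dF / (4σT³). The 255 K emission temperature below is an assumed round value, and the calculation deliberately ignores all feedbacks.

```python
# Zero-feedback warming required to radiate away an extra 3.7 W/m^2,
# from linearizing sigma * T^4:  dF ~= 4 * sigma * T^3 * dT.
# T = 255 K (effective emission temperature) is an assumed round value.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
dF = 3.7                # radiative forcing from the text, W/m^2
T = 255.0               # assumed emission temperature, K

dT = dF / (4 * SIGMA * T**3)
print(f"dT = {dT:.2f} K")  # -> dT = 0.98 K
```

This is the familiar roughly 1 K no-feedback response; feedbacks (water vapor, clouds, etc.) modify that number, which is what much of the surrounding discussion is about.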
To me the lesson of hot greenhouses, hot attics, or hot metal sheds is that if air is confined and not allowed to convect upward to the sky, it will heat up mostly because convective cooling has been stifled, and less so because of any differential transmission of radiation by glass.
The only possibility I can think of is to suggest that DLR energy adds more to the system than is lost by increased upward evaporation and radiation.
The pioneer was W.H. Dines, who gave the first explicit model including infrared radiation upward and downward from the atmosphere itself, and energy moved up from the Earth's surface into the atmosphere as heat carried by moisture (Dines, 1917); Hunt et al. (1986) give a review.
Without the heat flow from the surface there would be no back radiation, since it is the upward radiation, convection, and heat conduction that are responsible for the heating of the atmosphere, which in turn results in the emission of radiation by the atmosphere.
On top of the ocean heating, we can look at the outgoing radiation from the atmosphere, by satellite, to see that frequencies associated with water vapor and CO2 have reduced upward emissions.
In the ocean the surface layer is cooler because it loses heat by evaporation, radiation upward, and conduction to the air.
Now the sun would be expected to set up an undisturbed gradient from cold at the bottom to warm at the top, but it does not, because upward radiation from the surface plus energy drawn upwards by evaporation at the surface creates a layer 1 mm deep near the surface (the subskin) which is 0.3 °C cooler than the water below it.
If we now add water vapor and/or CO2 to the air, then the temperature of the air layer will become Ta' > Ta due to the absorption of a part of the upward radiation by the vapor and/or CO2, which will reduce the heat losses from the surface to the layer, since we now have h * (T - Ta').
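The bulk-transfer expression in that argument is easy to sketch: warming the overlying air layer from Ta to Ta' shrinks the temperature difference and hence the surface heat loss h·(T − Ta). All of the numbers below (the coefficient h and the three temperatures) are illustrative assumptions, not values from the text.

```python
# Bulk convective heat loss from the quoted expression q = h * (T - Ta).
# Warming the overlying air layer (Ta -> Ta_prime) shrinks the temperature
# difference and hence the loss. All numbers are illustrative assumptions.

h = 10.0          # bulk heat-transfer coefficient, W m^-2 K^-1 (assumed)
T = 290.0         # surface temperature, K (assumed)
Ta = 285.0        # air-layer temperature before adding vapor/CO2, K (assumed)
Ta_prime = 287.0  # warmer air-layer temperature afterwards, K (assumed)

q_before = h * (T - Ta)       # surface heat loss before, W/m^2
q_after = h * (T - Ta_prime)  # reduced surface heat loss after, W/m^2
print(q_before, q_after)  # -> 50.0 30.0
```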
After the initial absorption by the CO2 of the longwave radiation from the earth, most of the re-emitted longwave radiation should be emitted upward into space.
When DLR from a clear sky (either at night or by day) is present, it does not significantly decrease upward radiation in the way that a cloud does, and it increases evaporation by adding energy to the interacting layer (the top 10 microns) while allowing maximum convection rather than suppressing it in the way that a cloud does.
By retaining this top half of the thermal resistance, one would have access to the actual surface temperature, which would be more correct for calculating the upward surface radiation.
On the other hand, upward radiation from the surface of your ocean is determined by the temperature of that ocean.
Clouds make warmer nights because the clouds are usually warmer than the normal air temperature at that altitude, and therefore the surface's rate of loss by radiation upward will be less, leaving you with a warmer than normal night.
wayne said: Clouds make warmer nights because the clouds are usually warmer than the normal air temperature at that altitude, and therefore the surface's rate of loss by radiation upward will be less, leaving you with a warmer than normal night.
So by KT97, if you let the surface be 289 K as stated in TFK09 instead of 288 K, you can take:
   67 W/m² SW absorbed by the atmosphere
+  24 W/m² carried upward by thermals (dry conduction/convection)
+  78 W/m² carried upward by evaporation (convection)
+  66 W/m² actual LW radiation flux upward (radiation)
= 235 W/m² LW upwelling detected by satellites above the TOA
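The budget above is pure addition, so it can be checked directly; the four terms are the ones listed in the text, and they do sum to the 235 W/m² upwelling LW figure.

```python
# Arithmetic check of the quoted KT97/TFK09-style budget: the four
# atmospheric energy inputs listed in the text should sum to the
# 235 W/m^2 LW flux observed at TOA.

terms = {
    "SW absorbed by atmosphere": 67.0,
    "thermals (dry conduction/convection)": 24.0,
    "evaporation (latent heat)": 78.0,
    "net LW flux from surface": 66.0,
}

total = sum(terms.values())
print(f"total = {total:.0f} W/m^2")  # -> total = 235 W/m^2
```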