So, the "back radiation" from the greenhouse gas can only heat the surface (at best) to less than the surface radiating temperature which "warmed" the greenhouse gas.
Yes, the very top of the Uranus atmosphere is at a temperature close to the radiating temperature, which is a little colder than 60 K.
The surface of the Earth radiates as a blackbody at its temperature, which is continually changing because it is being heated by the sun during the day or cooling during the night.
There are a variety of nice graphical touches, including heat haze from the rear of your car and from opposing cars, which realistically creates the sensation of heat radiating from a car running at high temperatures.
CO2 absorbs in the 600–800 cm⁻¹ region, a very important part of the spectrum for planets or moons which radiate at Earth-like temperatures, and so yes, this substantially reduces the outgoing radiation of the planet for a given temperature.
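A rough numerical check of how important that band is: the fraction of a 255 K blackbody's total emission lying between 600 and 800 cm⁻¹ can be computed from the dimensionless Planck integral. The 255 K temperature and the sharp band edges are simplifying assumptions, not precise spectroscopy.

```python
# Fraction of 255 K blackbody emission in the 600-800 cm^-1 CO2 band,
# via f = (15/pi^4) * integral of x^3/(e^x - 1) dx, with x = h*c*nu/(k*T).
import math

C2 = 1.4388          # second radiation constant h*c/k, in cm*K
T = 255.0            # assumed Earth-like radiating temperature, K

def band_fraction(nu1_cm, nu2_cm, T, steps=10_000):
    """Fraction of total blackbody power between two wavenumbers (cm^-1)."""
    x1, x2 = C2 * nu1_cm / T, C2 * nu2_cm / T
    dx = (x2 - x1) / steps
    # Midpoint rule over the dimensionless Planck integrand.
    integral = sum((x1 + (i + 0.5) * dx) ** 3
                   / (math.exp(x1 + (i + 0.5) * dx) - 1) * dx
                   for i in range(steps))
    return 15.0 / math.pi ** 4 * integral

frac = band_fraction(600.0, 800.0, T)
print(f"Fraction of 255 K emission in 600-800 cm^-1: {frac:.2f}")  # roughly 0.2
```

So the band really does sit on a substantial slice (on the order of a fifth) of an Earth-like planet's thermal emission.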
The skin layer of the planet is optically very thin, so it doesn't affect the OLR significantly, but (absent direct solar heating) the little bit of the radiant flux (approximately equal to the OLR) from below that it absorbs must, at equilibrium, be balanced by emission, which will be both downward and upward, so the flux emitted in either direction is only half of what was absorbed from below; via Kirchhoff's law, the temperature must be smaller than the brightness temperature of the OLR (for a grey gas, Tskin^4 ≈ Te^4 / 2, where Te is the effective radiating temperature for the planet, equal to the brightness temperature of the OLR — *** HOWEVER, see below ***).
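Putting a number on that grey-gas relation is a one-liner; the Te = 255 K value is an assumed Earth-like effective temperature, not something the comment states.

```python
# Grey-gas skin temperature: the optically thin layer absorbs a sliver of
# the upwelling flux and must re-emit it equally up and down, giving
# Tskin^4 = Te^4 / 2.
Te = 255.0                     # assumed effective radiating temperature, K
Tskin = Te / 2 ** 0.25         # solve Tskin^4 = Te^4 / 2
print(f"Skin temperature: {Tskin:.1f} K")   # about 214 K
```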
The latter is what we should see on Earth, with its spectral shift — a warmer stratosphere at all levels with increasing CO2, combined with a reduced temperature at which the stratosphere radiates to space.
All the models, not just those of RealClimate, assume that CO2 (and H2O) are in local thermodynamic equilibrium (LTE) and radiate at the kinetic temperature of the air, in which case they would effectively emit all the radiation they absorb.
If it is in an isothermal layer, it will radiate upward as much as downward; it will decrease the baseline TRPP net flux and increase the baseline TOA flux by the same amount, but it will decrease the baseline TOA flux by a greater amount if it is absorbing radiation with a higher brightness temperature from below (the baseline upward flux at TRPP), so it will increase the amount by which the baseline net flux at TRPP exceeds that at TOA.
First off, an idealised "black body" (which gives off radiation in a very uniform and predictable way as a function of temperature — encapsulated in the Stefan-Boltzmann equation) has a basic sensitivity (at Earth's radiating temperature) of about 0.27 °C per W/m².
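That 0.27 figure follows directly from differentiating the Stefan-Boltzmann law, dT/dF = 1/(4σT³), evaluated at the radiating temperature (taken here as 255 K, a standard assumed value):

```python
# Blackbody (no-feedback) sensitivity at Earth's radiating temperature:
# F = sigma * T^4, so dT/dF = 1 / (4 * sigma * T^3).
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
T = 255.0           # assumed radiating temperature, K
sensitivity = 1.0 / (4.0 * SIGMA * T ** 3)
print(f"dT/dF ~ {sensitivity:.3f} K per W/m^2")  # about 0.27
```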
If the Earth absorbs more energy, its temperature rises, which causes it to radiate more energy back into space (Stefan-Boltzmann law) until it reaches equilibrium at a higher temperature.
If adding GHGs reduces the output of radiated energy to space and raises the temperature per the GCMs, then Wien's law, which dictates the energy out, doesn't work.
Atmospheric back radiation in no way reduces the ocean's ability to radiate or conduct its own energy, since the ocean is at a higher temperature and energy state.
Except for: — Your claims that bidirectional EM violates the 2nd law of thermodynamics; — Your sentient detector that receives no energy from the object it is pointed at but radiates energy according to the temperature it is pointed at, allowing you to see beyond the edge of the observable universe (still awaiting the Nobel prize for that one, no doubt); — Your perfectly radiating blackbody that does not radiate according to its temperature; — Your claims that EM energy interferes in a way that prevents energy from a colder body reaching a warmer one — a concept which would mean it would be impossible to see your reflection in a mirror.
The atmosphere high up, which radiates to space the same amount of energy that we receive from the Sun at a brightness temperature of less than −18 °C, is indeed cold.
If the amount of energy received by the Earth from the Sun exceeds the amount the Earth radiates into space, then the only thing the Earth can do is increase its temperature, which in turn will increase the amount of radiation into space.
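That adjustment process can be sketched with a toy zero-dimensional energy balance: step the temperature forward under the imbalance between absorbed sunlight and emitted IR until they match. The solar constant, albedo, and heat capacity below are assumed round numbers, not a tuned climate model.

```python
# Toy energy-balance relaxation: T rises (or falls) until emitted IR
# matches absorbed solar, landing at the effective temperature.
SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
S, ALBEDO = 1361.0, 0.30   # solar constant and albedo (assumed round values)
C = 1.0e8                  # effective heat capacity, J m^-2 K^-1 (assumed)
DT = 86400.0               # time step: one day, in seconds

absorbed = S * (1 - ALBEDO) / 4.0   # absorbed solar per unit surface area
T = 200.0                           # deliberately cold starting guess, K
for _ in range(20_000):             # step forward until equilibrium
    emitted = SIGMA * T ** 4
    T += (absorbed - emitted) * DT / C
print(f"Equilibrium temperature: {T:.1f} K")  # about 255 K
```

Starting too warm instead of too cold converges to the same value, which is the point of the sentence: radiation to space is the thermostat.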
Yet he has ignored the fact that energy from the sun is constantly being added to the system, which does indeed make the Earth radiate at a higher temperature.
But that's actually an understatement by Gallup, since more than 97% of the world's climatologists say that those carbon gases, which are given off by humans' burning of carbon-based fuels, are causing this planet's temperatures to rise over the long term, as those carbon gases accumulate in the atmosphere and block the heat from being radiated back into outer space.
Since the Earth radiates approximately 240 W/m² of energy, which has an effective temperature of 255 K, a value between 255 K and 249 K is often chosen.
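The 240 W/m² and 255 K figures are consistent with each other; inverting the Stefan-Boltzmann law confirms it:

```python
# Invert F = sigma * Te^4 for the effective temperature of a 240 W/m^2 flux.
SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W m^-2 K^-4
flux = 240.0            # W m^-2, Earth's approximate outgoing longwave flux
Te = (flux / SIGMA) ** 0.25
print(f"Effective temperature: {Te:.0f} K")  # 255
```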
CO2 radiation from high in the atmosphere is cold (depending on altitude, latitude, and longitude, say from 0 °C to −60 °C) and cannot radiate net heat back to the Earth, which is at a higher temperature.
Our back porch has a sheet-metal roof that radiates heat down at 30-plus degrees hotter than the ambient temperature, which on a 100-plus-degree day is something to experience.
Because the energy source is symmetrically distributed throughout the shell wall, at some point the external surface of the shell will reach a temperature such that the rate of energy radiated outward by this surface will be equal to the rate at which radioactive decay energy is generated internal to the shell wall.
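A minimal sketch of that balance, with a made-up decay power and shell radius (neither is given in the original): at equilibrium the radiated power σT⁴A equals the generated power P.

```python
# Equilibrium exterior temperature of a shell whose wall generates power P:
# set sigma * T^4 * A = P and solve for T. P and R are illustrative values.
import math

SIGMA = 5.67e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
P = 1.0e6                    # decay power generated in the wall, W (assumed)
R = 10.0                     # shell outer radius, m (assumed)
A = 4.0 * math.pi * R ** 2   # exterior surface area, m^2
T = (P / (SIGMA * A)) ** 0.25
print(f"Equilibrium exterior temperature: {T:.0f} K")
```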
Then that lowest atmosphere layer emits, and a 50-50 split sends half up and half down; the upward half is again absorbed by a higher and now cooler layer, which in turn emits but now at a lower temperature; until finally some much higher and much cooler layer gets to emit radiation that actually escapes to space, and that radiating temperature is the one that must balance the incoming TSI insolation rate.
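The "absorb, split 50-50, re-emit" chain is the standard N-layer grey-atmosphere model; solving the layer balances gives T_k⁴ = k·Te⁴ counting down from the top, with the surface at (N+1)·Te⁴. The Te = 255 K and N = 3 values below are illustrative assumptions:

```python
# N-layer grey atmosphere: topmost layer radiates at Te; each layer below is
# warmer, T_k^4 = k * Te^4, and the surface sits at (N + 1)^(1/4) * Te.
Te = 255.0          # assumed effective radiating temperature, K
N = 3               # number of fully absorbing layers (illustrative)
# k = 1 is the topmost layer; k = N + 1 is the surface.
temps = [k ** 0.25 * Te for k in range(1, N + 2)]
print("Top-to-surface temperatures (K):", [round(t, 1) for t in temps])
```

Only the topmost entry matches the radiating temperature that balances insolation; everything below is warmer, which is the whole mechanism the comment describes.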
Birkeland currents are interesting, although they seem to be a possible correction to direct solar irradiance only at the poles and only in the ionosphere, which is already enormously hot — between 1500 °C and 2500 °C — but so tenuous that you wouldn't feel heat if you stuck your arm out into the near vacuum of the ionosphere; you'd feel intense cooling as your blood started to boil, and ordinary thermometers would radiate heat away faster than they would equilibrate (and hence would read very cold temperatures).
If you were to let this perfect ideal gas radiate, which they all do to some degree, then the gas at the bottom, at higher pressure, would have more emissivity/absorptivity, and that "temperature" ratio would show pressure's influence to be even higher... even without external energy... it's called pressure broadening, and it is guaranteed.
These stem from a diversity of site-specific conditions, including, but not limited to: local vegetation; presence of building structures and contributions made by such structures involving energy use, heating and air conditioning, etc.; exposure to winds, the wind velocities determined by climatic factors and also by whether certain wind directions are favored over others by terrain or by the presence or absence of nearby bodies of water; proximity to grass, asphalt, concrete or other material surfaces; and the physical condition of the CRS itself, which includes: the exact location of the temperature sensors within it, the degree of unimpeded flow of external air through the CRS, the character of the paint used, and the exact height of the instrument above the external surface (noting that when the ground is covered by 3 feet of snow, the temperature instrument is about 60% closer, or less than 2 feet, above an excellent radiating surface, much closer than it would be under snow-free conditions).
The force of gravity acting on molecules in flight establishes a state of thermodynamic equilibrium which, in the absence of radiating molecules, has a temperature gradient equal to the "dry" lapse rate.
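For reference, the "dry" lapse rate mentioned here is g/cp for dry air, which is easy to evaluate:

```python
# Dry adiabatic lapse rate: Gamma = g / cp for dry air.
G = 9.81      # gravitational acceleration, m s^-2
CP = 1004.0   # specific heat of dry air at constant pressure, J kg^-1 K^-1
lapse = G / CP * 1000.0   # convert K/m to K/km
print(f"Dry adiabatic lapse rate: {lapse:.1f} K/km")  # about 9.8
```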
However, it is much easier to figure out what happens when you add more radiative gases to an atmosphere that already has them: and the answer is that it increases the IR opacity of the atmosphere, which raises the altitude of the effective radiating level and hence means the emission is occurring from a lower-temperature layer, leading to a reduction of emission that is eventually remedied by the atmosphere heating up so that radiative balance at the top of the atmosphere is restored.
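A back-of-envelope version of that mechanism: if the lapse rate pins the shape of the temperature profile, raising the effective radiating level by dz warms the surface by roughly lapse × dz once balance is restored. Both numbers below are assumed illustrative values, not model output.

```python
# Raised emission level -> surface warming, with the profile fixed by
# the lapse rate: dT_surface ~ lapse * dz.
lapse = 6.5      # environmental lapse rate, K/km (typical assumed value)
dz = 0.15        # assumed rise of the effective radiating level, km
dT_surface = lapse * dz
print(f"Surface warming: {dT_surface:.2f} K")  # roughly 1 K
```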
The tired old alarmist argument goes something like this: CO2 levels increase, which in turn increases temperature, which in turn means more evaporation, which in turn creates more clouds, trapping the heat and allowing less heat to be radiated off into space.
Quote: "The greenhouse effect theory would have us believe that trace gases in the atmosphere can absorb enough of that immense surface radiative flux to slow it down, which is nonsense, or to radiate enough back to warm the surface to a temperature higher than it is warmed by solar energy."
So it is the temperature higher up in the atmosphere — at an altitude of about 5 km — that determines the rate at which this energy is radiated out into space.
To deal with that, it is proposed that radiatively active gases in the atmosphere cause that uplift in temperature by radiating outgoing energy back to the surface, reducing the rate of cooling, which then causes a surface temperature enhancement.
An aside: one of the reasons that clouds modulate temperature so effectively is not just the albedo increase, which bounces downwelling short-wave radiation back into space, but because they radiate IR back to the surface, thus reducing the net rate of thermal radiative loss.
The trace gases absorb the radiation of the surface and radiate at the temperature of the air, which is, at some height, most of the time slightly lower than that of the surface.
The true situation is that without the greenhouse gas the surface radiates to space, which is at 4 K, while with GHGs it radiates to the GHG at a temperature around 200 K or higher.
Measuring with a spectrometer what is left of the radiation from a broadband infrared source (say, a black body heated to 1000 °C) after crossing the equivalent of some tens or hundreds of meters of air shows that the main CO2 bands (4.3 µm and 15 µm) have been replaced by the emission spectrum of the CO2, which is radiated at the temperature of the trace gas.
And each time I was assured that the increase in temperatures was the result of an actual increase in heat in the climate system due to increased clouds, which prevented more heat from radiating out.
By the way, regarding my comment about the tropopause being too cold to radiate 165 W/m² as required by Trenberth: if it were a black body, which it most certainly is not, then by Stefan's law it would have to be at a temperature of −40.7 °C, yet it is actually at a temperature of about −57 °C, so indeed it is significantly too cold.
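The arithmetic in that comment checks out: a blackbody emitting 165 W/m² must sit near −41 °C, warmer than a tropopause at about −57 °C.

```python
# Temperature a blackbody needs in order to emit 165 W/m^2.
SIGMA = 5.67e-8                    # Stefan-Boltzmann constant, W m^-2 K^-4
flux = 165.0                       # W m^-2, the quoted Trenberth figure
T = (flux / SIGMA) ** 0.25         # required blackbody temperature, K
print(f"Required temperature: {T - 273.15:.1f} C")  # about -41
```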
With one shell, you get a surface which radiates at twice the source radiation (not twice the temperature, but twice the radiation).
The surface is indeed warmer than the mean radiating temperature, but the higher temperatures are maintained by the effect of gravity, which establishes the so-called "lapse rate" as an equilibrium state.
Applying the adiabatic lapse rate from the effective radiating height down to the surface, as per the S-B law, then gives a surface temperature which is some 33 °C higher than it "should" be.
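The arithmetic behind that "33 °C higher" is just the lapse rate walked down from the effective radiating height; the 255 K, 6.5 K/km, and 5 km values below are the usual textbook round numbers, assumed here for illustration:

```python
# Surface temperature from the effective radiating temperature plus the
# lapse-rate descent from the effective radiating height.
Te = 255.0        # effective radiating temperature, K (assumed)
lapse = 6.5       # environmental lapse rate, K/km (assumed)
h = 5.0           # effective radiating height, km (assumed)
T_surface = Te + lapse * h
print(f"Surface: {T_surface:.1f} K, enhancement: {T_surface - Te:.1f} K")
```

The enhancement comes out near 33 K, matching the familiar greenhouse-effect figure.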
It is the temperature of each layer that dictates the rate at which that layer radiates upward, but some process or processes other than simple radiation are causing those layers to vary their temperatures in relation to one another.
However, radiation heat loss does depend on the fourth power of the radiating (sea surface or ice surface) temperature, which will be right at 273–275 K for open water, but which might be as low as −12 to −20 °C at the equinox for ice-covered water exposed directly to the rapidly freezing Arctic air.
Once radiative equilibrium is reestablished, this is a very helpful picture, because we have just shifted higher the altitude from which the earth radiates but have kept the same temperature, which means the surface must be warmer because it is connected by the lapse rate.
In making that latter statement, I am not excluding that the general temperature of the atmosphere (whatever that may be) slows down the heat loss from the surface or affects the height at which energy is radiated into space.
For the 120th time (okay, maybe it's only the 19th), the temperature at the earth's surface is determined by the lapse rate and the level in the atmosphere at which the temperature is constrained, which is the effective radiating level, i.e., the average level from which radiation can successfully escape to space.
Yet I'm being told constantly that this colder atmosphere radiates energy which adds to the heat of the Earth, and that only a tiny extra bit more of CO2 is needed for the whole Earth's temperature to go up several degrees and lead to runaway global warming, because in this there is a net exchange of energy which includes energy flowing from the colder to the hotter.
Because the temperature gradient in a planet's troposphere is the state of thermodynamic equilibrium which the Second Law of Thermodynamics says will evolve, the planet's supported surface temperature is autonomously warmer than its mean radiating temperature; so warm, in fact, on Earth that we need radiating gases (mostly water vapour) to reduce the gradient and thus cool the surface from a mean of about 300 K to about 288 K, this being confirmed by empirical evidence (as in the study in my book) which confirms with statistical significance that water vapour cools rather than warms, all these facts thus debunking the greenhouse conjecture.
In the case of the source of the 333 W/m², this is back-radiated from a height in the atmosphere which is at a temperature lower than the ground surface, and therefore this radiation cannot be "Absorbed by the Surface".