I want to discuss the recent Kaufman study, which purports to reconcile flat temperatures over the last 10-12 years with high-sensitivity warming forecasts.
Another way to estimate climate sensitivity from both models AND observations is to calculate the ratio of observed warming to forecast warming... then multiply that by the ECS value used in the model.
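That ratio method can be sketched in a few lines. This is a minimal illustration, not anyone's published procedure, and the numbers below are placeholders rather than measured values:

```python
def implied_ecs(observed_warming, forecast_warming, model_ecs):
    """Scale a model's ECS by the ratio of observed to forecast warming.

    If a model forecast twice the warming actually observed, the same
    model physics with half the sensitivity would have matched better.
    """
    return model_ecs * (observed_warming / forecast_warming)

# Hypothetical example: a model with an ECS of 3.2 C forecast 0.6 C of
# warming over some period, but only 0.3 C was observed.
print(implied_ecs(0.3, 0.6, 3.2))  # -> 1.6
```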
All the rest of the sensitivity between this 1C and 3C or 5C or whatever the forecast is comes from feedbacks (e.g. hotter weather melts ice, which causes less sunlight to be reflected, which warms the world more).
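The standard way to express this split between direct CO2 warming and feedbacks is the amplification relation total = base / (1 - f), where f is the net feedback fraction. A small sketch, assuming roughly 1 C of no-feedback sensitivity as the text does:

```python
def total_sensitivity(base_sensitivity, feedback_fraction):
    """Amplify a no-feedback sensitivity by net feedback fraction f."""
    return base_sensitivity / (1.0 - feedback_fraction)

def implied_feedback(base_sensitivity, total):
    """Solve the same relation for f, given a total sensitivity."""
    return 1.0 - base_sensitivity / total

# A 3.2 C total sensitivity on a ~1 C base implies f of about 0.69,
# i.e. feedbacks would have to supply most of the forecast warming.
print(implied_feedback(1.0, 3.2))
```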
How much anthropogenic warming, or climate sensitivity, is required to support such a forecast / hindcast?
Now I know that the CAGW hypothesis tells us that CO2 is the primary climate "control knob" with a mean 2xCO2 climate sensitivity of 3.2 C. And, using this hypothesis, IPCC climate models have calculated that it should be warming by 0.2 C per decade (Hansen's 1988 forecast even called for 0.32 C warming per decade).
Again, there almost certainly is a warming trend since 1850, and some of that trend is probably due to manmade CO2, but sensitivities in most forecasts that get attention in the media are way too high.
If the sensitivity is greater than 3 C, how much warming might we have expected to see since the last IPCC forecasts in 2000?
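The question above is back-of-envelope arithmetic. Assuming, as the text does, a model-forecast trend of about 0.2 C per decade, a quick check of what should have accumulated over a given stretch:

```python
def expected_warming(rate_per_decade, years):
    """Cumulative warming implied by a linear decadal trend."""
    return rate_per_decade * years / 10.0

# Over the ~12 "flat" years mentioned at the top, a 0.2 C/decade
# forecast implies roughly 0.24 C of warming should have shown up.
print(expected_warming(0.2, 12))
```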