1979-2014 has a trend of 0.0122... I think it would be interesting to do an uncertainty analysis and see if the uncertainty around 0.0122 encompasses 0.0168 (and I'm guessing it does).
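The check being proposed here can be sketched directly: fit an ordinary least-squares trend, take the covariance of the fitted coefficients, and see whether the 2-sigma interval around one trend estimate encompasses the other. The series below is purely synthetic (white noise around a 0.0122/yr trend, an assumed noise level of 0.1), standing in for whatever data the commenter has in mind.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration only: a yearly series (1979-2014) with a known
# trend of 0.0122 units/yr plus white noise standing in for real data.
years = np.arange(1979, 2015)
series = 0.0122 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Fit an OLS trend and get the covariance matrix of the coefficients.
coeffs, cov = np.polyfit(years, series, deg=1, cov=True)
slope = coeffs[0]
slope_se = np.sqrt(cov[0, 0])

# Does the ~95% (2-sigma) interval around the fitted trend encompass
# a second trend estimate, e.g. 0.0168?
lo, hi = slope - 2 * slope_se, slope + 2 * slope_se
print(f"trend = {slope:.4f} +/- {2 * slope_se:.4f} (2-sigma)")
print("encompasses 0.0168:", lo <= 0.0168 <= hi)
```

Whether the interval actually covers 0.0168 depends on the noise level assumed above; the point is only the mechanics of the comparison.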
Nic, I am fairly sure that F&G did not do an uncertainty analysis of the mathematical issues with the surface statistical models, such as HadCru.
Don't individual studies do uncertainty analysis?
The fact is that while your executives and managers may express interest in more data (more metrics, more dimensions, raw data access, more chart choices, etc.), this is an indicator of uncertainty, not of an interest to do more robust analysis.
The authors note that cost-benefit analyses of sustainable land management scenarios "can be done even with limited data availability," and underscore that, despite an inevitable degree of uncertainty, "it is imperative to take action now, as every day sees the loss of more productive land that will have to be gained back."
Humans don't deal well with either uncertainty or ambiguity, notes James Hammitt, director of the Harvard Center for Risk Analysis.
David Fahey, an atmospheric scientist at the National Oceanic and Atmospheric Administration in Boulder, Colorado, said that the researchers will need to do additional analyses to reduce the "significant uncertainties associated with the role of black carbon in the climate."
How does the lack of uncertainty analyses affect the calculation of risk?
While the Strengthening Forensic Science panel included two statisticians, the National Academies' America's Climate Choices panels did not include a single statistician, despite the many data, data analysis, uncertainty, and decision-making issues.
In addition, model intercomparison studies do not quantify the range of uncertainty associated with a specific aerosol process, nor does this type of uncertainty analysis provide much information on which aerosol process needs improving the most.
There are limitations in using a Monte Carlo simulation, including that the analysis is only as good as its assumptions, and that despite modeling a range of future uncertainties, it does not eliminate uncertainty.
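Both limitations can be seen in a minimal sketch: the spread a Monte Carlo run reports is entirely driven by the input distributions you assume, and changing those assumptions changes the answer. The toy cost model and every distribution parameter below are invented for illustration, not taken from any real analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical toy model: cost = price * quantity, with assumed input
# distributions. The output spread is entirely a product of these choices.
price = rng.normal(10.0, 2.0, N)       # assumed mean and sd
quantity = rng.lognormal(3.0, 0.5, N)  # assumed distributional shape

cost = price * quantity

# Monte Carlo characterizes the resulting spread...
p5, p50, p95 = np.percentile(cost, [5, 50, 95])
width1 = p95 - p5
print(f"median = {p50:.0f}, 90% interval width = {width1:.0f}")

# ...but a different (equally defensible) input assumption yields a
# different spread: uncertainty is quantified, not eliminated.
price2 = rng.normal(10.0, 4.0, N)  # wider assumed price uncertainty
q5, q95 = np.percentile(price2 * quantity, [5, 95])
width2 = q95 - q5
print("wider assumptions give a wider interval:", width2 > width1)
```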
If you want to do a more precise analysis, fine — you'd need to properly include the uncertainty ranges, and you would come to the same conclusion as me: as far as one can tell within uncertainty, the non-CO2 anthropogenic forcings approximately balance.
But one can see in this graph in an instant that while the ice MIGHT do different things, currently its trajectory is towards rapid collapse, and one senses immediately that Schweiger might still be in the middle of his unendingly bland sentence: "this analysis will change the predicted timing of the 'ice-free summer' but large uncertainties will likely remain..."
In your blog commentaries you claim absolute attribution and do not mention these uncertainties in your analysis.
I strongly suspect (and I admit to not having done this analysis — this is just my opinion "by eye") that the records are indistinguishable within their one-standard-deviation uncertainties in the 19th century.
The whole POINT of doing what-if analysis is our uncertainty about what certain facts will be.
If she accepts that attribution is amenable to quantitative analysis using some kind of model (it doesn't have to be a GCM), I don't get why she doesn't accept that the numbers are going to be different for different time periods and have varying degrees of uncertainty, depending on how good the forcing data is and what other factors can be brought in.
Uncertainty is not an obstacle the way I see it, and we can include it in planning — that is what is done in all risk analysis.
It has several major advantages over PCA, including that it doesn't produce negative (non-real) results and that you can incorporate uncertainty into the analysis, so you can limit the significance of low-level or missing data.
Other than anecdotal analyses, little has been done to quantitatively assess the uncertainties.
The main conclusion of this analysis is that sea level uncertainty is not smaller now than it was at the time of the TAR, and that quoting the 18-59 cm range of sea level rise, as many media articles have done, is not telling the full story.
Do you feed this type of information into your uncertainty analysis?
Those who like the idea of having their skepticism subjected to a "more nuanced analysis" and granted "valorisation of the scientific norm of scepticism" (do we get a percentage of "uncertainty"?)
Results do not address all sources of uncertainty, but their scale and scope highlight one component of the potential health risks of unmitigated climate change impacts on extreme temperatures, and draw attention to the need to continue to refine analytical tools and methods for this type of analysis.
Older analyses (e.g., Tett et al., 2002) did not take account of uncertainty due to sampling signal estimates from finite-member ensembles.
We can apply my simpler bias analysis (which we can now see is limited in that it does not provide an uncertainty estimate for the estimated bias) to HadCRUT3/4.
What is needed instead is for economists to step up and do the analyses of the costs and benefits of GHG emissions and of proposed policies — including stating the uncertainties on their results.
The other thing is that SST and SAT have different variances and different uncertainties, and they respond with different lags; so UNLESS Vaugh does some work with synthetic data FIRST to prove that the methods he applies to this data actually work, I'd say the signal analysis is flawed from the start, since the "signal", the temperature curves, are not really physical metrics.
In other words, the analysis neglects structural uncertainty about the adequacy of the assumed linear model, and the parameter uncertainty the analysis does take into account is strongly reduced by models that are "bad" by this model-data mismatch metric.
Unwillingness to combine the evidence in this way might be justified by the difficulties of estimating the full range of uncertainties of each analysis, but if the likelihood curves are taken seriously, combining all independent evidence is a natural procedure that should be done.
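The "natural procedure" for independent evidence is a pointwise product of the likelihood curves over the shared parameter. A minimal sketch, with entirely made-up Gaussian curves over a sensitivity parameter S (not results from any real study):

```python
import numpy as np

# Parameter grid: climate sensitivity S (illustrative range and units).
S = np.linspace(0.5, 8.0, 1000)

def gaussian_likelihood(center, width):
    """Unnormalized Gaussian likelihood curve over the S grid."""
    return np.exp(-0.5 * ((S - center) / width) ** 2)

# Three hypothetical independent lines of evidence (invented numbers).
L1 = gaussian_likelihood(3.0, 1.5)
L2 = gaussian_likelihood(2.5, 1.0)
L3 = gaussian_likelihood(3.5, 2.0)

# Independence means the joint likelihood is the pointwise product.
combined = L1 * L2 * L3
combined /= combined.sum() * (S[1] - S[0])  # normalize over the grid

# The combined curve peaks at the precision-weighted mean of the inputs,
# and is narrower than any single curve: independent evidence tightens it.
best = S[np.argmax(combined)]
print(f"combined best estimate ~ {best:.2f}")
```

For Gaussian curves this reproduces the textbook result that precisions (inverse variances) add under combination, which is why the joint curve is tighter than each input.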
As I will discuss in Parts II and III of the Decision Making Under Climate Uncertainty series (I will get back to that soon, I hope), there are a lot of other types of studies and analyses that climate scientists might be doing to support decision making, from which the current focus of the IPCC is arguably distracting.
How does your analysis allow you to distinguish between a climate sensitivity to changes in CO2-effected radiative forcing of 0 K/(W m^-2) and, say, 0.3 K/(W m^-2), if there are these large uncertainties in the values of the forcings?
Are you saying that the scientific community, through the IPCC, is asking the world to restructure its entire mode of producing and consuming energy and yet hasn't done a scientific uncertainty analysis?
Too bad Annan didn't understand my talk, since it was targeted particularly at people like him who are pushing the idea that CO2 sensitivity is 3C (http://www.jamstec.go.jp/frsgc/research/d5/jdannan/probrevised.pdf) and who think that Bayesian analysis can actually provide such an answer in the face of such large uncertainty.
Smith et al. (I think) did this in their SST uncertainty analysis.
Note too that when the resulting cost-benefit analyses are done, to support Federal actions, the grand uncertainties become irrelevant.
I do not need a "robust analysis of uncertainty" to conclude that the accepted trends are calculated from garbage data, and can have no possible result other than to produce a much higher trend than an analysis that properly accounted for these factors.
On his blog, tamino does the statistical analysis of the BEST data and finds that because the timeframe in question is so short, the uncertainty is too large to say for certain that the short-term trend in question is any different from the long-term trend.
But I do know the difference between a simple linear interpolation and principal component analysis, and I can calculate the two-standard-deviation range of uncertainty on a white-noise linear trend.
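Under the white-noise assumption this calculation has a closed form: for OLS, the slope's standard error is the residual standard deviation divided by sqrt(sum((x - xbar)^2)), and the quoted range is twice that. A small sketch, checked against a synthetic series with a known trend (all numbers here are invented for the demonstration):

```python
import numpy as np

def trend_2sigma(x, y):
    """OLS trend and its two-standard-deviation uncertainty,
    assuming independent (white) residual noise."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    # Residual std with 2 degrees of freedom used by the fit.
    sigma = np.sqrt(resid @ resid / (len(x) - 2))
    se = sigma / np.sqrt(((x - x.mean()) ** 2).sum())
    return slope, 2 * se

# Synthetic check: white noise around a known trend of 0.02 per step.
rng = np.random.default_rng(1)
x = np.arange(100)
y = 0.02 * x + rng.normal(0.0, 0.5, x.size)
slope, two_sigma = trend_2sigma(x, y)
print(f"trend = {slope:.4f} +/- {two_sigma:.4f} (2 sigma)")
```

If the residuals are autocorrelated rather than white, this range understates the real uncertainty — which is exactly why the white-noise caveat matters.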
Even just acknowledging more openly the incredible magnitude of the deep structural uncertainties that are involved in climate-change analysis — and explaining better to policymakers that the artificial crispness conveyed by conventional IAM-based CBAs [Integrated Assessment Model Cost-Benefit Analyses] here is especially and unusually misleading compared with more ordinary non-climate-change CBA situations — might go a long way toward elevating the level of public discourse concerning what to do about global warming.
Since we cannot reduce the uncertainty in these four key inputs, it seems we cannot do much to reduce the uncertainty in the cost-benefit analyses.
It is arguably time to tackle the tropospheric humidity issue, but this should be done from the perspective of comparing multiple data sources and assessing the uncertainty, before publishing trend analyses in the context of saying something about climate change.
However, at this point, no one has done a rigorous error or uncertainty analysis on the data, so Landsea's statements about the trends are not supported by any rigorous analysis.
Structural uncertainties arise from an incomplete understanding of the processes that control particular values or results, for example, when the conceptual framework or model used for analysis does not include all the relevant processes or relationships.
Personally I agree with Smith's interpretation: we have not done the full analysis of model uncertainty.
Given the damage that the wrong intervention can do, the proper response to uncertainty isn't *inaction*, but *further analysis and data gathering*.