Not exact matches
Re 516: I was under the following impression: GCMs use physics only and are adjusted for better understanding of physical phenomena or inclusion of new physical parameters; temps are used as an indicator of how well the model performs.
Even though the dynamics of a model are given by basic laws of physics, you will still have quite a few parameters that you need to specify, both for the initial conditions at the time where you start running the model and also to specify that it is actually this planet you are simulating.
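To make that concrete, here is a minimal sketch (invented for illustration, nothing like a real GCM) of dynamics fixed by physics that still cannot run until parameters and an initial condition are supplied:

```python
# Toy zero-dimensional energy-balance model (illustrative only; not any real GCM).
# The dynamics come from basic physics (energy conservation, Stefan-Boltzmann emission),
# but nothing runs until the parameters and the initial condition are specified.
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4 (physical law)
S0 = 1361.0            # solar constant, W m^-2 -- says which planet we are simulating
ALBEDO = 0.30          # planetary albedo: a specified parameter
EMISSIVITY = 0.61      # effective emissivity: a crude stand-in for the greenhouse effect
HEAT_CAP = 4.0e8       # effective heat capacity, J m^-2 K^-1 (roughly an ocean mixed layer)

def step(T, dt=86400.0):
    """Advance global-mean surface temperature T (kelvin) by dt seconds."""
    absorbed = S0 / 4.0 * (1.0 - ALBEDO)
    emitted = EMISSIVITY * SIGMA * T ** 4
    return T + dt * (absorbed - emitted) / HEAT_CAP

T = 270.0                        # initial condition: also has to be chosen
for _ in range(365 * 50):        # fifty years of daily steps
    T = step(T)
print(f"equilibrated temperature: {T:.1f} K")    # ~288 K with these choices
```

The Stefan-Boltzmann law is not negotiable, but the albedo, emissivity, heat capacity and starting temperature all have to be chosen, and different choices describe a different planet.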
Physics perturbation merely shows the limits of model variability, given a range of parameter uncertainty.
"Perturbed physics" means that model parameters are varied across their range of physical uncertainty.
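A minimal sketch of that idea, reusing the toy energy-balance model above; the parameter ranges below are invented for illustration, not assessed uncertainties:

```python
import random

random.seed(0)
SIGMA = 5.67e-8
S0 = 1361.0

def equilibrium_T(albedo, emissivity):
    """Equilibrium temperature of the toy energy-balance model sketched above."""
    return ((S0 / 4.0) * (1.0 - albedo) / (emissivity * SIGMA)) ** 0.25

# Perturbed-parameter ensemble: every member solves the same equations, but each
# draws the uncertain parameters from their (here invented) uncertainty ranges.
members = [
    equilibrium_T(albedo=random.uniform(0.28, 0.32),
                  emissivity=random.uniform(0.58, 0.64))
    for _ in range(100)
]
print(f"ensemble spread: {min(members):.1f} K to {max(members):.1f} K")
```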
As Sorokhtin et al. (2007) mention, until recently a sound theory using laws of physics for the greenhouse effect was lacking, and all numerical calculations and predictions were based on intuitive models using numerous poorly defined parameters.
Part of the process involves adjusting model parameters within limits dictated by observations and the principles of physics so as to coax the simulations into good agreement with the real-world climate.
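A toy version of that tuning step, again with the illustrative energy-balance model; the allowed emissivity range and the 288 K target are stand-ins for "limits dictated by observations and physics":

```python
SIGMA = 5.67e-8
S0 = 1361.0

def equilibrium_T(emissivity, albedo=0.30):
    return ((S0 / 4.0) * (1.0 - albedo) / (emissivity * SIGMA)) ** 0.25

OBSERVED_T = 288.0                                    # the observation the tuning targets
candidates = [0.55 + 0.0005 * i for i in range(301)]  # 0.55..0.70: allowed range (invented here)
best = min(candidates, key=lambda e: abs(equilibrium_T(e) - OBSERVED_T))
print(f"tuned emissivity: {best:.4f} -> T = {equilibrium_T(best):.1f} K")
```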
This connection is not an emergent property of the model's physics, since we don't really know enough about the H2O cycle to model it; instead, this feedback connection is one of the many "parameters" in the model that are adjusted to attempt to match the prior data.
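The standard feedback-gain algebra makes the point: if the feedback strength f is a free knob, it can simply be set to reproduce a chosen warming. The numbers below are approximate textbook values, used only for illustration:

```python
# Illustrative only: the usual feedback-gain form, with the feedback strength f
# treated as an adjustable knob rather than an emergent result of the physics.
dT_no_feedback = 1.1          # K, approximate Planck-only response to doubled CO2

def response(f):
    """Amplified response dT = dT0 / (1 - f) for feedback strength f < 1."""
    return dT_no_feedback / (1.0 - f)

# Pick f so the model reproduces a target warming of 3 K (the "tuning").
target = 3.0
f_tuned = 1.0 - dT_no_feedback / target
print(f"f = {f_tuned:.2f} gives dT = {response(f_tuned):.1f} K")
```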
The egregious and misleading stuff from the warmists IMO consists of a) overstating the quality of the physics in their models and the confidence we should have that they are correct; b) treating the ad hoc parameter of "feedback" or "sensitivity" as something they can set on heuristic grounds, and then optimizing the other parameters of their models around it.
In these models there are dozens and dozens of assignable parameters, because you don't really know the physics well enough to write equations.
The Blogosphere is full of fake skeptics who think they have a good model just because they can get an arbitrary series of equations (usually "cycles") with arbitrary fitting of parameters, all while ignoring the known physics.
The difference between the two is [that] my model directly fits known physics initially, and has excellent explanatory power without using a sinusoid at all, in an effectively one-parameter fit across the entire range of the data.
However, like all statistical models that do not reflect the real underlying physics of a situation, assuming a form of climate sensitivity (a constant sensitivity parameter, for instance) is simply an assumption that may or may not be useful.
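A one-line worked example of that assumption: a fixed sensitivity parameter converts any forcing into a warming, whether or not the constancy is justified. The values are commonly quoted round numbers, not results from this discussion:

```python
# The "constant sensitivity parameter" assumption: dT = lambda_ * dF,
# with lambda_ held fixed regardless of climate state.
lambda_ = 0.8          # K per (W m^-2) -- an assumed, constant sensitivity
dF_2xCO2 = 3.7         # W m^-2, commonly quoted forcing for doubled CO2
print(f"implied warming: {lambda_ * dF_2xCO2:.1f} K")   # ~3.0 K
```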
Here we extend the evaluation to those variables and analyse several ensembles: two multi-model ensembles (MMEs) from CMIP3 and four structurally different single-model ensembles (SMEs, sometimes also referred to as perturbed-physics or perturbed-parameter ensembles) with different ranges of climate sensitivity.
1) Statistical models: parameters are allowed to vary until a best fit to the data is found. 2) Dynamical models: parameters are fixed by the best available science (physics, etc.) regardless of how this makes the model fit the data.
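A small sketch of that contrast, with synthetic data and an invented "theoretical" slope standing in for the best available science:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)
obs = 0.02 * t + rng.normal(0.0, 0.1, t.size)     # synthetic "observations"

# 1) Statistical model: the slope is allowed to vary until it best fits the data.
slope_fit = np.sum(t * obs) / np.sum(t * t)       # least-squares slope through the origin

# 2) "Dynamical" model: the slope is fixed beforehand by theory (value invented here),
#    however well or badly that then fits the data.
slope_theory = 0.015

for name, slope in [("statistical fit", slope_fit), ("fixed by theory", slope_theory)]:
    rmse = np.sqrt(np.mean((slope * t - obs) ** 2))
    print(f"{name}: slope = {slope:.4f} per step, rmse = {rmse:.3f}")
```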
There is a massive amount of calculation effort which goes into creating the detail from "basic physics" models, when the big picture depends heavily on assumed "guestimates" of poorly constrained parameters.
There's a ton of physics in the models, a lot of it correct, with parameters estimated with high precision.