When that cooling is subtracted, the long-term warming effect is reduced to 0.09 °C (0.16 °F) per decade, well below computer model estimates of how much global warming should have occurred.
Computer model estimates of the "human influence" fingerprint are broadly similar to the observed pattern.
Computer models estimate that 186 whales and 155 dolphins will be killed in the war games, with millions of other marine animals being injured.
Deloitte Access Economics (DAE) was commissioned by Tabcorp to model public benefits of cost savings they anticipated from the merger, using DAE's Regional General Equilibrium computable general equilibrium model (CGE model) to estimate "broader and long-term economy-wide benefits associated with the merger" (para 514).
The team also used a separate computer model developed by the Federal Emergency Management Agency to estimate the costs: more than $250 million in building and crop damage in the flooded area.
That is 19 °C warmer than temperatures today, more than computer models had estimated (Geology, DOI: 10.1130/g30815.1).
The computer models used to generate the maps also estimate where, how often and how strongly ground shaking from an earthquake could occur, so that residents, engineers and city planners can see the likelihood that their community will experience a damaging earthquake over the next year.
They also used a physically based computer model of the hydrologic cycle, which takes daily weather observations and computes the snow accumulation, melting, and runoff to estimate the total snowpack in the western U.S.
Previous estimates, Schultz said, were based solely on computer models and yielded a size estimate of only about 50 miles in diameter.
ETH researchers have now shown that the high estimated mutation rates at the start of the epidemic were due to the limited number of virus samples at the time, in combination with the computer models used, which calculate the estimates using genetic data from virus samples and from underlying assumptions.
When the team combined OCO-2 data from selected passes over certain power plants in the United States with computer models of how emissions plumes would disperse, its estimates of those plants' emissions fell within 17% of the actual amounts those facilities reported for those days, the researchers report this week in Geophysical Research Letters.
Using data from several sources on 162 terrestrial animals and plants unique (endemic) to the Albertine Rift, the researchers used ecological niche modeling (computer models) to determine the extent of habitat already lost due to agriculture, and to estimate the future loss of habitat as a result of climate change.
By reconstructing past global warming and the carbon cycle on Earth 56 million years ago, researchers from the Niels Bohr Institute, among others, have used computer modelling to estimate the potential extent of future global warming, which could be even warmer than previously thought.
Using a computer model, they also estimated the ambient noise naturally present at each site.
The image was produced using a computer model which estimates ozone levels from actual meteorological data.
Other researchers have used computer models to estimate what an event similar to a Maunder Minimum, if it were to occur in coming decades, might mean for our current climate, which is now rapidly warming.
To get a fuller picture, Vivek Arora of Environment Canada and the University of Victoria, British Columbia, and Alvaro Montenegro of St Francis Xavier University in Antigonish, Nova Scotia, Canada, used a computer model to estimate the overall effect of reforesting.
Dr Philip Cox, of the Centre for Anatomical and Human Sciences, a joint research centre of the University's Department of Archaeology and HYMS, used computer modelling to estimate how powerful the bite of Josephoartigasia could be.
In his new paper, Lovejoy applies the same approach to the 15-year period after 1998, during which globally averaged temperatures remained high by historical standards but were somewhat below most predictions generated by the complex computer models used by scientists to estimate the effects of greenhouse-gas emissions.
"If in the Neoarchean period 97% of the Earth's surface had been, as estimated from computer models, covered by water, these geochemical signals would not have been found for Neoarchean seawater," adds Dr. Hoffmann.
A computer model uses this data to estimate neurotransmitter reuptake across the brain.
A computer model based on these figures was used to estimate the space rock's orbital path.
Rather than using complex computer models to estimate the effects of greenhouse-gas emissions, Lovejoy examines historical data to assess the competing hypothesis: that warming over the past century is due to natural long-term variations in temperature.
Using a computer model that fused air pollution and atmospheric chemistry data, they estimated what annual average levels of ozone (a key smog ingredient) and fine particulates smaller than 2.5 microns (PM2.5) were in 2010 within 100-km-by-100-km grid squares across the world.
While his new study makes no use of the huge computer models commonly used by scientists to estimate the magnitude of future climate change, Lovejoy's findings effectively complement those of the Intergovernmental Panel on Climate Change (IPCC), he says.
"I was surprised the computer models did as good of a job as they did at predicting the changes that we estimated."
Computer models can give a good estimate of mantle flow and crustal uplift, he said, and GNET's mission is to make those models better by providing direct observations of present-day crustal motion.
The "pruning module" (robot, cameras, computer and cutters) will be mounted on an over-the-row harvester, says Gunkel, who estimates that the first models could be in vineyards within five years.
The measurements are then run through a computer model that uses the data to estimate the potential worst-case scenario regarding "transuranic" activity in the area.
But the US Department of Energy, whose research facilities sustained an estimated $1 million of damage in the earthquake, concluded that it actually enhanced the site's suitability, because seismologists were able to verify the computer models of the seismological stability of the mountain and its environs that they had generated from historical data.
International institutions such as CGIAR have developed computer models that use data on climate, crop types, and other factors to estimate current food production and forecast future trends.
Scientists use the available data and complex computer modeling to fill in the gaps and estimate the burden of, say, tuberculosis in Peru or high blood pressure in Italy.
But the computer models that they and others use have become so complex that it is difficult for outsiders to test and validate the estimates.
With that data, they will build computer simulations, or models, to estimate the condition of the Red Planet's atmosphere billions of years ago.
Previously, astronomers estimated Jupiter's age with computer models.
The computer model's estimates were checked against known demographics and voting habits, with much success.
To find these numbers, Dr. Lee and his colleagues developed a computer model to represent the U.S. adult population, and estimated lifetime health effects for people who were obese, overweight, or healthy weight at ages 20 through 80.
Estimates from regressions with detailed controls, nearest-neighbor models, and propensity score models all indicate large, positive, and statistically significant relationships between computer ownership and earnings and employment, in sharp contrast to the null effects of our experiment.
The study, combining ground and aerial sampling of the gas with computer modeling, is the most comprehensive "top down" look so far at methane levels over the United States, providing a vital check on "bottom up" approaches, which have tallied estimates for releases from a host of sources, ranging from livestock operations to gas wells.
To make sure I've understood your 2006 article correctly, Dr. Hansen's 10 years to a tipping point is an educated (very educated) estimate of how long we have to stop increasing CO2 ppm (to prevent it going over the dangerous 400 ppm level), as opposed to the result of calculations from a computer model.
While you might argue that the best quantitative and detailed estimates come from computer models, the physics of the greenhouse effect are very basic indeed.
They might not make accurate estimates of economies of scale of solar and wind collection devices in the SimCity computer model, or fully account for incidental costs of burning coal.
In their research, team members used advanced computer models of the climate system to estimate changes in the tropopause height that likely result from anthropogenic effects.
Using a computer model of wave-induced current stresses, the team estimated how powerful currents would need to be for the forces they exert at the sea floor to exceed a "critical force" that triggers sediment suspension and could lead to underwater mudslides.
But scientists were caught out: in one computer model comparison, 111 of 114 estimates overstated recent temperature rises.
Even under optimistic assumptions about computer performance continuing to increase exponentially, we estimate that climate models resolving low clouds globally will not be available before the 2060s.
Using computer climate models, scientists estimate that by the year 2100 the average global temperature will increase by 1.4 to 5.8 degrees Celsius (approximately 2.5 to 10.5 degrees Fahrenheit).
A new international study is the first to use a high-resolution, large-scale computer model to estimate how much ice the West Antarctic Ice Sheet could lose over the next couple of centuries, and how much that could add to sea-level rise.
So, they didn't actually simulate sea level changes, but instead estimated how much sea level rise they would expect from man-made global warming, and then used computer model predictions of temperature changes to predict that sea levels will have risen by 0.8-2 metres by 2100.
The study will use a combination of complex computer models to replicate past weather patterns in the Atlantic Ocean, Caribbean Sea and Gulf, and use the results, along with estimates of future production of man-made greenhouse gases like carbon dioxide and methane, to predict Gulf hurricane activity.