However, Singleton said he has tested his model against historical data from the crisis year, yielding a 16 percent return against a 38 percent slump in the S&P index.
Telenav, the Silicon Valley company looking to bring pop-up ads to your infotainment screen, has been testing a "freemium" model borrowed from streaming music services to entice drivers to share their data.
In essence, rule-based online trading platforms provide tools and tutorials for people to write algorithms in Web browsers and test their models with years of historical data.
The machine-learning-based model by Medial EarlySign used data found in EHRs, such as laboratory test results, demographics, medications, and diagnostic codes, to predict a patient's risk of experiencing renal dysfunction.
The model completely breaks down when tested over a more extensive historical data set.
Instead of spending their days building financial models, the Investment Masters read, think, focus on qualitative data, and test ideas.
You will see that I have loaded all of the above keyword data into the model and plugged in some test conversion data.
As such, the company announced a dedicated data sales team, dubbed AMCN Agility, led by Adam Gaynor, who had previously served as the company's VP of advertising and data solutions sales... This comes as AMC and other TV companies like A&E and Discovery are testing a new attribution model that seeks to prove commercials drive business results.
Individual growth curve models were developed for multilevel analysis and specifically designed for exploring longitudinal data on individual changes over time.23 Using this approach, we applied the MIXED procedure in SAS (SAS Institute) to account for the random effects of repeated measurements.24 To specify the correct model for our individual growth curves, we compared a series of MIXED models by evaluating the difference in deviance between nested models.23 Both fixed quadratic and cubic MIXED models fit our data well, but we selected the fixed quadratic MIXED model because the addition of a cubic time term was not statistically significant based on a log-likelihood ratio test.
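The nested-model comparison described above can be sketched outside of SAS. Below is a minimal Python illustration (not the MIXED procedure itself: plain least-squares fits on synthetic data, with all variable names and numbers hypothetical) of choosing between quadratic and cubic time trends by the change in deviance:

```python
import numpy as np

def deviance(y, yhat):
    # For Gaussian errors, -2 log-likelihood is n * log(RSS / n) up to a
    # constant; differences between nested fits give the LRT statistic.
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 80)
# Synthetic growth curve: truly quadratic in time, plus noise.
y = 1.0 + 0.5 * t - 0.04 * t ** 2 + rng.normal(0, 0.3, t.size)

fits = {}
for degree in (2, 3):  # quadratic vs cubic time trend
    coeffs = np.polyfit(t, y, degree)
    fits[degree] = deviance(y, np.polyval(coeffs, t))

# Adding the cubic term lowers the deviance by the LRT statistic; compare
# it to the chi-square critical value for 1 extra parameter (3.841 at
# alpha = 0.05). A small statistic means the cubic term is not worth keeping.
lrt = fits[2] - fits[3]
print("keep cubic term" if lrt > 3.841 else "prefer quadratic model")
```

The same logic applies to any pair of nested models fit by maximum likelihood: the deviance difference is asymptotically chi-square with degrees of freedom equal to the number of added parameters.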
These data can also be used to test the model that I proposed to determine the voters who will show up in the race to fill George Latimer's senate seat.
Kaminsky's team then developed a blood test that measured SKA2 activity and levels of methyl groups, then fed its results, plus data on demographics and stress level, into a statistical model.
Processing the biological data at the deepest level, such as DNA base pairs, therefore only makes sense if this analysis can be used to build models of biological processes and if the resulting predictions can be tested.
Working together, they will develop and test a variety of learning experiences in which students use online simulations to model energy-releasing and energy-requiring reactions, analyze and interpret data to make predictions about energy phenomena, and use evidence from their own observations or from simplified versions of scientific articles to explain phenomena and construct and critique arguments.
The researchers conducted the study by using time series modelling of data on the official unemployment rate and recorded property crime rate, testing how the relationship has varied from the 1940s to the present day.
Researchers performed a meta-analysis of literature examining patients with NASH, and then tested their hypothesis using an animal model, which enabled them to eliminate possible confounders of the clinical data, such as antibiotic exposure and medical comorbidities.
"But putting together the data will enable us to test our models."
In order to better understand how soil microbes respond to the changing atmosphere, the study's authors utilized statistical techniques that compare data to models and test for general patterns across studies.
Common approaches to prediction include using a significance-based criterion for evaluating variables to use in models and evaluating variables and models simultaneously for prediction using cross-validation or independent test data.
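The cross-validation approach mentioned above can be sketched in a few lines. This is a purely illustrative example on synthetic data (all variable names and numbers are hypothetical), comparing a smaller and a larger candidate variable set by held-out error rather than by significance tests:

```python
import numpy as np

def kfold_mse(X, y, k=5):
    """Average held-out MSE of an ordinary least-squares fit over k folds."""
    idx = np.arange(len(y))
    errors = []
    for test_idx in np.array_split(idx, k):
        train_idx = np.setdiff1d(idx, test_idx)
        beta, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
        pred = X[test_idx] @ beta
        errors.append(np.mean((y[test_idx] - pred) ** 2))
    return float(np.mean(errors))

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)  # truly predictive variable
x2 = rng.normal(size=n)  # pure-noise variable
y = 2.0 * x1 + rng.normal(0, 0.5, n)

small = np.column_stack([np.ones(n), x1])      # candidate model 1
large = np.column_stack([np.ones(n), x1, x2])  # candidate model 2 (extra variable)

# The candidate set with lower cross-validated error wins, regardless of
# whether the extra coefficient would look "significant" in-sample.
print(kfold_mse(small, y), kfold_mse(large, y))
```

The same comparison generalizes to any learner that can be refit on each training fold; only the fitting and prediction steps change.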
Instead of systematically testing the effects of known compounds — the pharmaceutical industry's basic model for more than a century — scientists can now investigate backward, combing through genomic data to find links between specific genotypes and diseases and then screening drug data to identify therapeutic candidates.
Model simulations can always be improved by testing predictions against field data collected from different ecosystems, and Sulman and Phillips are doing just that: investigating how roots influence soil decomposition and protected forms of carbon in forests that vary in the composition of tree and microbial communities.
Any results that are reported to constitute a blinded, independent validation of a statistical model (or mathematical classifier or predictor) must be accompanied by a detailed explanation that includes: 1) specification of the exact "locked down" form of the model, including all data processing steps, the algorithm for calculating the model output, and any cutpoints that might be applied to the model output for final classification; 2) the date on which the model or predictor was fully locked down in exactly the form described; 3) the name of the individual(s) who maintained the blinded data and oversaw the evaluation (e.g., an honest broker); 4) a statement of assurance that no modifications, additions, or exclusions were made to the validation data set from the point at which the model was locked down, and that neither the validation data nor any subset of it had ever been used to assess or refine the model being tested.
"Using data mining to make sense of climate change: New methodology puts emphasis on data to test climate models."
The researchers focused on automobiles, testing their theoretical model with data from new-car transactions in the premium midsize sedan category between 2001 and 2005.
Porter and Reich used 2001-2006 data on 10,670 burglaries in Baltimore County, Maryland, to build and test the efficiency of their crime-fighting model.
The remaining groups of data — two sets of 69 ratings and their corresponding chemical information — were used to test how well the models predicted both how an average person would rate an odor and how each of the 49 individuals would rate them.
They put the satellite data into their model that tests the wave-particle interaction theory, and the results suggest the wave scattering was the cause of the particle fallout.
"Our data confirmed that, while the rate of growth of triple-negative breast cancer was not affected by CDK4/6 inhibitors, this class of drugs was able to significantly inhibit the spread of triple-negative breast cancer to distant organs when tested in multiple different triple-negative breast cancer models, including patient-derived xenografts."
"Therefore, further investigations based on multi-messenger approaches — combining theory with all three messenger data — are crucial to test our model."
They then tested for differences in model fits using annual nest count data to see how the model performed with respect to environmental factors.
Based on the results of that test, Beaty and colleagues developed a predictive model and tested it against brain scan data collected for earlier studies on creativity.
In addition to documenting the amount of defensive ecosystems in the U.S., the researchers fed data about property values, population, income and age into a model that tested four sea-level rise scenarios.
But, of course, Bolton noted, no probe has ever come this close to Jupiter before, and the purpose of the Juno mission is to gather data that will put models like those to the test.
Provide example data from experiments using PD-L1 positive tumors and a cancer cell-line engraftment into a humanized mouse model that was used to test anti-PD1 immune therapy.
The data were used to test mathematical models to pinpoint the role and importance of factors associated with H1N1's arrival and spread, including demographics, school opening dates, humidity levels and immunity from previous outbreaks.
By testing their model against historical data, they found their production forecasts were more accurate than those of both peak oilers, who are traditionally too pessimistic, and authorities such as the US Energy Information Administration, which is generally far too optimistic.
The test unit carried an internal data recorder, hardened so it could measure what happened during the impact and gather data to validate computer models.
After chronicling the different shifts of decomposers on the mice, and seeing the same shifts operating on the humans, the researchers built a computer model using the mouse data to see whether the microbial composition could be used to predict times of death, using the humans as a test case.
Alex said: "For years ecologists have struggled to test or extend models of ecosystem-level change because the data were too expensive to collect at the required scales."
By combining the before- and after-stagnation data, the city biofilm "control" data, and information from building blueprints, the team developed a model to test water quality inside almost any building.
Researchers test mechanisms explaining differences in dengue serotype and disease severity by statistically fitting mathematical models to viral load data from dengue-infected individuals.
This has led to a demand for people who can understand data collection techniques and analyze vast amounts of data, categorize the data sets, develop models to test hypotheses that can then be used to develop drugs, and test potential candidates in animals.
Because they offer theorists many more variables to play with, such "dark sector" models can be reconciled to fit into the ever tighter straitjacket of facts placed on dark matter by new data — but the downside is that this sprawling flexibility makes them very difficult to conclusively test.
In this technique, scientists initialize a computer model with data collected before a past event, and then test the model's accuracy by comparing its output with observations recorded as the event unfolded.
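This hindcasting procedure can be sketched with a toy example. The model, data, and numbers below are entirely synthetic (a log-linear fit to exponential-growth "observations"); the point is only the shape of the workflow, calibrate on pre-event data, forecast, then score against what was actually recorded:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(20.0)
# Synthetic "truth": exponential growth plus observation noise.
observed = 100.0 * np.exp(0.05 * t) + rng.normal(0, 2.0, t.size)

# Initialize (calibrate) the model using only data from before the event...
pre = slice(0, 10)
slope, intercept = np.polyfit(t[pre], np.log(observed[pre]), 1)

# ...then test its accuracy against observations recorded as the event unfolded.
post = slice(10, 20)
hindcast = np.exp(intercept + slope * t[post])
rmse = float(np.sqrt(np.mean((hindcast - observed[post]) ** 2)))
print(f"hindcast RMSE over the held-back period: {rmse:.2f}")
```

Because the held-back observations were never used in calibration, the RMSE is an honest measure of forecast skill, exactly what distinguishes a hindcast from an in-sample fit.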
The models used for testing the impact of climate change combine the risks of avalanche with local climate data.
Instead of randomly testing individual compounds, the team turned to AI and machine learning to build predictive models from experimental data.
"Our expectation is that the data from our nuclear physics experiments can be combined with the results from atomic trapping experiments measuring EDMs to make the most stringent tests of the Standard Model, the best theory we have for understanding the nature of the building blocks of the universe," Butler said.
"You cannot really test the idea that animal reservoirs are important other than by building a model from the data available."
Using both the test tube and human data, Kashuba and her team created a mathematical model that predicts the drug-to-DNA ratios in vaginal, cervical and rectal tissues and calculates the amount of drug needed to prevent HIV from infecting human tissues.
With the recent publication of a large data set of 763 microsatellite markers — short stretches of DNA that are repeated in the genome — from 53 populations in the Human Genome Diversity Project, evolutionary geneticists William Amos and Joe Hoffman of the University of Cambridge in the United Kingdom had enough genomic data to test both models.
This data produced a plot with a series of curves, each of which tracked how the organism changed in the size and number of units with age, enabling the researchers to produce a computer model to replicate growth in the organism and test previous hypotheses about where and how growth occurred.