Our group gathers multi-disciplinary expertise aimed at developing methods and algorithms for extracting, analysing, and modeling spatial data from biological images.
Using published data from the circumpolar Arctic, their own new field observations of Siberian permafrost and thermokarsts, radiocarbon dating, atmospheric modeling, and spatial analyses, the research team studied how thawing permafrost is affecting climate change and greenhouse gas emissions.
The data have helped them learn more about the snakes' spatial range and behavior and develop population models they hope will be useful for conserving the locally threatened population of pine snakes.
Although the data suggest that spatial models can effectively forecast tree community composition and structure of unstudied sites in Amazonia, incorporating environmental data may yield substantial improvements.
High-throughput genome sequencing and quantitative image analysis provide evolutionary, metabolic, and interaction data to build community metabolome maps, taxa/gene networks, and spatial ecosystem models.
From these data, we quantify the population genetic parameters of the intra-patient environment to aid modeling efforts such as the spatial-monotherapy work.
Disease prevention versus data privacy: Using landcover maps to inform spatial epidemic models.
ASTER data are used to create detailed maps of land surface temperature, reflectance, and elevation. ASTER captures high-spatial-resolution data in 14 bands, from the visible to the thermal infrared wavelengths, and provides stereo viewing capability for digital elevation model creation.
We carried out validation of the model using a variogram-based procedure, which tested the compatibility of the adopted spatial structure with the data. We concluded that the adopted covariance model was compatible with the data, as the empirical semi-variogram fell within the 95% tolerance intervals computed via Monte Carlo simulation and a spatial correlation test of residuals.
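The Monte Carlo tolerance-interval idea in that snippet can be sketched briefly. The code below is an illustrative toy, not the authors' procedure: it assumes an exponential covariance model with made-up parameters (`sill`, `rng`), simulates realizations under that model, and checks whether the empirical semivariogram of the "observed" field stays inside the pointwise 95% envelope of the simulated semivariograms.

```python
import numpy as np

rng_state = np.random.default_rng(0)

def empirical_semivariogram(coords, values, bins):
    """Average 0.5*(z_i - z_j)^2 over point pairs in each distance bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # each pair once
    d, g = d[iu], g[iu]
    return np.array([g[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(bins[:-1], bins[1:])])

def exp_cov(d, sill=1.0, rng=2.0):
    """Exponential covariance model (illustrative parameter values)."""
    return sill * np.exp(-d / rng)

# Synthetic "observations" drawn from the adopted covariance model
coords = rng_state.uniform(0, 10, size=(60, 2))
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = exp_cov(dists)
obs = rng_state.multivariate_normal(np.zeros(60), C)

bins = np.linspace(0, 6, 7)
gamma_obs = empirical_semivariogram(coords, obs, bins)

# Monte Carlo tolerance envelope: simulate under the adopted model and
# take pointwise 2.5% / 97.5% quantiles of the simulated semivariograms
sims = rng_state.multivariate_normal(np.zeros(60), C, size=200)
gamma_sim = np.array([empirical_semivariogram(coords, s, bins) for s in sims])
lo, hi = np.quantile(gamma_sim, [0.025, 0.975], axis=0)

compatible = np.all((gamma_obs >= lo) & (gamma_obs <= hi))
```

If the observed semivariogram escapes the envelope at some lag, the adopted spatial structure is judged incompatible with the data at that scale.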
The ARM Aerosol Measurement Science Group (AMSG) coordinates ARM Climate Research Facility observations of aerosols and atmospheric trace gases with user needs to ensure advanced, well-characterized observational measurements and data products, at the spatial and temporal scales necessary for improving climate science and model forecasts.
In the 1970s and '80s, British psychologist Alan Baddeley and colleagues developed a model of working memory that brings together how the brain accepts sensory input, processes both visual-spatial and verbal data, and accesses long-term memory, and how all of that input is processed by a function they referred to as the central executive.
Also, the ability to specify and draw design concepts with network diagrams, data flow diagrams, logical flow charts, and models requires 2D and even 3D spatial awareness, often calling upon significant artistic communication talent.
Methodologically, González Canché employs econometric, quasi-experimental, spatial-statistical, and visualization methods for big and geocoded data, including geographical information systems and network modeling.
In this case, there has been an identification of a host of small issues (and, in truth, there are always small issues in any complex field) that have involved the fidelity of the observations (the spatial coverage, the corrections for known biases), the fidelity of the models (issues with the forcings, examinations of the variability in ocean vertical transports, etc.), and the coherence of the model-data comparisons.
To test that, I varied the data sources, the time periods used, and the importance of spatial auto-correlation on the effective number of degrees of freedom, and, most importantly, I looked at how these methodologies stacked up in numerical laboratories (GCM model runs) where I knew the answer already.
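The degrees-of-freedom point above is concrete: autocorrelated samples carry less independent information than their count suggests. A minimal sketch, assuming the standard lag-1 AR(1) correction n_eff = n·(1−r₁)/(1+r₁) (not the commenter's actual code; the series here are synthetic):

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a 1-D series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def effective_sample_size(x):
    """AR(1)-corrected effective number of degrees of freedom."""
    r1 = max(lag1_autocorr(x), 0.0)   # ignore spurious negative estimates
    return len(x) * (1.0 - r1) / (1.0 + r1)

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)

# Strongly autocorrelated AR(1) series: each value keeps 90% of the last
ar = np.empty(1000)
ar[0] = white[0]
for t in range(1, 1000):
    ar[t] = 0.9 * ar[t - 1] + white[t]

n_eff_white = effective_sample_size(white)  # close to the full 1000
n_eff_ar = effective_sample_size(ar)        # far fewer independent samples
```

The same logic applies in space: neighboring grid boxes that vary together should not each count as an independent observation when assessing significance.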
To get a sense of the mix of whaling-era data, tracking, and modeling used to estimate past blue whale abundance, read this PLoS ONE paper by an overlapping research team from last year: "Estimating Historical Eastern North Pacific Blue Whale Catches Using Spatial Calling Patterns."
We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change.
If the models cannot even accurately simulate current climate statistics when they are not constrained by real-world data, the expense of running them to produce detailed spatial maps is not worthwhile.
Wang, 2011: Detecting the ITCZ in instantaneous satellite data using spatial-temporal statistical modeling: ITCZ climatology in the east Pacific.
Nesting a regional climate model (with higher spatial resolution) into an existing GCM is one way to downscale data.
This will increase the spatial degrees of freedom beyond that of the PAGES 2k Consortium [2013] synthesis and provide clear targets for observation and model comparisons while honoring limitations imposed by current data availability.
Analyses of tide gauge and altimetry data by Vinogradov and Ponte (2011), which indicated the presence of considerable small-spatial-scale variability in annual mean sea level over many coastal regions, are an important factor for understanding the uncertainties in regional sea-level simulations and projections at sub-decadal time scales in coarse-resolution climate models that are also discussed in Chapter 13.
The RCPs provide a unique set of data, particularly with respect to comprehensiveness and detail, as well as spatial scale of information for climate model projections.
To claim anyone has created an accurate spatial model of the history of temperature change from such data is flat-out dreaming.
"Both of these are established methodologies for doing spatial prediction, otherwise known as interpolation." Phew, good thing you're not using "computer models" to fill in the data.
All model and observed data have the same spatial coverage as HadCRUT4.
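Imposing a common spatial coverage usually means propagating the observational mask onto the model field, so that both products are averaged over exactly the same grid boxes. A minimal sketch with synthetic stand-in arrays (not real HadCRUT4 data):

```python
import numpy as np

# Tiny synthetic grids; NaN marks grid boxes with no observations
obs = np.array([[1.2, np.nan, 0.8],
                [np.nan, 0.5, 0.3]])
model = np.array([[1.0, 0.9, 0.7],
                  [0.4, 0.6, 0.2]])

# Copy the observational mask onto the model field
model_masked = np.where(np.isnan(obs), np.nan, model)

obs_mean = np.nanmean(obs)             # mean over observed boxes only
model_mean = np.nanmean(model_masked)  # same coverage as the observations
```

Without this step, a model-data comparison silently averages the model over regions (e.g. the poles) that the observations never sampled. A production version would also area-weight by grid-box size, which this toy omits.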
Most models accurately reproduce the spatial distribution of explosive cyclones when compared to reanalysis data (R = 0.94), with high frequencies along the Kuroshio Current and the Gulf Stream.
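A score like R = 0.94 is typically a spatial pattern correlation: flatten the two gridded frequency fields and correlate them point by point. A hedged sketch with synthetic fields (the real study's grids and weighting are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in gridded fields, e.g. explosive-cyclone frequency per grid box
reanalysis = rng.random((20, 30))
model_field = reanalysis + 0.1 * rng.standard_normal((20, 30))  # similar pattern

# Pattern correlation: Pearson R between the flattened fields
R = np.corrcoef(reanalysis.ravel(), model_field.ravel())[0, 1]
```

A high R says the model puts the maxima and minima in the right places; it says nothing about whether the absolute frequencies are biased.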
We conclude that the most valid model of the spatial pattern of trends in land surface temperature records over 1979–2002 requires a combination of the processes represented in some GCMs and certain socioeconomic measures that capture data quality variations and changes to the land surface.
European Forest Data Centre; EFDAC; free software; Free Scientific Software; Free and Open Source Software; Europe; forest information system; European Forest Fire Information System; EFFIS; geospatial; geospatial tools; semantic array programming; morphological spatial pattern analysis; GUIDOS; reproducible research; environmental modelling
As Mike noted, we should stay focused on the suite of (very interesting and) important scientific questions raised by this post, especially those related to the idea of spatial/temporal patterns of climate data in relation to concepts and models of their likely physical causes.
DelSole et al. (28) also found 2.5 cycles by extracting the spatial pattern in the Intergovernmental Panel on Climate Change, Fourth Assessment Report (IPCC AR4) (29) model control runs that best characterizes internal variability and by projecting the observed global data onto this pattern.
Because of their large spatial coverage, satellite data have proven useful in evaluating dust sources, transport, and deposition in global models.
Prior to 1988, the satellite data that Trenberth uses are not available, but it is known that long-term records from radiosondes contain large inhomogeneities due to improving observing systems and increasing spatial resolution (but still very little ocean coverage), and the NCEP data in particular contain large model biases.
Long story short, Numenta's software, called Grok, is able to recognize patterns (e.g., temporal and spatial) from streaming data and then automatically build models that allow it to predict what will happen next.