Sentences with phrase «to smooth the data»

The phrase "to smooth the data" means to make the information or numbers in a dataset more even or consistent by removing any irregularities or outliers. It helps to reveal patterns or trends more clearly.
The issue of how to smooth a data series to avoid misleading end effects is not a trivial one.
The concept of enabling smooth data transfer from one device with a particular OS to another with a different OS isn't new either.
I promise never to use smoothed data as input to other analyses.
The term harmonization is defined as a procedure whereby emission outputs from the IAMs are adjusted in such a way that emissions in the reference year are equal to some reference data set (with these adjustments extended into the future, in some manner, to assure smooth data sets).
I specifically used a source of smoothed data to avoid such comments, but that is how things go at times.
Finally got some comparable error bars for Figure 17 by comparing the 5-yr smoothed M&W to the 5-yr smoothed data from Mike's website.
«[The Nature study is] heavily smoothing the data so as to look only at centennial-scale shifts, not what we usually think of as droughts or rainfall extremes, which would be scales of days to at most a decade or two.»
If you want to see how much a methodology smooths its data, you can simply compare its input and output.
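The input-versus-output comparison suggested above can be made concrete by comparing variances. This is a minimal sketch, not any particular methodology's actual procedure; the 3-point moving average stands in for whatever smoother is being examined, and the sample data is invented for illustration:

```python
# Quantify how much a smoother smooths: compare the variance of its
# input with the variance of its output. A ratio well below 1 means
# heavy smoothing. The 3-point moving average is a stand-in smoother.
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def smooth3(xs):
    # 3-point centered moving average; endpoints are passed through.
    return [xs[0]] + [(xs[i - 1] + xs[i] + xs[i + 1]) / 3
                      for i in range(1, len(xs) - 1)] + [xs[-1]]

raw = [3, 7, 1, 8, 2, 9, 4]          # illustrative noisy series
ratio = variance(smooth3(raw)) / variance(raw)
```

A ratio near 1 indicates the method barely touches the data; a ratio near 0 indicates most of the variability has been averaged away.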
They make a great deal of the fact that we only plotted the ~50-year smoothed data rather than the annual means.
This method isn't perfect, but it's a good way to smooth the data while removing almost all the impact of volcanic eruptions — and that'll tell us the background sulfate level in the GISP2 ice core.
So although a map of land-only data will show smoothed data far out into the oceans, in fact a land-ocean dataset will have discarded this when combining, wherever SST data is available.
The lack of curiosity by RealClimate aficionados as to the implications of the broad assumptions used to smooth the data highlights their true interests.
The basic idea is to estimate the trend component, by smoothing the data or by fitting a regression model, and then estimate the seasonal component, by averaging the de-trended seasonal data points (e.g., the December seasonal effect comes from the data points for all Decembers in the series).
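The classical decomposition described in that sentence can be sketched in a few lines. This is an illustrative implementation under assumed conventions (a 12-month period, a centered moving average with half-weighted endpoints for the trend), and the synthetic series is invented for the example:

```python
# Classical decomposition sketch: estimate the trend by smoothing the
# data with a centered moving average, then estimate each seasonal
# effect by averaging the de-trended points for that calendar month
# (e.g. the December effect comes from all Decembers in the series).
import math

def decompose_monthly(series, period=12):
    n, half = len(series), 12 // 2
    half = period // 2
    trend = [None] * n
    for i in range(half, n - half):
        # Even period: half-weight the two endpoints of the window.
        w = series[i - half:i + half + 1]
        trend[i] = (0.5 * w[0] + sum(w[1:-1]) + 0.5 * w[-1]) / period
    # Average the de-trended values for each season.
    buckets = [[] for _ in range(period)]
    for i in range(n):
        if trend[i] is not None:
            buckets[i % period].append(series[i] - trend[i])
    seasonal = [sum(b) / len(b) if b else 0.0 for b in buckets]
    # Center the seasonal effects so they sum to zero over one period.
    mean_s = sum(seasonal) / period
    return trend, [s - mean_s for s in seasonal]

# Synthetic monthly series: linear trend plus a sinusoidal season.
data = [0.1 * t + 5 * math.sin(2 * math.pi * t / 12) for t in range(60)]
trend, seasonal = decompose_monthly(data)
```

On this synthetic input the moving average removes the sinusoid exactly, so the recovered trend is the linear component and the seasonal effects reproduce the sinusoid's monthly values.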
Quote from the article: «In statistics and image processing, to smooth a data set is to create an approximating function that attempts to capture important patterns in the data, while leaving out noise or other fine-scale structures / rapid phenomena.»
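The quoted definition is easy to illustrate with the simplest such approximating function, a centered moving average. This is only a sketch; the window length of 3 and the noisy series are arbitrary choices for the example:

```python
# A centered moving average as the "approximating function" from the
# quoted definition: it keeps the broad pattern of the data while
# suppressing fine-scale, point-to-point noise.
def moving_average(data, window=3):
    half = window // 2
    out = []
    for i in range(len(data)):
        # Shrink the window at the endpoints rather than padding.
        lo, hi = max(0, i - half), min(len(data), i + half + 1)
        out.append(sum(data[lo:hi]) / (hi - lo))
    return out

noisy = [1, 9, 2, 8, 3, 7, 4]        # illustrative jagged series
smoothed = moving_average(noisy)
```

The smoothed series has a visibly smaller spread than the input: the jagged alternation is averaged away while the overall level is preserved.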
They use different computational methods to e.g. average or smooth the data over spatiotemporal gaps, and they «adjust» the data differently for poorly known or even unmeasurable things such as UHI.
While that may seem odd, for heavily smoothed data that is not really abnormal at all.
When there is not sufficient areal data, any method of interpolation, extrapolation, or ad hoc infilling is suspect even for smooth data.
We take full responsibility for archiving data with the appropriate NASA archive facility, helping to smooth the data or review process, and we will ensure your submitted datasets are compliant with the standards.
My model uses the smoothed data from the Federal Reserve H15 series, which dates as far back as 1962, though some series, like the 30-year, date back to 1977, and have an interruption from 2002-2005, after the 30-year ceased to be issued for a time.
I used the smoothed data as is without any modification.
The claim of being misleading rests not on the data itself, but on the fact that the smoothed data in the source ends in 1915.
The smoothed data value in 1990 is a weighted average of (2m - 1) points (i.e., 29 data years) and is thus robust.
Over the open ocean RL04 has errors of ~1-2 cm for 750 km Gaussian-smoothed data, whereas RL05 has errors of ~0.5-1.5 cm.