Sentences with phrase «with eigenvalues»

Individual items were retained if they had a loading near or over 0.35, and the number of factors was based upon those with eigenvalues greater than one [10]. A two-factor solution was the clearest at both ages and accounted for over 95% of the total variance in the observed variables (Table 2).
We identified three core profiles with eigenvalues over or near 1.00, explaining more than half of the variance in the 13 marital items: 54.2% and 58.2% for men and women, respectively.
Initial analysis revealed seven components with eigenvalues above Kaiser's criterion of 1.
An initial component extraction showed that there were five dimensions with eigenvalues greater than one.
In the PCA, three components with eigenvalues > 1 were extracted from the data set.
The first factor covered more than 64% of the total variance of the readability measures with an eigenvalue of 32.3, which is more than 23 units greater than the next factor's eigenvalue.
The three items measuring modelling of healthy eating all loaded onto one unique factor with an eigenvalue greater than one, explaining 63% of the variance.
A cutoff of 0.40 was used for factor loadings, and an eigenvalue greater than 1 was required, which allows each extracted factor to explain a reasonable proportion of the total variance.
Factors with an eigenvalue > 1.0 were selected.
For each scale, two factors with an eigenvalue > 1.0 were identified.
Exploratory factor analysis indicated a 2-factor structure, with the eigenvalue of the second factor slightly exceeding 1.0 (eigenvalue = 1.04 for all partnered, 1.02 for dyads).
A principal component analysis (PCA) revealed two components with an eigenvalue above the cut-off value of one (4.47 and 2.65), suggesting a two-factor structure for the EFA.
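
Most of the sentences above apply the same rule: run a PCA and keep the components whose eigenvalues exceed 1 (the Kaiser criterion). Below is a minimal NumPy sketch of that computation on hypothetical data with an assumed two-factor structure; the sample size, item count, and loadings are illustrative assumptions, not values from any of the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 13 questionnaire items driven by two latent factors plus noise.
n_obs, n_items = 200, 13
factors = rng.normal(size=(n_obs, 2))
loadings = rng.uniform(0.4, 0.8, size=(2, n_items))
X = factors @ loadings + rng.normal(scale=0.7, size=(n_obs, n_items))

# Principal components are eigenvectors of the correlation matrix;
# each eigenvalue is the variance explained by its component.
R = np.corrcoef(X, rowvar=False)
eigenvalues = np.linalg.eigvalsh(R)[::-1]        # sorted largest first

# Kaiser criterion: retain components whose eigenvalue exceeds 1.0.
retained = eigenvalues > 1.0
print("Eigenvalues:", np.round(eigenvalues, 2))
print("Components retained (eigenvalue > 1):", int(retained.sum()))
print(f"Variance explained by retained components: "
      f"{eigenvalues[retained].sum() / eigenvalues.sum():.1%}")
```

With a genuine two-factor structure in the data, the first two eigenvalues come out well above 1 and the rest fall below it, which is what sentences such as the PCA example above (eigenvalues 4.47 and 2.65) report.
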

Not exact matches

According to the PFA (based on the eigenvalues, the Kaiser criterion, the scree test and interpretability), three aspects could be constructed from 17 statements (Table 1 in Appendix).
Precisely that question was addressed by Mann and coworkers in their response to the rejected MM comment through the use of so-called «Monte Carlo» simulations that generate an ensemble of realizations of the random process in question (see here) to determine the «null» eigenvalue spectrum that would be expected from simple red noise with the statistical attributes of the North American ITRDB data.
From the latter, you can't tell whether something is a trend or a cycle when the data are short compared to the cycle (the eigenvalues of the discriminating matrix explode, making every observation useless).
I had a discussion with Steve McIntyre a couple of years ago on the scaling issue, but I also asked how eigenvalues fit into the topic, i.e. whether the eigenvalues from the «noise» PCs were smaller than the eigenvalues from the reconstruction.
«Along with the use of principal component regression there appears to have been a growth in the misconception that the principal components with small eigenvalues will rarely be of any use in a regression.
Very good: so you are happy with the method on the grounds that the eigenvalues are smaller for red noise.
The results of the orthogonal rotation yielded an interpretable three-factor solution that collectively explained 74.624% of the variance for the set of six variables (34.238% explained by Factor 1, 23.574% by Factor 2, and 16.812% by Factor 3), with the rotated factors obtaining eigenvalues ranging from 1.01 to 2.054.
Initial examination of the items, using principal component analysis with varimax rotation to maximize variance, revealed three factors having an eigenvalue greater than one.
Factor analysis of the 12 items (Table I) indicated a single-factor solution, with all items loading on a single factor having an eigenvalue of 6.2 (all other factors had an eigenvalue of < 1.0), accounting for 51.3% of the variance.
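
The Monte Carlo approach described in the sentence about Mann and coworkers can be sketched as follows: simulate many red-noise (AR(1)) datasets containing no signal, compute the PCA eigenvalue spectrum of each, and use the resulting «null» distribution as a benchmark for an observed leading eigenvalue. The AR(1) coefficient, dimensions, and percentile below are illustrative assumptions, not the statistical attributes of the ITRDB data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_series, n_sims, phi = 100, 50, 200, 0.3   # illustrative sizes

def ar1_panel(n_years, n_series, phi, rng):
    """Panel of independent AR(1) ('red noise') series, one per column."""
    eps = rng.normal(size=(n_years, n_series))
    x = np.empty_like(eps)
    x[0] = eps[0]
    for t in range(1, n_years):
        x[t] = phi * x[t - 1] + eps[t]
    return x

# Null distribution of the leading PCA eigenvalue under pure red noise.
null_leading = np.empty(n_sims)
for s in range(n_sims):
    X = ar1_panel(n_years, n_series, phi, rng)
    R = np.corrcoef(X, rowvar=False)
    null_leading[s] = np.linalg.eigvalsh(R)[-1]      # largest eigenvalue

# An observed leading eigenvalue should exceed this benchmark before it is
# treated as signal rather than red noise.
print("95th percentile of the null leading eigenvalue:",
      round(float(np.quantile(null_leading, 0.95)), 2))
```

This is the sense in which the eigenvalues from the «noise» PCs are compared with the eigenvalues from a reconstruction: if the observed spectrum does not stand clearly above the red-noise null spectrum, the components cannot be distinguished from noise.
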