A high degree of inter-rater reliability was established for primary and subthreshold diagnoses (kappa coefficient = 0.92; range: 0.62 to 1.00; Göttken et al., 2014).
Agreement between child- and caregiver-reports was analyzed using Cohen's kappa coefficient.
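As a sketch of the statistic referred to here, the snippet below computes Cohen's kappa from scratch for two raters: observed agreement corrected for the agreement expected by chance from each rater's marginal label frequencies. The child/caregiver labels are hypothetical illustration data, not values from any of the studies cited.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items given the same label.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal proportions,
    # summed over all categories either rater used.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical diagnostic codes for 10 cases (illustration only)
child     = ["anx", "anx", "dep", "none", "anx", "dep", "none", "anx", "dep", "none"]
caregiver = ["anx", "dep", "dep", "none", "anx", "dep", "none", "anx", "none", "none"]
print(round(cohens_kappa(child, caregiver), 3))  # → 0.701
```

With 8/10 observed agreement and chance agreement of 0.33 from the marginals, kappa lands well below raw percent agreement, which is exactly the correction the statistic exists to make.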
Results: For each subscale and factor, the kappa values for inter-rater reliability were between .625 and .884 (p < .05), and the values for retest reliability were between .537 and .832 (p < .05). The scores of the PSDQ subscales were significantly correlated with each factor (correlation coefficients: .732 to .951, p < .05), and these correlation coefficients were higher than those between the factors of each subscale (correlation coefficients: .382 to .834, p < .05).
Research on the PTSC reports a free-marginal multirater kappa of .82 and an intraclass correlation coefficient of .95, indicating strong inter-rater reliability for the instrument.
Stability of the ODD-CU measures for ages 3 to 5 was estimated via the intraclass correlation coefficient (absolute agreement) for quantitative scores and kappa for categorical scores.
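The absolute-agreement intraclass correlation mentioned above can be sketched as follows, assuming the common single-rater, two-way random-effects form, ICC(A,1): mean squares for subjects, raters, and error are taken from a two-way layout, and rater variance is kept in the denominator so that systematic rater bias lowers the coefficient. The score matrix is hypothetical illustration data.

```python
def icc_a1(data):
    """ICC(A,1): two-way random effects, absolute agreement, single rater.
    `data` is a list of n subjects, each a list of k rater scores."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares for subjects (rows), raters (columns), and residual error.
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    sse = ss_total - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    # Absolute agreement keeps rater variance (msc) in the denominator.
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical quantitative scores: 4 subjects rated by 2 raters
scores = [[1, 1], [2, 2], [3, 3], [4, 5]]
print(round(icc_a1(scores), 3))  # → 0.945
```

Note that a consistency-type ICC would drop the rater term from the denominator and come out higher on the same data; which form a study reports should always be checked.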