Further, research from our clinic has demonstrated excellent reliability for the ADIS, with interrater agreement of kappa = 1.00 for an overall anxiety disorder diagnosis and between kappa = .80 and .93 for specific anxiety diagnoses (Lyneham et al., 2007).
Twenty-one percent of cases were second-coded, and reliability was kappa = .74.
Kappas for individual codes were also tested to control for chance agreement (kappas ranged from .58 to 1.0; mean kappa = .83).
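Cohen's kappa corrects raw percentage agreement for the agreement two raters would reach by chance alone, using each rater's marginal category frequencies. A minimal sketch of the computation, with hypothetical diagnostic codes from two coders (the labels and data are invented for illustration, not taken from any study above):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codes."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of cases both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal proportions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes: generalised anxiety (GAD), social anxiety (SAD), or none.
a = ["GAD", "SAD", "GAD", "none", "SAD", "GAD", "none", "GAD"]
b = ["GAD", "SAD", "SAD", "none", "SAD", "GAD", "none", "GAD"]
print(round(cohens_kappa(a, b), 2))  # → 0.81
```

Here the raters agree on 7 of 8 cases (87.5% raw agreement), but chance agreement of about 0.34 pulls kappa down to 0.81, which is why kappa runs lower than percentage agreement in the reports above.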
The inter-rater reliability of CD diagnoses was excellent (Cohen's kappa = 1.00).
PACE assessment of parental history of mental disorder indicated moderate to substantial agreement with parent questionnaire report of having ever seen a mental health professional (mothers: kappa = 0.64; fathers: kappa = 0.49).
The modified SCID has demonstrated high inter-rater reliability for adolescent alcohol diagnoses (kappa = 0.94).21 Before each interview takes place, verbal consent will be obtained.
Interrater agreement for primary diagnoses among a team of psychologists/psychiatrists was good (Cohen's kappa = .79).
There was 81.6% agreement among the categorisations, and an interrater reliability analysis resulted in kappa = .72, indicating substantial agreement.
A total score ≥ 50 is indicative of the full PTSD diagnosis (sensitivity = 0.82; specificity = 0.83; kappa = 0.64).44 In this study, the traumatic event in the original PCL-C was replaced by physical violence.
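Cutoff-based accuracy figures like these come from a 2×2 table of screen result (total score versus the cutoff) against diagnostic status. A minimal sketch with hypothetical scores and reference diagnoses (the ≥ 50 cutoff follows the text; all data values are invented):

```python
def screen_accuracy(scores, has_diagnosis, cutoff=50):
    """Sensitivity and specificity of a score cutoff against a reference diagnosis."""
    pairs = list(zip(scores, has_diagnosis))
    tp = sum(s >= cutoff and d for s, d in pairs)       # screen+, diagnosis+
    fn = sum(s < cutoff and d for s, d in pairs)        # screen-, diagnosis+
    tn = sum(s < cutoff and not d for s, d in pairs)    # screen-, diagnosis-
    fp = sum(s >= cutoff and not d for s, d in pairs)   # screen+, diagnosis-
    sensitivity = tp / (tp + fn)  # proportion of true cases the cutoff detects
    specificity = tn / (tn + fp)  # proportion of non-cases the cutoff clears
    return sensitivity, specificity

# Hypothetical questionnaire totals and reference PTSD diagnoses.
scores = [62, 48, 55, 30, 71, 44, 52, 38]
ptsd = [True, True, True, False, True, False, False, False]
sens, spec = screen_accuracy(scores, ptsd)
print(sens, spec)  # → 0.75 0.75
```

Raising the cutoff trades sensitivity for specificity; the reported sensitivity/specificity pair describes one chosen operating point, while kappa summarises agreement between the dichotomised screen and the diagnosis.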
The average inter-observer agreement was kappa = 0.78, and the average intra-observer agreement was kappa = 0.72.
A high degree of inter-rater reliability was established on primary diagnoses and subthreshold diagnoses (kappa coefficient = 0.92; range: 0.62 to 1.00; Göttken et al., 2014).