Hi
This is my first time here on the forum, and I'm doing my first validation study with almost no experience. I have data from 8 raters assessing 31 subjects. I used the kappaetc command, but I'm a little bit confused about whether I should use the wgt(quadratic) option or not. I ran both commands and got two different results; which one is correct?
. kappaetc Rater1_1 Rater2_1 Rater3_1 Rater4_1 Rater5_1 Rater6_1 Rater7_1 Rater8_1
Interrater agreement Number of subjects = 31
Ratings per subject = 8
Number of rating categories = 5
------------------------------------------------------------------------------
| Coef. Std. Err. t P>|t| [95% Conf. Interval]
---------------------+--------------------------------------------------------
Percent Agreement | 0.5334 0.0424 12.59 0.000 0.4469 0.6199
Brennan and Prediger | 0.4168 0.0529 7.87 0.000 0.3086 0.5249
Cohen/Conger's Kappa | 0.4073 0.0604 6.75 0.000 0.2840 0.5305
Scott/Fleiss' Kappa | 0.4042 0.0613 6.60 0.000 0.2791 0.5294
Gwet's AC | 0.4198 0.0513 8.19 0.000 0.3151 0.5245
Krippendorff's Alpha | 0.4066 0.0613 6.64 0.000 0.2815 0.5318
------------------------------------------------------------------------------
. kappaetc Rater1_1 Rater2_1 Rater3_1 Rater4_1 Rater5_1 Rater6_1 Rater7_1 Rater8_1, wgt(quadratic)
Interrater agreement Number of subjects = 31
(weighted analysis) Ratings per subject = 8
Number of rating categories = 5
------------------------------------------------------------------------------
| Coef. Std. Err. t P>|t| [95% Conf. Interval]
---------------------+--------------------------------------------------------
Percent Agreement | 0.9505 0.0068 140.00 0.000 0.9366 0.9643
Brennan and Prediger | 0.8018 0.0272 29.53 0.000 0.7464 0.8573
Cohen/Conger's Kappa | 0.7541 0.0586 12.86 0.000 0.6344 0.8738
Scott/Fleiss' Kappa | 0.7529 0.0594 12.67 0.000 0.6315 0.8742
Gwet's AC | 0.8136 0.0260 31.30 0.000 0.7605 0.8667
Krippendorff's Alpha | 0.7538 0.0594 12.68 0.000 0.6325 0.8752
------------------------------------------------------------------------------
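For context, my understanding from the kappaetc help file is that wgt(quadratic) treats the 5 rating categories as ordered and gives partial credit to near-misses, which would explain why the weighted coefficients come out so much higher. Below is a minimal sketch of what I believe the quadratic weights look like for k = 5 categories, using w[i,j] = 1 - ((i - j)/(k - 1))^2; the matrix name W is just mine for illustration, not something kappaetc produces:

* Hedged sketch (not taken from kappaetc itself): quadratic weights for
* k = 5 ordered categories, w[i,j] = 1 - ((i - j)/(k - 1))^2
matrix W = J(5, 5, 0)
forvalues i = 1/5 {
    forvalues j = 1/5 {
        matrix W[`i', `j'] = 1 - ((`i' - `j')/4)^2
    }
}
matrix list W

If I compute these by hand, exact agreement gets weight 1, adjacent categories get 0.9375, and the two extreme categories get 0, whereas the unweighted analysis scores every disagreement as 0. So I suspect the choice comes down to whether my categories are truly ordinal or purely nominal, which is what I'd like to confirm.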
Thank you
Best regards
Hayder