Another look at interrater agreement.
Authors: Zwick, Rebecca
Abstract:Most currently used measures of interrater agreement for the nominal case incorporate a correction for chance agreement. The definition of chance agreement, however, is not the same for all coefficients. Three chance-corrected coefficients are Cohen's (1960) κ; Scott's (1955) π; and the S index of Bennett, Alpert, and Goldstein (1954), which has reappeared in many guises. For all three measures, independence between raters is assumed in deriving the proportion of agreement expected by chance. Scott's π involves a further assumption of homogeneous rater marginals, and the S coefficient requires the assumption of uniform marginal distributions for both raters. Because of these disparate formulations, κ, π, and S can lead to different conclusions about rater agreement. Consideration of the properties of these measures leads to the recommendation that marginal homogeneity be assessed as a first step in the analysis of rater agreement. If marginal homogeneity can be assumed, π can be used as an index of agreement. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
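All three coefficients share the chance-corrected form (p_o - p_e) / (1 - p_e) and differ only in how the expected agreement p_e is defined: Cohen's κ uses the product of each rater's own marginals, Scott's π uses the squared averaged marginals, and Bennett's S assumes uniform marginals (1/k for k categories). The sketch below is not from the article; it simply illustrates these standard definitions in Python, using a hypothetical 2x2 contingency table chosen so that the three indices diverge.

import numpy as np

def chance_corrected_indices(table):
    """Compute Cohen's kappa, Scott's pi, and Bennett's S from a
    k x k contingency table (rows = rater 1, columns = rater 2).
    All three are (p_o - p_e) / (1 - p_e); only p_e differs."""
    p = np.asarray(table, dtype=float)
    p = p / p.sum()                      # joint proportions
    k = p.shape[0]
    p_o = np.trace(p)                    # observed agreement

    row, col = p.sum(axis=1), p.sum(axis=0)

    # Cohen's kappa: chance agreement from each rater's own marginals
    pe_kappa = np.sum(row * col)
    # Scott's pi: chance agreement from the averaged (assumed homogeneous) marginals
    pe_pi = np.sum(((row + col) / 2) ** 2)
    # Bennett's S: chance agreement of 1/k, i.e. uniform marginals for both raters
    pe_S = 1.0 / k

    cc = lambda pe: (p_o - pe) / (1 - pe)
    return cc(pe_kappa), cc(pe_pi), cc(pe_S)

# Hypothetical ratings with heterogeneous marginals, so the indices differ.
table = [[50, 5],
         [20, 25]]
kappa, pi, S = chance_corrected_indices(table)
print(f"kappa={kappa:.3f}  pi={pi:.3f}  S={S:.3f}")

For this table the values are κ ≈ 0.479, π ≈ 0.467, and S = 0.500, illustrating the abstract's point that the three measures can lead to different conclusions about the same ratings.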