Kappa statistic

Interpretation

Landis and Koch[1] proposed the schema in the table below for interpreting values of the kappa statistic.

Proposed interpretation of kappa values

Kappa          Interpretation
< 0            Poor agreement
0.00 – 0.20    Slight agreement
0.21 – 0.40    Fair agreement
0.41 – 0.60    Moderate agreement
0.61 – 0.80    Substantial agreement
0.81 – 1.00    Almost perfect agreement
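
As a concrete illustration, the sketch below computes Cohen's kappa from a two-rater confusion matrix and maps the result onto the Landis and Koch bands above. This stub does not yet state the kappa formula, so the standard definition kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance, is assumed here; the function and variable names are illustrative.

def cohen_kappa(table):
    # table[i][j]: number of items placed in category i by rater A
    # and in category j by rater B (square matrix, two raters).
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n                      # observed agreement
    row_tot = [sum(table[i]) for i in range(k)]                       # rater A marginals
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    p_e = sum(row_tot[i] * col_tot[i] for i in range(k)) / n**2      # chance agreement
    return (p_o - p_e) / (1 - p_e)

def landis_koch(kappa):
    # Bands from the Landis and Koch table above.
    if kappa < 0:
        return "Poor agreement"
    for upper, label in [(0.20, "Slight agreement"),
                         (0.40, "Fair agreement"),
                         (0.60, "Moderate agreement"),
                         (0.80, "Substantial agreement"),
                         (1.00, "Almost perfect agreement")]:
        if kappa <= upper:
            return label

# Example: two raters classify 50 items into two categories.
table = [[20, 5],
         [10, 15]]
k = cohen_kappa(table)
print("kappa = %.2f (%s)" % (k, landis_koch(k)))   # kappa = 0.40 (Fair agreement)

In this example, 35 of the 50 items fall on the diagonal, giving an observed agreement of p_o = 0.70 against a chance expectation of p_e = 0.50, hence kappa = 0.40, which the schema labels fair agreement.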

References

1. Landis JR, Koch GG (1977). "The measurement of observer agreement for categorical data". Biometrics 33 (1): 159–74. PMID 843571