Kappa statistic

From Citizendium
{{subpages}}
==Interpretation==
Landis and Koch<ref name="pmid843571">{{cite journal |author=Landis JR, Koch GG |title=The measurement of observer agreement for categorical data |journal=Biometrics |volume=33 |issue=1 |pages=159–74 |year=1977 |pmid=843571 |doi=}}</ref> proposed the schema in the table below for interpreting <math>\kappa</math> values.
 
{|class=wikitable
|+Proposed interpretation of <math>\kappa</math> values
! <math>\kappa</math> !! Interpretation
|-
|align=center| < 0 || Poor agreement
|-
|align=center| 0.0 &ndash; 0.20 || Slight agreement
|-
|align=center| 0.21 &ndash; 0.40 || Fair agreement
|-
|align=center| 0.41 &ndash; 0.60 || Moderate agreement
|-
|align=center| 0.61 &ndash; 0.80 || Substantial agreement
|-
|align=center| 0.81 &ndash; 1.00 || Almost perfect agreement
|}
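As a rough sketch, the thresholds in the table above can be expressed as a small lookup function. This is an illustrative helper only (the function name <code>interpret_kappa</code> is an assumption, not part of any standard library), following the Landis and Koch cut-points exactly:

```python
def interpret_kappa(kappa):
    """Map a kappa value to the Landis and Koch interpretation label."""
    if kappa < 0:
        return "Poor agreement"
    elif kappa <= 0.20:
        return "Slight agreement"
    elif kappa <= 0.40:
        return "Fair agreement"
    elif kappa <= 0.60:
        return "Moderate agreement"
    elif kappa <= 0.80:
        return "Substantial agreement"
    else:
        return "Almost perfect agreement"

# Example: a kappa of 0.57 falls in the 0.41 - 0.60 band.
print(interpret_kappa(0.57))  # Moderate agreement
```

Note that the bands are conventional rather than derived; other authors have proposed different cut-points.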
 
==References==
<references/>
