Category:Cohen's Kappa Statistic
This category contains results about Cohen's Kappa Statistic.
Definitions specific to this category can be found in Definitions/Cohen's Kappa Statistic.
Let two observers $A$ and $B$ independently classify each of a set of observations into $2$ or more categories.
Cohen's kappa statistic $\kappa$ is a measure of agreement between $A$ and $B$.
Let there be $N$ observations.
Let $n$ denote the number of agreements over all categories.
Let $p_{\mathrm {obs} } := \dfrac n N$ be the observed proportion of agreements.
Let $p_{\mathrm {exp} }$ denote the expected proportion of agreements over all categories under random assignment, calculated from the marginal totals of the contingency table: for each category, multiply $A$'s marginal proportion by $B$'s marginal proportion, then sum over all categories.
Then:
- $\kappa = \dfrac {p_{\mathrm {obs} } - p_{\mathrm {exp} } } {1 - p_{\mathrm {exp} } }$
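The definition above can be sketched in code. This is a minimal illustration, not part of the source; the function name `cohens_kappa` and the use of raw label sequences (rather than a pre-built contingency table) are assumptions made for the example.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' category labels on the same N observations."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # p_obs: observed proportion of agreements, n_agree / N
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # p_exp: expected proportion of agreements under random assignment,
    # summing the products of the two raters' marginal proportions per category
    count_a = Counter(ratings_a)
    count_b = Counter(ratings_b)
    p_exp = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)
```

For example, with ratings `['x', 'x', 'y', 'y']` and `['x', 'y', 'y', 'y']` we get $p_{\mathrm {obs} } = \frac 3 4$, $p_{\mathrm {exp} } = \frac 1 2$, and hence $\kappa = \frac 1 2$.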