kappam_vanbelle {kappaGold}                                R Documentation
Agreement between two groups of raters
Description
This function extends Cohen's and Fleiss' kappa to measure interrater agreement between two groups of raters while taking the heterogeneity within each group into account.
Usage
kappam_vanbelle(
  ratings,
  refIdx,
  ratingScale = NULL,
  weights = c("unweighted", "linear", "quadratic"),
  conf.level = 0.95
)
Arguments
ratings
    matrix of subjects x raters containing the ratings of both groups of raters

refIdx
    numeric. Indices of the raters that constitute the reference group. Can also be all negative to define the rater group by exclusion.

ratingScale
    character vector of the levels of the rating scale, or NULL (the default).

weights
    optional weighting scheme for scoring disagreements: "unweighted", "linear" or "quadratic".

conf.level
    confidence level for interval estimation
Details
Ratings need to be stored with subjects in rows and raters in columns.
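For illustration, a minimal sketch of the expected layout, using simulated ratings rather than data shipped with the package; the subject and rater counts are arbitrary, and it is assumed that a character matrix is accepted when ratingScale is left at NULL.

set.seed(1)
# 30 subjects in rows, 8 raters in columns; raters 1-3 form the reference group
# (character ratings; the rating scale is assumed to be inferred from the data)
ratings <- matrix(sample(c("low", "mid", "high"), 30 * 8, replace = TRUE),
                  nrow = 30, ncol = 8,
                  dimnames = list(paste0("subj", 1:30), paste0("rater", 1:8)))
kappam_vanbelle(ratings, refIdx = 1:3)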
Value
list. Estimated kappa agreement between the two groups of raters
References
Vanbelle, S., Albert, A. Agreement between Two Independent Groups of Raters. Psychometrika 74, 477–491 (2009). doi:10.1007/s11336-009-9116-1
Examples
# compare student ratings with ratings of 11 experts
kappam_vanbelle(SC_test, refIdx = 40:50)
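
# A further illustrative call, not taken from the package documentation,
# assuming an ordinal scale coded 1 to 4: the reference group is defined by
# exclusion via negative indices, quadratic weights are used, and a 90%
# confidence level is requested.
set.seed(42)
# 25 subjects, 6 raters; raters 5 and 6 act as the reference group,
# specified here by excluding raters 1-4 (all-negative refIdx)
sim <- matrix(sample(1:4, 25 * 6, replace = TRUE), nrow = 25, ncol = 6)
kappam_vanbelle(sim, refIdx = -(1:4), weights = "quadratic", conf.level = 0.9)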