Weighted Kappa for Multiple Raters
Interrater agreement of two adverse drug reaction causality assessment methods: A randomised comparison of the Liverpool Adverse Drug Reaction Causality Assessment Tool and the World Health Organization-Uppsala Monitoring Centre system (PLOS ONE)
Kappa Coefficient for Dummies: How to measure the agreement between… by Aditya Kumar, AI Graduate (Medium)
Weighted kappa is higher than Cohen's kappa for tridiagonal agreement tables
Capturing the Context of Maternal Deaths from Verbal Autopsies: A Reliability Study of the Maternal Data Extraction Tool (M-DET) (PLOS ONE)
Digital versus analogue record systems for mass casualty incidents at sea—Results from an exploratory study (PLOS ONE)
Weighted kappa as a function of unweighted kappas
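A minimal sketch of the effect the tridiagonal-table article above describes: when two raters never disagree by more than one ordinal category, linearly weighted kappa comes out higher than unweighted Cohen's kappa. The ratings below are invented illustration data, and scikit-learn's `cohen_kappa_score` is assumed to be available.

```python
# Illustration (hypothetical data): unweighted vs. linearly weighted Cohen's
# kappa for two raters scoring the same items on a 5-point ordinal scale.
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 1, 2, 2, 3, 3, 3, 4, 4]
rater_b = [0, 1, 2, 2, 3, 3, 3, 4, 4, 4]  # disagreements are all one category apart

# Unweighted kappa treats every disagreement as equally serious.
unweighted = cohen_kappa_score(rater_a, rater_b)

# Linear weights penalise a near-miss (off by one category) less than a
# disagreement across the whole scale.
linear = cohen_kappa_score(rater_a, rater_b, weights="linear")

print(f"unweighted kappa:      {unweighted:.3f}")
print(f"linear-weighted kappa: {linear:.3f}")
```

Because the joint rating table here is tridiagonal (no disagreement exceeds one category), the weighted coefficient exceeds the unweighted one, matching the relationship the listed articles examine.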