Title: Interrater Agreement Measures for Nominal and Ordinal Data
Abstract: This chapter focuses on three measures of interrater agreement that researchers use to assess reliability in content analyses: Cohen's kappa, Scott's pi, and Krippendorff's alpha. Statisticians generally consider kappa the most popular measure of agreement for categorical data. Weighted kappa became an important measure in the social sciences, allowing researchers to move beyond unordered nominal categories to measures containing ordered observations. The intraclass correlation coefficient (ICC) serves as a viable option for testing agreement when more than two raters assess ordinal content; a key concern in using an ICC as a measure of agreement is selecting the correct ICC statistic. Like the ICC, Kendall's coefficient of concordance indicates reliability with ordinal data. The chapter offers SPSS instructions for computing kappa and intraclass correlation coefficients.
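The chapter itself works through these statistics in SPSS; as a rough illustration of the underlying arithmetic, the standard Cohen's kappa formula, kappa = (p_o - p_e) / (1 - p_e), can be sketched for two raters and nominal codes (the function name and example data below are hypothetical, not from the chapter):

```python
# Minimal sketch of Cohen's kappa for two raters coding nominal categories.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return kappa = (p_o - p_e) / (1 - p_e) for two equal-length code lists."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes from two raters over six items.
a = ["pos", "pos", "neg", "neg", "pos", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "pos"]
print(round(cohens_kappa(a, b), 3))  # agreement corrected for chance
```

Scott's pi and Krippendorff's alpha differ mainly in how the expected-agreement term is estimated (pooled marginals for pi; a disagreement-based formulation for alpha), so kappa is the simplest of the three to compute by hand.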
Publication Year: 2016
Publication Date: 2016-10-07
Language: en
Type: other
Indexed In: ['crossref']
Access and Citation
Cited By Count: 5