Identifying Sources of Difference in Reliability in Content Analysis

Authors

  • Elizabeth Murphy
  • Justyna Ciszewska-Carr

DOI:

https://doi.org/10.19173/irrodl.v6i2.233

Abstract

This paper reports on a case study which identifies and illustrates sources of difference in agreement, in relation to reliability, in the context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD). Transcripts of 10 students in a month-long OAD were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR). Sources of difference were identified in relation to coders, tasks, and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding, that of the indicator, the overall level of reliability between coders was .591 as measured with Cohen’s kappa. Kappa values at the same level ranged from .349 to .664 across tasks, and from .390 to .907 across participants. Implications for training and research are discussed.

Keywords: content analysis; online discussions; reliability; Cohen's kappa; sources of difference; coding
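
The reliability figures reported in the abstract are Cohen's kappa coefficients, which correct the observed agreement between two coders for the agreement expected by chance. As an illustration only, and not code from the study, the minimal Python sketch below computes kappa for two coders' nominal codes; the function name and the sample indicator labels are hypothetical.

    from collections import Counter

    def cohen_kappa(codes_a, codes_b):
        """Cohen's kappa for two coders' nominal codes over the same units."""
        assert len(codes_a) == len(codes_b) and codes_a
        n = len(codes_a)
        # Observed agreement: proportion of units the two coders coded identically.
        p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        # Chance agreement: expected overlap given each coder's marginal frequencies.
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical indicator codes assigned by two coders to five message segments.
    coder_1 = ["PFR-01", "PFR-02", "PFR-02", "PFR-05", "PFR-01"]
    coder_2 = ["PFR-01", "PFR-02", "PFR-05", "PFR-05", "PFR-01"]
    print(round(cohen_kappa(coder_1, coder_2), 3))  # 0.706

For these two hypothetical coders, observed agreement is .80 and chance agreement is .32, so kappa is (.80 − .32) / (1 − .32) ≈ .706.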

Author Biographies

Elizabeth Murphy

Dr. Elizabeth Murphy is an Associate Professor of educational technology and second-language learning at Memorial University of Newfoundland. Her research focuses on content analysis of online discussions as well as on the design of web-based learning.

Justyna Ciszewska-Carr

Justyna Ciszewska-Carr is a graduate student in the Faculty of Education at Memorial University of Newfoundland, Canada. Her research interests include the use of technology in teaching and learning, as well as issues related to working with linguistically and culturally diverse students.

Published

2005-07-01

How to Cite

Murphy, E., & Ciszewska-Carr, J. (2005). Identifying Sources of Difference in Reliability in Content Analysis. The International Review of Research in Open and Distributed Learning, 6(2). https://doi.org/10.19173/irrodl.v6i2.233

Issue

Vol. 6 No. 2 (2005)

Section

Research Articles