In the process of educating for change, we must strategically design assessment to examine how well our students are learning. This subject is important but easily neglected by educators or misrepresented in the education field. This research applied Item Response Theory (IRT), a contemporary measurement theory that models the relationship between the probability of an item response and the underlying proficiency being measured, to examine the psychometric properties of binary (true-or-false) question items designed to check how much students have learned in a web-based learning program, based on a sample of Hong Kong Chinese students. The IRT analysis procedure is illustrated, from checking model assumptions and calibrating items to assessing goodness of fit.

Principal results of this research offer information for estimating item discrimination and item difficulty for each question item, producing estimates of the proficiency level of each student, and providing item information to indicate how well an individual item contributes to the assessment of learning along a continuum ranging from low to high proficiency levels. In this way, the IRT approach offers useful information for the design, diagnosis and revision of question items. For example, items with high information value are particularly useful and should be retained, whereas items with low information value contribute little and could be considered for removal.

In conclusion, this research puts forward an IRT approach that can be widely applied to design and modify assessment items, so that assessment of learning can be better suited to the discipline, culture and technology in context.
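The relationship the abstract describes, with a discrimination and a difficulty parameter for each binary item, corresponds to the two-parameter logistic (2PL) IRT model. The following is a minimal sketch of that model and its item information function, not the authors' actual calibration code; the item parameter values are hypothetical, chosen only to illustrate how a highly discriminating item carries more information than a weakly discriminating one.

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL IRT model: probability of a correct (true) response
    given proficiency theta, item discrimination a, and item
    difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at proficiency theta:
    I(theta) = a^2 * P(theta) * (1 - P(theta))."""
    p = p_2pl(theta, a, b)
    return a**2 * p * (1.0 - p)

# Proficiency continuum from low to high, as in the abstract.
theta = np.linspace(-3, 3, 121)

# Hypothetical items, both of medium difficulty (b = 0):
high_info = item_information(theta, a=2.0, b=0.0)  # strongly discriminating
low_info = item_information(theta, a=0.4, b=0.0)   # weakly discriminating

# The strongly discriminating item peaks sharply near its difficulty,
# so it contributes far more to measurement there and would be retained;
# the flat, low-information item is a candidate for removal.
print(high_info.max())  # peaks at a^2 / 4 = 1.0
print(low_info.max())   # peaks at a^2 / 4 = 0.04
```

Information peaks at theta = b, where the response probability is 0.5; this is why item difficulty determines where along the proficiency continuum an item measures best.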
Jenny Mei Yiu Huen, The University of Hong Kong, Hong Kong
Yue Zhao, The University of Hong Kong, Hong Kong
Paul Siu Fai Yip, The University of Hong Kong, Hong Kong
Stream: e-Assessment and new Assessment Theories and Methodologies
This paper is part of the ACTC2017 Conference Proceedings.