Evaluating the Construct Validity of a Basic Science Curriculum Assessment Instrument for Critical Thinking: A Case Study
Chau-Kuang Chen, Adriana Marie Horner, Michelle Scott, Stephanie C. McClure
The Rasch model provides a practical framework for evaluating the construct validity of assessment instruments: it can determine how well estimates of person ability (endorsement) and item difficulty match each other. This study aimed to evaluate the psychometric properties (reliability, validity, and utility) of a basic science curriculum assessment instrument. Special emphasis was placed on identifying strengths and challenges in the curriculum and on detecting any multidimensional structure. A total of 130 medical students in the 2016/17 academic year completed a 22-item assessment instrument. The study involved three major steps. First, person ability and item difficulty parameters were estimated separately. Second, infit/outfit mean-square residuals and the standardized residual variance from a principal component analysis (PCA) were used to check the unidimensionality assumption. Third, differential item functioning (DIF) was assessed to determine the fairness of the assessment instrument. As a result, baseline measures of the strengths and challenges in the medical curriculum were established for continuous quality improvement. However, the unexplained variance in the first contrast of the PCA (3.08) exceeded the criterion of 2.0, indicating some violation of the unidimensionality assumption. The instrument should therefore be revised further before future application.
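The dichotomous Rasch model and the infit/outfit mean-square residuals mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' analysis pipeline: it assumes person ability and item difficulty parameters have already been estimated (parameter estimation itself, e.g. joint maximum likelihood, is not shown), and the function and variable names (`rasch_prob`, `fit_statistics`, `theta`, `b`) are the writer's own.

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct response) under the dichotomous Rasch model:
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def fit_statistics(responses, theta, b):
    """Infit and outfit mean-square residuals for each item.

    responses : (persons x items) 0/1 matrix of scored answers
    theta     : person ability estimates, shape (persons,)
    b         : item difficulty estimates, shape (items,)
    """
    p = rasch_prob(theta[:, None], b[None, :])      # expected scores
    var = p * (1.0 - p)                             # model response variance
    resid_sq = (responses - p) ** 2                 # squared residuals
    # Outfit: unweighted mean of squared standardized residuals
    outfit = (resid_sq / var).mean(axis=0)
    # Infit: information-weighted mean square (weights = variance)
    infit = resid_sq.sum(axis=0) / var.sum(axis=0)
    return infit, outfit

# Synthetic demonstration: 500 simulated examinees, 22 items, as in a
# 22-item instrument; the sample size and seed are arbitrary choices.
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, 500)
b = np.linspace(-2.0, 2.0, 22)
p = rasch_prob(theta[:, None], b[None, :])
x = (rng.random((500, 22)) < p).astype(float)
infit, outfit = fit_statistics(x, theta, b)
```

When the data fit the model, both mean squares are expected to be close to 1.0; values far above 1.0 flag underfit (noisy, unexpected responses) and values well below 1.0 flag overfit (overly deterministic responses). Cut-off ranges such as roughly 0.5 to 1.5 are commonly used in practice, though the appropriate criterion depends on the application.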