Potential Impact of Item Parameter Drift Due to Practice and Curriculum Change on Item Calibration in Computerized Adaptive Testing


Kyung T. Han & Fanmin Guo

January 01, 2011

Overview

When test items are compromised over time, whether through security breaches, practice, or curriculum changes, item parameter drift (IPD) becomes a serious threat to test validity and fairness. Extensive research using simulated and real data has investigated the impact of IPD on item parameter and proficiency estimates. Most of these simulation studies, however, have oversimplified IPD by assuming that the drifted items are administered to all test takers. In practice, a compromised item is exposed to only a small percentage of test takers, so IPD affects only those examinees. This study used simulations to examine the impact of IPD on item calibration when drifted items were exposed to only a portion of test takers in a computerized adaptive testing (CAT) environment. Under the studied conditions, the results showed that the short-term effect of IPD on item calibration was very limited and largely inconsequential.
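To make the partial-exposure design concrete, the sketch below (not the authors' code) simulates a single item whose difficulty drifts for only a fraction of examinees and then recalibrates the item on the pooled responses. It assumes a two-parameter logistic (2PL) IRT model with known proficiencies, a hypothetical drift of 0.5 logits, and a 20% exposure rate; all names and numbers are illustrative choices, not values from the report.

```python
# Minimal sketch of a partial-exposure IPD simulation under a 2PL model.
# Assumptions (not from the report): drift = 0.5 logits, 20% exposure,
# proficiencies treated as known during recalibration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

N = 5000                        # examinees
theta = rng.normal(0, 1, N)     # true proficiencies
a, b_true = 1.2, 0.0            # true discrimination and difficulty
drift = 0.5                     # difficulty drift (logits) for exposed examinees
exposed = rng.random(N) < 0.20  # only ~20% of examinees see the drifted item

# Effective difficulty depends on whether the examinee saw the drifted item;
# here drift makes the item easier for exposed examinees.
b_eff = np.where(exposed, b_true - drift, b_true)
p = 1.0 / (1.0 + np.exp(-a * (theta - b_eff)))
y = (rng.random(N) < p).astype(float)  # simulated 0/1 item responses

def neg_loglik(params):
    """Negative log-likelihood of (a, b) for the pooled responses."""
    a_hat, b_hat = params
    p_hat = 1.0 / (1.0 + np.exp(-a_hat * (theta - b_hat)))
    p_hat = np.clip(p_hat, 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))

# Recalibrate the item by maximum likelihood on the mixed (exposed +
# unexposed) sample, as an operational recalibration implicitly does.
res = minimize(neg_loglik, x0=[1.0, 0.0], method="Nelder-Mead")
a_hat, b_hat = res.x
print(f"true b = {b_true:.2f}, recalibrated b = {b_hat:.2f}")
```

Because only a fraction of the sample responds under the drifted difficulty, the recalibrated difficulty shifts by roughly the exposure rate times the drift, which illustrates why partial exposure dilutes the short-term impact of IPD on item calibration.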