Paper Summary

An Investigation of Response Time Differences on a Certification Examination With Multiple Item Formats

Sun, April 15, 10:35am to 12:05pm, Vancouver Convention Centre, Floor: First Level, East Ballroom A

Abstract

Research on examination response time has focused on traditional multiple-choice items; consequently, the impact of innovative or complex item formats on response times is not well understood. The present study applied a multilevel modeling framework to investigate examinee characteristics associated with response time differences on a high-stakes examination comprising multiple item formats. Results showed that a linear growth model described examinee pacing on the traditional multiple-choice section, whereas a growth curve model described pacing on a diagnostic study section composed of complex, graphic-intensive selected-response items. Examinees’ gender, ability, and age explained variability in response times on each exam section. These findings have implications for test developers who intend to incorporate complex item formats into their high-stakes examinations.
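The linear growth model of examinee pacing mentioned above can be illustrated, in highly simplified form, as a two-stage fit of per-examinee linear trends in log response time across item positions. This is an illustrative sketch on simulated data, not the study's actual multilevel specification; all parameter values below are invented for demonstration:

```python
import numpy as np

rng = np.random.default_rng(7)
n_examinees, n_items = 50, 40

# Simulate log response times with a person-specific linear pacing trend:
# each examinee has a baseline speed (intercept) and a gradual change in
# pace across item positions (slope). Values are arbitrary for illustration.
positions = np.arange(n_items)
intercepts = rng.normal(4.0, 0.3, n_examinees)    # baseline log-seconds per item
slopes = rng.normal(-0.01, 0.005, n_examinees)    # slight speed-up over the test
log_rt = (intercepts[:, None]
          + slopes[:, None] * positions
          + rng.normal(0, 0.2, (n_examinees, n_items)))

# Stage 1: fit a straight line to each examinee's log response times.
fitted = np.array([np.polyfit(positions, log_rt[i], 1) for i in range(n_examinees)])
person_slopes, person_intercepts = fitted[:, 0], fitted[:, 1]

# Stage 2: summarize the growth parameters across examinees.
print(f"mean pacing slope: {person_slopes.mean():+.4f} log-seconds per item")
print(f"mean intercept:    {person_intercepts.mean():.2f} log-seconds")
```

A full multilevel (mixed-effects) analysis would instead estimate the person-level intercepts and slopes jointly with examinee covariates such as gender, ability, and age, but the two-stage version above conveys the same idea: pacing is modeled as a per-examinee linear trajectory over item position.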

Authors