Paper Summary

Demonstrating an Authoring Platform for Scenario-Based Assessment

Sun, April 27, 11:40am to 1:10pm MDT, The Colorado Convention Center, Floor: Terrace Level, Bluebird Ballroom Room 3F

Abstract

Objectives or purposes
Scenario-based assessments (SBAs) effectively support socioculturally responsive approaches to assessment. However, authoring such assessments has historically been challenging due to the unfamiliarity of SBA design principles and their complex technical requirements. This demonstration will highlight our progress toward an authoring platform that enables college professors and other instructors to create their own local assessments built on SBA design principles.

Theoretical framework
We build on the scenario design approach embodied in Sabatini, O'Reilly, Weeks, and Wang (2020) and Deane et al. (2021), as adapted to culturally responsive assessment by Bennett (2023) and Walker et al. (2023). SBA design enables delivery of authentic, culturally situated items that simulate steps in a performance task. The core of SBA design is a meaningful scenario: a series of steps that accomplish some larger purpose. The SBA design process translates these steps into concrete tasks using evidence-centered design (Mislevy, Almond, & Lukas, 2003). SBA design has the potential to support personalization, localization, and multimodal (rather than text-only) forms of interaction.

Methods
We demonstrate an authoring platform that combines modern web technologies (specifically React and Redux) with AI technologies (text-to-speech, speech-to-text, and large language models that provide personalized feedback and support dialogic interactions). The platform encapsulates these complexities so that instructors can author through simple web or text interfaces. It also captures process data, enabling student- and teacher-facing dashboards that provide formative feedback. We validate design decisions through user-based research (focus groups, cognitive labs, and interviews).
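As an illustration of the process-data capture described above, the sketch below shows one way a Redux store might record student actions as timestamped events for later dashboard aggregation. All names and event types here are hypothetical, chosen for exposition; they are not the platform's actual API.

```javascript
// Hypothetical sketch: capturing process data as Redux events.
// Each student action dispatched to the store is appended immutably,
// so dashboards can later replay or aggregate the event stream.

const initialState = { events: [] };

// Reducer: appends each recognized process-data event; ignores others.
function processDataReducer(state = initialState, action) {
  switch (action.type) {
    case "scenario/stepCompleted":
    case "scenario/responseSubmitted":
      return {
        ...state,
        events: [...state.events, { ...action.payload }],
      };
    default:
      return state;
  }
}

// Example: replaying two recorded student actions through the reducer.
let state = initialState;
state = processDataReducer(state, {
  type: "scenario/stepCompleted",
  payload: { step: "plan-argument", timestamp: 1714230000000 },
});
state = processDataReducer(state, {
  type: "scenario/responseSubmitted",
  payload: { step: "draft-essay", timestamp: 1714230120000 },
});
```

Because the reducer never mutates prior state, every intermediate state remains available, which is what makes fine-grained formative feedback from process data straightforward to compute.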

Data sources
Our primary data at this stage derives from instructors at several participating institutions across subjects including freshman composition, psychology, and interdisciplinary studies. We have designed and implemented model SBAs for these courses, which will be demonstrated during the session.

Scientific significance
Our framework supports the capture of rich process data needed to deliver and score socioculturally responsive assessments while respecting student privacy and enabling an open science platform.

References
Bennett, R. E. (2023). Toward a theory of socioculturally responsive assessment. Educational Assessment, 28(2), 83-104.
Deane, P., Wilson, J., Zhang, M., Li, C., van Rijn, P., Guo, H., ... & Richter, T. (2021). The sensitivity of a scenario-based assessment of written argumentation to school differences in curriculum and instruction. International Journal of Artificial Intelligence in Education, 31, 57-98.
Hood, S. (1998). Culturally responsive performance-based assessment: Conceptual and psychometric considerations. Journal of Negro Education, 187-196.
Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Research Report Series, 2003(1), i-29.
Sabatini, J., O'Reilly, T., Weeks, J., & Wang, Z. (2020). Engineering a twenty-first century reading comprehension assessment system utilizing scenario-based assessment techniques. International Journal of Testing, 20(1), 1-23.
Walker, M. E., Olivera-Aguilar, M., Lehman, B., Laitusis, C., Guzman-Orth, D., & Gholson, M. (2023). Culturally responsive assessment: Provisional principles. ETS Research Report Series, 2023.