Paper Summary

Automated Teacher Feedback From a Game-Based Formative Assessment Targeting Data Science Concepts (Poster 3)

Thu, April 21, 8:00 to 9:30am PDT, Marriott Marquis San Diego Marina, Floor: North Tower, Ground Level, Pacific Ballroom 18

Abstract

Data literacy is considered a critical competency in US schools (Mandinach & Gummer, 2013); it encompasses an understanding of data sources and collection mechanisms, as well as the analytical methods used to interpret, transform, and visualize data to answer questions and solve problems. K-12 teachers need support to develop and evaluate their students’ data literacy skills in engaging and meaningful ways.
Game-based assessments can provide interactive spaces for students to engage with concepts and practices in authentic and meaningful settings, and gameplay data has also been shown to be helpful for assessing student understanding of the concepts underlying the game (Bauer et al., 2017). However, translating student actions in the game into meaningful inferences about student proficiencies that are usable by teachers is not straightforward.
We designed ‘Beats Empire’ as a formative assessment game to provide middle school teachers with information about their students’ abilities related to concepts in the data and analysis strand of the K-12 CS framework (Parker & DeLyser, 2017). Beats Empire is a web-based, stand-alone tool that is not subject-matter dependent and serves as a lightweight addition to regular classroom activities. It can be used in multiple short sessions or in fewer, longer sessions, so teachers can adapt it to their needs.
We evaluated student understanding from three mutually complementary perspectives: 1) an Evidence-Centered Design (ECD) approach to generate hypotheses about how student proficiencies might be surfaced by student activity (Mislevy & Haertel, 2016); 2) an educational data mining (EDM) approach to evaluate those hypotheses; and 3) a participatory design process (DiSalvo, 2016) that brought teachers into the design and allowed us to iterate on how to structure and present feedback to teachers via an automated dashboard that reports the results of the ECD and EDM analyses.
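As one illustration of how an ECD-generated hypothesis might be checked with an EDM-style analysis, the minimal sketch below correlates a gameplay-derived feature with an external measure of the targeted concept. The feature, the external measure, and the function name are illustrative assumptions, and the Spearman correlation stands in for whatever modeling is actually used; this is a sketch, not the project's analysis pipeline.

```python
# Minimal sketch, assuming a gameplay-derived feature (e.g., how often a student
# consults trend data before acting) and an external score on the targeted data
# concept. Names and the choice of statistic are illustrative assumptions.
import numpy as np
from scipy import stats

def evaluate_hypothesis(gameplay_feature: np.ndarray,
                        external_score: np.ndarray) -> tuple[float, float]:
    """Return the Spearman correlation (rho, p-value) between a gameplay feature
    and an external measure, as one simple check of an ECD hypothesis."""
    rho, p = stats.spearmanr(gameplay_feature, external_score)
    return float(rho), float(p)

# Toy illustration only (fabricated numbers, not study data):
# rho, p = evaluate_hypothesis(np.array([2, 5, 1, 7]), np.array([60, 85, 55, 90]))
```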
In consultation with teachers, we determined that the most common use cases for the dashboard would be: 1) in-the-moment feedback on students' engagement; 2) class-wide overviews of patterns of in-game actions; and 3) deep examinations of an individual student's gameplay. We then redesigned the dashboard to include those three sections explicitly; they are the first options seen. Furthermore, the first page now presents a visualization of the current in-game activity category for each student in the class.
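To illustrate the kind of log aggregation such a dashboard relies on, the sketch below derives the per-student current-activity view and class-wide action counts from a simple event log. The event schema, field names, and category labels are illustrative assumptions, not Beats Empire's actual telemetry format.

```python
# Minimal sketch, assuming each logged game event records a student id, a
# timestamp, and an activity category; all names here are illustrative.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class GameEvent:
    student_id: str
    timestamp: float   # seconds since the session started (assumed)
    category: str      # e.g., "collect_data", "inspect_trends", "record_song"

def current_activity(events: list[GameEvent]) -> dict[str, str]:
    """Most recent in-game activity category per student (first dashboard page)."""
    latest: dict[str, GameEvent] = {}
    for ev in events:
        prev = latest.get(ev.student_id)
        if prev is None or ev.timestamp > prev.timestamp:
            latest[ev.student_id] = ev
    return {sid: ev.category for sid, ev in latest.items()}

def class_overview(events: list[GameEvent]) -> dict[str, int]:
    """Class-wide counts of in-game actions by category (overview page)."""
    counts: dict[str, int] = defaultdict(int)
    for ev in events:
        counts[ev.category] += 1
    return dict(counts)
```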
Iterative participatory design with teachers is difficult. Our work situates the design process within multiple theoretical frameworks and prior research so that we can make concrete, verifiable improvements to our design. As there is relatively little literature on designing for teachers engaged in game-based assessment, we believe that our process, its outcomes, and our evidence of success will be specifically useful for others entering this space, and that they suggest a new way to bring ECD and EDM data to bear on participatory design with teachers.
