Paper Summary

Gesture and Embodied Actions Across the XR Spectrum: “Fleshing-Out” Virtualized Learner Profiles

Sat, April 13, 3:05 to 4:35pm, Pennsylvania Convention Center, Floor: Level 100, Room 117

Abstract

This paper focuses on new multimodal forms of assessment (Walkington et al., 2023), presenting three separate studies performed across three platforms associated with the XR spectrum (Milgram & Kishino, 1994). The first study examines a LiDAR-enabled AR mobile application for teaching graphing skills, the second is an MR study with passive haptics, and the third is a VR STEM game called “Save the Building.”

XR instructional designs should take advantage of the profound affordances of XR, namely the embodiment and agency that come from manipulating content in 3D (Johnson-Glenberg, 2018). Interacting with content in 2D is a powerful predictor of learning, sometimes called the “doer” effect (Koedinger et al., 2016); manipulating objects in 3D can be more immersive. Embodied studies report that when action is designed into the lesson, learners exhibit higher knowledge gains and better retention of content. However, as modal signals increase beyond the typical auditory and visual channels, cognitive overload may also increase. The field needs evidence-based XR guidelines and new methods of assessment based on action and embodiment, which this paper highlights with three studies.

AR – Embodied Graphing. An application called “motion visualizer” was created to understand how students use and learn from the Physics Toolbox Sensor Suite. The app uses LiDAR on iOS mobile devices, which emits an array of infrared laser pulses and senses the reflected pulses via light’s time of flight. Using precise, real-time data, students create embodied graphs of both position and velocity over time as they walk toward and away from objects in the environment. A formal study with college students and an exploratory study with 7th graders were conducted. Whiteboard artifacts were gathered from the middle schoolers as they adjusted their hand-drawn models at three time points during the 1.5-hour session. We are currently optimizing motion-capture methodologies to better understand how dyads move and make sense with the app.
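To make the sensing-and-graphing idea concrete, the following is a minimal sketch (not the app's actual code) of how a round-trip time-of-flight reading converts to distance and how velocity can be estimated from successive position samples; the sampling interval, values, and variable names are assumptions for illustration only.

```python
# Hypothetical sketch: converting LiDAR round-trip pulse times to position
# and estimating velocity from successive position samples. The sampling
# interval and values are illustrative assumptions, not study data.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface; the pulse travels out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def velocities(timestamps: list[float], positions: list[float]) -> list[float]:
    """Finite-difference velocity estimates between consecutive samples."""
    return [
        (positions[i + 1] - positions[i]) / (timestamps[i + 1] - timestamps[i])
        for i in range(len(positions) - 1)
    ]

# Example: a student walking away from a wall, sampled every 0.2 s.
times = [0.0, 0.2, 0.4, 0.6]
round_trips = [1.0e-8, 1.2e-8, 1.4e-8, 1.6e-8]  # seconds, out and back
pos = [distance_from_time_of_flight(t) for t in round_trips]
vel = velocities(times, pos)
print(pos)  # points for the position-time graph (meters)
print(vel)  # points for the velocity-time graph (meters/second)
```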

MR – Titration with Haptics. A 3D-printed burette device was tethered to a laptop-based chemistry titration lesson to mimic the feel of a real, tangible burette. During the 1-hour randomized controlled trial, students either turned the burette valve or used more traditional computer keys to control a virtual burette. We discuss how gestures differed between groups during a scored recall. Additionally, while there were no significant main effects of the haptic condition, we find interactions with treatment condition for students with lower prior knowledge.
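The tethering details are not specified here, but as a purely illustrative sketch, a 3D-printed valve with a rotation sensor might stream its angle over a serial link and drive the virtual burette's flow rate; the port name, message format, and linear flow mapping below are assumptions, not the study's implementation.

```python
# Hypothetical sketch of mapping a physical valve's rotation to virtual
# titrant flow. The serial port, message format ("<angle in degrees>\n"),
# and linear flow mapping are assumptions; the study's tether is not
# described at this level of detail.
import serial  # pyserial

MAX_ANGLE_DEG = 90.0      # assumed fully open valve position
MAX_FLOW_ML_PER_S = 2.0   # assumed flow rate when fully open

def flow_rate(valve_angle_deg: float) -> float:
    """Clamp the reported valve angle and map it linearly to a flow rate."""
    angle = max(0.0, min(valve_angle_deg, MAX_ANGLE_DEG))
    return MAX_FLOW_ML_PER_S * angle / MAX_ANGLE_DEG

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    while True:
        line = port.readline().decode().strip()
        if line:
            print(f"virtual burette flow: {flow_rate(float(line)):.2f} mL/s")
```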

VR – “Save the Building”. We have created a highly embodied structural engineering lesson on making tall buildings earthquake-proof. Players in VR can set variables on a Tuned Mass Damper. They test its viability by “teleporting to a helicopter” and starting an earthquake, controlling the earthquake’s magnitude via the velocity of their hand-controller movements. In this ongoing study we are gathering multiple performance variables, including measures of cognitive load (pupil dilation, EEG, and self-report), in-game performance, and traditional pre- and posttests. We are specifically looking at how self-regulated learning and metacognitive/reflective prompts (Geden et al., 2020) can be embedded in STEM VR games to increase learning.
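As a hedged illustration of the controller-velocity mechanic (not the game's actual implementation), the sketch below maps hand-controller speed to an earthquake magnitude; the threshold, saturation speed, and linear mapping are assumed for the example.

```python
# Hypothetical sketch of the hand-speed-to-earthquake-magnitude mechanic.
# The threshold, saturation speed, and linear mapping are assumptions about
# the game's design, included only for illustration.

MIN_SPEED = 0.2      # m/s below which no quake is triggered (assumed)
MAX_SPEED = 3.0      # m/s at which the magnitude saturates (assumed)
MAX_MAGNITUDE = 9.0  # cap on the simulated earthquake magnitude (assumed)

def quake_magnitude(controller_speed_m_per_s: float) -> float:
    """Linearly map hand-controller speed above a threshold to a magnitude."""
    clamped = max(MIN_SPEED, min(controller_speed_m_per_s, MAX_SPEED))
    fraction = (clamped - MIN_SPEED) / (MAX_SPEED - MIN_SPEED)
    return MAX_MAGNITUDE * fraction

print(quake_magnitude(1.5))  # a moderate swing of the controllers -> ~4.2
```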

Authors