Paper Summary

Case Study: LE in the Air Force (Poster 5)

Thu, April 24, 5:25 to 6:55pm MDT, The Colorado Convention Center, Floor: Terrace Level, Bluebird Ballroom Room 2A

Abstract

Systematic approaches to the design of instructional materials have a rich history in the United States military, including the United States Air Force (Garcia & Ozogul, 2023), and these foundational methods are evolving with modern concepts such as Learning Engineering, which has been proposed as part of an ecosystem that can modernize learning in the federal government (Walcutt & Schatz, 2019). While anecdotal examples from various military branches are beginning to emerge, none have yet appeared in published research. In this presentation, we will demonstrate how the Learning Engineering Process (LEP) (Kessler et al., 2022) was used in several projects at the Global College of PME (GCPME) at Air University to enhance the online learning experience for the approximately 14,000 Active Duty, Guard, Reserve, and Civilian students that GCPME graduates each year. Each highlighted project will focus on essential components of the LEP:
1. The central challenge
2. The solution created to address the challenge
3. The implementation plan, including data collection
4. The investigation phase, involving data analysis
The first project aimed to improve assessment rubrics used across various courses and programs at GCPME. The central challenge had two aspects: reducing the time instructors spent completing the rubric and providing students with more specific feedback to strengthen their writing process. The proposed solution was to rewrite two rubrics and pilot them in several courses. To evaluate the pilot, data sources were identified, including purpose-built surveys for instructors and students and assignment grade data from the LMS, and data were collected from each. After implementation, the data were analyzed to determine whether the rubric effectively addressed the central challenge. The initial iteration showed promising results, and after the rubric was modified, further iterations of the LEP were conducted, resulting in a rubric that better met the needs of both students and instructors.
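To make the investigation phase concrete, here is a minimal sketch of the kind of grade comparison such a pilot could use. The file name, column names, and the choice of a Welch t-test are illustrative assumptions, not the actual GCPME data or analysis.

```python
# Illustrative sketch of a pilot-rubric grade comparison.
# The CSV export, its columns, and the test choice are hypothetical;
# they are not the actual GCPME data or analysis.
import pandas as pd
from scipy import stats

grades = pd.read_csv("lms_grades_export.csv")  # assumed columns: rubric_version, score

pilot = grades.loc[grades["rubric_version"] == "revised", "score"]
control = grades.loc[grades["rubric_version"] == "original", "score"]

# Welch's t-test (unequal variances): did scores shift under the new rubric?
t_stat, p_value = stats.ttest_ind(pilot, control, equal_var=False)
print(f"pilot mean={pilot.mean():.1f}, control mean={control.mean():.1f}")
print(f"t={t_stat:.2f}, p={p_value:.3f}")
```

Survey responses could be summarized the same way, with each subsequent LEP iteration rerunning the comparison against the newly revised rubric.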
The second project involved modifying a suite of courses on U.S. national security to better scaffold the process of writing an analysis of a Cold War-era strategy document. The central challenge was addressed by implementing a collaborative annotation exercise using the Perusall learning technology. During the pilot test, data were gathered from several sources: a student survey about the new tool, grades on the final essay assignment, and learning analytics from Perusall itself. Analysis indicated that Perusall activity (instrumented as the number of comments, comment length, and engagement time) was not, by itself, a strong predictor of student grades on the writing assignment (Chen, Watts, & Nyland, 2023). Although a more qualitative analysis might surface a stronger signal, this result suggests that the activity did not have the intended impact and should potentially be reconsidered.
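As a hedged illustration of the predictor analysis described above, the sketch below regresses essay grades on the three instrumented activity measures. The export file and column names are assumptions; a weak fit (low R-squared, non-significant coefficients) would mirror the reported finding.

```python
# Illustrative sketch of regressing essay grades on Perusall activity
# metrics. The export file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("perusall_metrics.csv")
# assumed columns: n_comments, avg_comment_len, engagement_min, essay_grade

X = sm.add_constant(df[["n_comments", "avg_comment_len", "engagement_min"]])
model = sm.OLS(df["essay_grade"], X).fit()

# Low R-squared and non-significant coefficients would correspond to the
# finding that raw activity alone did not predict writing grades.
print(model.summary())
```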
Both examples demonstrate to other organizations in military and government contexts how the LEP can be used to address stakeholder challenges in a way that is grounded in data.

Author