Paper Summary

Overcoming Instructors’ Design Differences to Deliver Equitable Learning Opportunities and Algorithms That Predict Science Achievement

Thu, April 11, 4:20 to 5:50pm, Pennsylvania Convention Center, Floor: Level 100, Room 108B

Abstract

Objective & Theoretical Framework: Improvements in online learning environments (Greenhow et al., 2022) and the promotion of active learning pedagogies in science education afford instructors opportunities to provide students with rich digital resources for learning (Eddy & Hogan, 2014; Lombardi et al., 2021). Students' use of these materials can be observed through learning analytics, and instructors who partner with learning analysts can understand and even predict students' learning (Lang et al., 2017; Kizilcec & Lee, 2021). Such analytic insights are predicated on (a) course instructors' adoption of a master course that provides rich digital content to many learners, (b) students' production of a sufficient volume of learning events (Bernacki, 2018) to represent learning processes and inform tools that understand and predict learning, and (c) instructors' collaboration with learning analysts to apply prediction models and offer learning support to students. These conditions are difficult to meet when students enroll in smaller sections (N ≤ 40) taught by different instructors who are free to enact their own course designs. Across any two sections, the type and amount of digital content can vary drastically, which, combined with small sample sizes, limits the ability to understand, predict, or support learning.

We present a method to overcome these challenges by (1) leveraging a common learning space, the online laboratory component of the course, and (2) providing open access to a resource commons (Figure 1) where all students can engage with digital materials that offer equitable support for their learning. We aimed to provide these materials to support active learning, and to use the resulting learning events to inform a prediction model that identified students who might benefit from learning support, based on poor performance on their first unit exam (i.e., C-or-Better vs. D-or-Worse).

Methods & Data: Exam grades and digital events from 434 students enrolled in 10 sections of an Anatomy and Physiology I course were collected and submitted to feature engineering and prediction modeling using the Splunk Machine Learning Toolkit. Exam grades were regressed (logistic regression, 10-fold cross-validation) on 245 candidate predictors, including traces of students' use of learning modules and the content within them (e.g., learning objectives, outlines, self-quizzes, active learning worksheets, base art images).
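
The abstract names the Splunk Machine Learning Toolkit as the modeling tool; as an illustrative sketch only, not the authors' pipeline, the following Python/scikit-learn code shows the shape of such an analysis: a 10-fold cross-validated logistic regression of a binarized exam outcome on engineered event-count features. All data and variable names here are hypothetical stand-ins.

```python
# Minimal sketch (not the authors' Splunk pipeline): regress a binary Exam 1
# outcome on engineered digital-event features with 10-fold cross-validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict

rng = np.random.default_rng(0)
X = rng.poisson(3.0, size=(434, 245)).astype(float)  # stand-in for 245 candidate event-count predictors
y = rng.integers(0, 2, size=434)                     # stand-in labels: 1 = D-or-Worse (at risk), 0 = C-or-Better

model = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
y_hat = cross_val_predict(model, X, y, cv=cv)        # out-of-fold predictions for each student
```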

Results: Using only digital event data from the lab and enrichment sites during the first two weeks of the course (during instruction on Chapters 1 and 4), we fit a logistic regression model containing 12 predictors that correctly identified 79% of students who would earn a D or worse on Exam 1 (i.e., sensitivity = 79%; overall accuracy = 66%; see Table 1).
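
Sensitivity and overall accuracy answer different questions: the first is the share of truly at-risk (D-or-Worse) students the model flags, the second the share of all students classified correctly. A minimal, self-contained illustration of the distinction, using hypothetical labels and predictions rather than the study's data:

```python
# Hypothetical labels and predictions (1 = D-or-Worse, the class to flag)
import numpy as np
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=434)      # stand-in true outcomes
y_hat = rng.integers(0, 2, size=434)  # stand-in model predictions

sensitivity = recall_score(y, y_hat, pos_label=1)  # share of at-risk students correctly flagged
accuracy = accuracy_score(y, y_hat)                # share of all students classified correctly
print(f"sensitivity = {sensitivity:.2f}, overall accuracy = {accuracy:.2f}")
```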

Significance: Leveraging and elaborating common spaces where students learn can circumvent instructional design differences across instructors' course sections and enable learning analysts to derive insights, develop tools, and deliver support to all learners (Cogliano et al., 2022). These approaches can make learning analytics-informed support for STEM learning available across a broader set of course designs, lowering barriers for smaller enrollment models and providing more equitable opportunities to learn.
