Paper Summary

Exploring the human factors affecting AI acceptance in decision-making processes among administrators and faculty.

Thu, April 24, 1:45 to 3:15pm MDT, The Colorado Convention Center, Floor: Terrace Level, Bluebird Ballroom Room 3C

Abstract

Faculty and administrators in higher education institutions often serve many roles, with job functions that affect students' academic success and performance. One of these roles involves assessing a student's academic history from a sending institution to evaluate whether listed courses are equivalent to those of the receiving institution. This study investigates how an AI-based platform (anonymous et al., 2023) that recommends equivalent courses from a sending institution to a receiving institution might aid faculty and administrators in transfer credit decision-making. We take a human-centered approach to gain a deeper understanding of faculty and administrators' behaviors, understandings, and the contexts of their decision-making processes around transfer credit approval, and of which factors lead to greater trust in accepting equivalent courses.
Our study follows a mixed methods exploratory sequential design: we begin with interviews that inform the conditions of our survey design, identifying which factors lead to higher acceptance of course equivalence suggestions by manipulating several design components of the platform. We conduct a 2 (type of equivalence) × 5 (human factors) between-subjects study with 300 participants recruited from the State University of New York (SUNY) system, which includes both 2-year and 4-year institutions. The two levels of equivalence type are a precedent-setting articulation and an individual student request. The five levels of human factors are the influences of knowing which other institutions accept the courses (external pressure), knowing the number of students previously approved to receive credit for the courses (internal data), knowing the percentage overlap of course descriptions (explainability), and not knowing the name of the sending institution (blind).
In exploring these conditions, we aim to understand how faculty and administrators balance their own perceptions of equivalencies against their trust in an algorithmic system. Data collection is in progress. We hypothesize that knowing which other institutions have accepted the courses, knowing the number of previous students who were approved to receive credit for the courses, knowing the similarity score of courses between institutions, and not knowing the name of the sending institution will all increase acceptance rates. In addition, we hypothesize that individual student requests will be accepted at higher rates than precedent-setting articulations. We expect greater acceptance of individual student requests when the number of students who have previously received credit is known (internal data), and greater acceptance of precedent-setting articulations when the other institutions that accept the course as equivalent are known (external pressure).

Authors