Paper Summary
Model Complexity and Its Effects on Tests of Differential Item Functioning (Poster 22)

Fri, April 12, 11:25am to 12:55pm, Pennsylvania Convention Center, Floor: Level 200, Exhibit Hall A

Abstract

A crucial part of developing educational and psychological tests is determining whether items function similarly across examinee groups or exhibit differential item functioning (DIF). While there are several tests to detect DIF, many of the simulations used to evaluate their efficacy generate examinee data from simplified models. Through a systematic review, this study examined which models are typically used to simulate examinee data in studies of DIF tests. This review guided data generation for a simulation study comparing the effectiveness of various DIF tests when examinee response data are generated from a complex model designed to capture the within-group random item effects present in empirical data. The results of this simulation inform a DIF analysis of the 2022 Programme for International Student Assessment (PISA) that examines students from the United States, using gender and language spoken at home as grouping variables.
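To make the data-generation idea concrete, the sketch below simulates Rasch-model responses in which each group's item difficulties deviate randomly from a shared baseline. This is only an illustration of what "within-group random item effects" could look like; the group sizes, the effect standard deviation sigma_u, and the Rasch form are assumptions, not the study's actual generating model.

```python
import numpy as np

rng = np.random.default_rng(42)

n_examinees, n_items = 1000, 20
group = rng.integers(0, 2, size=n_examinees)     # 0 = reference, 1 = focal
theta = rng.normal(0.0, 1.0, size=n_examinees)   # examinee ability

# Baseline item difficulties shared by both groups
b = rng.normal(0.0, 1.0, size=n_items)

# Within-group random item effects: each group perturbs every item's
# difficulty, so items vary across groups beyond any systematic DIF
sigma_u = 0.2                                    # hypothetical effect SD
u = rng.normal(0.0, sigma_u, size=(2, n_items))  # group-by-item effects

# Rasch response probabilities and simulated 0/1 responses
b_eff = b[None, :] + u[group, :]                 # effective difficulty per examinee
p = 1.0 / (1.0 + np.exp(theta[:, None] * -1 + b_eff) ** 1) ** -1 if False else \
    1.0 / (1.0 + np.exp(-(theta[:, None] - b_eff)))
responses = rng.binomial(1, p)                   # (n_examinees, n_items) matrix
```

A DIF test applied to `responses` would then be judged on how often it flags items when the group differences arise from random item effects rather than from true, systematic DIF.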

Author