Session Type: Roundtable Session
This roundtable features five studies advancing causal inference and experimental design methodology in education research. The papers introduce a diagnostic approach for testing instrumental variable validity in pretest–posttest designs, a simulation-based framework for power analysis in staggered difference-in-differences models, empirical benchmarks for moderator effects in cluster randomized trials, new power formulas and confidence interval methods for equivalence testing, and practical guidelines for addressing covariate imbalance in small-sample randomized controlled trials. Collectively, these studies strengthen the precision, transparency, and interpretability of causal estimates, and the session highlights innovative strategies that enhance the rigor of policy evaluation and experimental research in educational contexts.
Advancing Causal Inference in Education: Testing Instrument Exogeneity in Pretest–Posttest Designs - Naram Gwak, Seoul National University
Power Analysis for Difference-in-Differences Studies with Staggered Treatment Adoption - Wei Li, University of Florida; Katherine Strickland, University of Florida; Jing Huang, University of Florida; Shuyuan Li, University of Florida
Empirical Estimates of Moderator Effects in Cluster Randomized Trials: Implications for Statistical Power and Study Design - Qi Zhang, American Institutes for Research; Molly E. Cain, American Institutes for Research
Methodological Updates for Equivalence Tests in Randomized Controlled Trials - Zuchao Shen, University of Georgia
Polish the Gold: Calibrating Intervention Effect Estimates for Small-Sample Randomized Controlled Trials - Yixiao Dong, University of California, Santa Barbara; Garrett J. Roberts, University of Denver