Session Summary

Improving the Design of Program Evaluations in Education and Prevention Science

Sun, April 24, 2:30 to 4:00pm PDT, Manchester Grand Hyatt, Floor: 2nd Floor, Seaport Tower, Gaslamp CD

Session Type: Symposium

Abstract

The quality of the study designs used to evaluate interventions is critical to producing rigorous evidence of 'what works'. Multilevel randomized trials are now widely used to evaluate intervention effects in education and prevention science. A statistical power analysis demonstrating adequate power to detect treatment effects, if they exist, is essential. To design evaluations with sufficient statistical power to detect a meaningful effect, researchers must make reasonable assumptions about the design parameters when estimating the required sample sizes. The purpose of this session is to provide estimates of design parameters for planning evaluations that use multilevel randomized trials across several domains, and to provide empirical benchmarks for interpreting effect sizes based on empirical data analysis and meta-analysis.
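To illustrate how design parameters enter a power analysis for a multilevel randomized trial, the sketch below computes the minimum detectable effect size (MDES) for a two-level cluster-randomized trial using the standard variance decomposition (cf. Bloom, 1995; Raudenbush, 1997) with a normal approximation to the t multiplier. The specific numbers (40 clusters, 25 units per cluster, ICC = 0.20) are hypothetical inputs chosen for illustration, not values from the session's papers.

```python
import math
from statistics import NormalDist


def mdes_cluster_rct(n_clusters, cluster_size, icc,
                     power=0.80, alpha=0.05, p_treat=0.5):
    """Approximate minimum detectable effect size (in standard-deviation
    units) for a two-level cluster-randomized trial.

    n_clusters   total number of clusters (level-2 units)
    cluster_size number of individuals per cluster (level-1 units)
    icc          intraclass correlation: share of outcome variance
                 lying between clusters
    p_treat      proportion of clusters assigned to treatment
    """
    nd = NormalDist()
    # Multiplier combining the two-tailed alpha and the target power
    # (normal approximation; with few clusters a t-based multiplier
    # would be slightly larger).
    multiplier = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
    # Variance of the estimated treatment effect in effect-size units:
    # a between-cluster component scaled by the ICC plus a
    # within-cluster component scaled by (1 - ICC).
    denom = p_treat * (1 - p_treat) * n_clusters
    var = icc / denom + (1 - icc) / (denom * cluster_size)
    return multiplier * math.sqrt(var)


# Hypothetical design: 40 schools, 25 students each, ICC = 0.20
print(round(mdes_cluster_rct(40, 25, 0.20), 3))  # → 0.427
```

The example shows why empirical ICC estimates matter: with an ICC of 0.20, this design cannot reliably detect effects smaller than roughly 0.43 SD, so underestimating the ICC at the planning stage would overstate the study's power.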
