Objectives: In 2017–18, the Arkansas Department of Education (ADE) partnered with Solution Tree to launch a large-scale investment in PLC at Work® with the goal of continuous improvement in student achievement (DuFour & Eaker, 2009). Solution Tree expanded its PLC at Work model to provide three years of 50 days of onsite, customized coaching to improve educators’ leadership, collaboration, and constructive use of data. This study estimates the impact of PLC at Work in Arkansas on students’ state English Language Arts (ELA) and math test scores after two years of implementation. The study contributes foundational evidence to the field on the impact of PLC at Work on student achievement.
Framework: This study builds on correlational evidence suggesting that PLCs may improve student achievement (e.g., Lomos et al., 2011; Vescio et al., 2008).
Methods/Data sources: We used a quasi-experimental design eligible to meet What Works Clearinghouse standards with reservations. Using statewide ADE data, we constructed a comparison group with a two-stage matching procedure. First, propensity score matching established baseline (2016–17) equivalency between the 12 Cohort 1 (treatment) schools and comparison schools on school characteristics, including enrollment, demographic composition, average test scores, and the percentage of students eligible for federal programs. Second, within the matched treatment and comparison schools, coarsened exact matching established baseline equivalency between treatment and comparison students on grade level, gender, race/ethnicity, ELA and math state test scores, and eligibility for federal programs. No statistically significant differences remained between treatment and comparison schools or students, and no matching variable had an effect size exceeding |0.25|. Students who left treatment schools after the baseline year remained in the treatment sample. The analytic sample included 2,633 treatment students enrolled in 87 schools and 44,341 comparison students in 293 schools in 2018–19.
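For readers who want to see the mechanics, the sketch below shows one way such a two-stage procedure could be implemented in Python. The data frames (`schools`, `students`), column names, matching ratio, and binning choices are illustrative assumptions; the abstract does not report the software or exact matching specifications used.

```python
# Illustrative two-stage matching sketch (not the study's actual code).
# Assumes hypothetical data frames: `schools` (one row per school, 0/1 `treated` flag)
# and `students` (one row per student, flagged `treated` by baseline school), with
# complete baseline data and the column names below.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

SCHOOL_COVS = ["enrollment", "pct_minority", "avg_ela", "avg_math", "pct_federal"]

# Stage 1: school-level propensity score matching (nearest neighbors on the score).
ps_model = LogisticRegression(max_iter=1000).fit(schools[SCHOOL_COVS], schools["treated"])
schools["pscore"] = ps_model.predict_proba(schools[SCHOOL_COVS])[:, 1]

treated_schools = schools[schools["treated"] == 1]
pool = schools[schools["treated"] == 0]
nn = NearestNeighbors(n_neighbors=5).fit(pool[["pscore"]])
_, idx = nn.kneighbors(treated_schools[["pscore"]])
matched_schools = pd.concat([treated_schools, pool.iloc[pd.unique(idx.ravel())]])

# Stage 2: student-level coarsened exact matching within the matched schools.
stu = students[students["school_id"].isin(matched_schools["school_id"])].copy()
stu["ela_bin"] = pd.qcut(stu["baseline_ela"], 5, labels=False, duplicates="drop")
stu["math_bin"] = pd.qcut(stu["baseline_math"], 5, labels=False, duplicates="drop")
strata = ["grade", "gender", "race_ethnicity", "federal_program", "ela_bin", "math_bin"]
# Keep only strata that contain both treatment and comparison students.
has_both = stu.groupby(strata)["treated"].transform(lambda t: t.nunique() == 2)
analytic_sample = stu[has_both]

# Balance check: standardized mean differences should not exceed |0.25| (WWC guidance).
for var in ["baseline_ela", "baseline_math"]:
    t = analytic_sample.loc[analytic_sample["treated"] == 1, var]
    c = analytic_sample.loc[analytic_sample["treated"] == 0, var]
    print(var, round((t.mean() - c.mean()) / analytic_sample[var].std(), 3))
```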
We estimated impacts on ELA and math test scores with the following multilevel model:
Y_{ij} = \beta_0 + \beta_1\,\text{treat}_{ij} + X_{ij}\beta_2 + \delta_j\beta_3 + u_j + \varepsilon_{ij}
where Y_ij is the 2018–19 ELA or math test score for student i in school j; treat_ij indicates that student i attended a PLC at Work school in 2016–17; X_ij is a vector of baseline student demographic, achievement, and behavioral characteristics; δ_j is a vector of baseline school characteristics, including region, enrollment, demographic composition, and average ELA and math achievement; u_j is a school random effect; and ε_ij is a random error term. The coefficient of interest, β_1, estimates the impact of PLC at Work.
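As an illustration, a random-intercept specification of this kind could be estimated with statsmodels’ MixedLM. The sketch below reuses the hypothetical `analytic_sample` data frame and column names from the matching sketch; these are assumptions, not the study’s actual variables.

```python
# Illustrative random-intercept model for the impact estimate (a sketch, not the study's code).
import statsmodels.formula.api as smf

formula = (
    "test_score ~ treat"
    " + baseline_ela + baseline_math + C(grade) + C(gender) + C(race_ethnicity)"  # student-level X_ij
    " + C(region) + sch_enrollment + sch_pct_minority + sch_avg_ela + sch_avg_math"  # school-level delta_j
)

# u_j enters as a random intercept for the student's 2018-19 school; epsilon_ij is the residual.
result = smf.mixedlm(formula, data=analytic_sample,
                     groups=analytic_sample["school_id"]).fit()
print(result.summary())

# With a standardized outcome (mean 0, s.d. 1), the coefficient on `treat` reads as an
# effect size in standard deviations, i.e., the beta_1 impact estimate.
print(result.params["treat"], result.pvalues["treat"])
```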
Results/Significance: We found positive gains on state math assessments for students in grades 5–10 between the baseline year (2016–17) and Year 2 (2018–19). Students in treatment schools showed 0.067 standard deviations (s.d.) greater growth in math than the comparison group, a statistically significant difference (p = 0.041), and 0.032 s.d. greater growth in ELA, which was not statistically significant (p = 0.330). Subgroup analyses revealed positive, statistically significant impacts in math for several student groups. Future research may leverage implementation data to identify the PLC at Work model components and/or conditions that drive positive results and to examine the sensitivity of the results to adaptations of the Arkansas model.
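The abstract does not say how the subgroup impacts were estimated. One common approach, sketched below with the same hypothetical data and column names as above, is to re-fit the impact model separately within each subgroup; an interaction specification would be an equally plausible alternative.

```python
# Illustrative subgroup analysis: re-fit the impact model within each subgroup (a sketch).
# Assumes `test_score` is standardized so coefficients read as effect sizes in s.d.
import statsmodels.formula.api as smf

subgroup_results = {}
for group, df in analytic_sample.groupby("race_ethnicity"):
    fit = smf.mixedlm("test_score ~ treat + baseline_ela + baseline_math + C(grade)",
                      data=df, groups=df["school_id"]).fit()
    subgroup_results[group] = (fit.params["treat"], fit.pvalues["treat"])

for group, (effect, p) in subgroup_results.items():
    print(f"{group}: impact = {effect:.3f} s.d., p = {p:.3f}")
```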