Paper Summary

Toward a Typology of Questions Explored by Researcher-Practitioner Partnerships

Tue, April 12, 10:35am to 12:05pm, Convention Center, Floor: Level One, Room 101

Abstract

Objectives: We collaborate in a researcher-practitioner partnership to analyze outcomes for English learners (ELs) in Oregon. Drawing on our work, we describe a typology of questions that researcher-practitioner partnerships explore: data validation, information-gathering, and evaluation questions. Data validation provides answers to the question: Is this data accurate? Information-gathering provides answers to descriptive questions such as: How many …? or How long …? Finally, evaluation provides answers to the question: How well did this work? We provide an example of each type of question, share our results, and reflect on affordances and constraints offered by the different types of questions. For data validation, we describe our work verifying data about instructional program models in which ELs participate. For information-gathering, we discuss findings about the time needed for students to exit EL services. Finally, for evaluation, we consider our experience evaluating reclassification policies. We then describe how policymakers made use of findings from these three types of questions.
Theoretical Framework: Coburn, Penuel, and Geil (2013) created a typology for researcher-practitioner partnerships and described challenges resulting from the different incentive structures and timelines under which the two groups operate. However, the terrain of researcher-practitioner partnerships, particularly the types of questions taken up by partnerships, has not yet been fully explored. In our analysis, we build on Gutiérrez and Penuel’s (2014) discussion of relevance to practice as a criterion for rigor in education research.
Methods: Because our work spans three types of questions, it incorporates a variety of methods. To validate data about the instructional program models in which ELs participate, we created tables listing the number of students in each type of instructional program, including different types of bilingual programs, by district, school, grade, and year, and shared these tables with districts to check for accuracy. Information-gathering questions may require only basic descriptive statistics, but some call for more sophisticated approaches: in our analysis of how long it takes ELs to attain English proficiency and exit EL services, we employ discrete-time survival analysis. Finally, evaluation questions typically call for quasi-experimental methods; we use regression discontinuity designs to evaluate the state’s reclassification policies.
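To illustrate the discrete-time survival framing, the sketch below (not the authors’ code; the data layout and variable names are assumptions for illustration) estimates the year-by-year hazard of exiting EL services: for each year t, the proportion of students who exit during year t among those still classified as ELs at the start of that year.

```python
# Hypothetical sketch of a discrete-time survival calculation (illustrative only).
# Each student is summarized as (student_id, last_year_observed_as_EL, exited),
# where exited=True means the student attained proficiency and exited that year,
# and exited=False means the record was censored (e.g., student left the state).

def discrete_time_hazards(records):
    """Return {year: hazard}, where hazard for year t is
    (exits during year t) / (students still at risk entering year t)."""
    max_year = max(last_year for _, last_year, _ in records)
    hazards = {}
    for t in range(1, max_year + 1):
        at_risk = [r for r in records if r[1] >= t]          # still EL entering year t
        exits = [r for r in at_risk if r[1] == t and r[2]]   # exited during year t
        hazards[t] = len(exits) / len(at_risk) if at_risk else 0.0
    return hazards

# Toy data: four students observed for 1-3 years as ELs.
toy = [(1, 2, True), (2, 3, True), (3, 3, False), (4, 1, True)]
print(discrete_time_hazards(toy))  # {1: 0.25, 2: 0.333..., 3: 0.5}
```

In practice the same person-period structure is typically fed to a logistic regression with time indicators and covariates, which is what distinguishes discrete-time survival analysis from a simple life table.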
Data: We use seven years of statewide administrative data for all students enrolled in Oregon public K-12 schools. The longitudinal dataset we have assembled includes 4,008,915 observations for 981,145 students, approximately 20% of whom were ever classified as ELs.
Results: Although evaluation questions carry higher prestige, exploring data validation and information-gathering questions provides crucial, actionable findings for practitioners. For example, our analysis showed that a majority of ELs attained English proficiency within six years. This finding informed the state’s revisions to its Annual Measurable Achievement Objectives: Oregon districts are now rated on whether their ELs are on track to attain proficiency within six years.
Significance: In addition to providing concrete findings about outcomes for ELs in Oregon, our work contributes to broader conversations about directions in researcher-practitioner partnerships, particularly the types of questions these partnerships are best positioned to answer.
