Paper Summary
Understanding School Security Equipment Decision-Making: A Mixed-Methods Process Evaluation Design

Sat, April 11, 1:45 to 3:15pm PDT, InterContinental Los Angeles Downtown, Floor: 5th Floor, Los Feliz

Abstract

Objectives and Theoretical Framework
We examine district and school leader decision-making processes regarding school security equipment (SSE) implementation, guided by logic model frameworks theorizing how SSE inputs produce intended outputs and outcomes. Existing SSE research lacks causal evidence and often does not incorporate stakeholder perspectives; our process evaluation addresses these gaps by centering practitioner voices in understanding SSE decision-making, priority-setting, and goal articulation. Our theoretical foundation draws from logic model frameworks and from research on policy decision-making processes, recognizing that understanding why and how practitioners make SSE investments is essential for interpreting quantitative impact findings and informing evidence-based policy.

Methodological Approach
We employ a qualitative design combining semi-structured interviews with an online open response form alternative to maximize participation. Our two-tiered sampling strategy targets: (1) district-level administrators responsible for SSE decisions from the Virginia Center for School and Campus Safety database, and (2) building-level principals from participating districts. Critically, sampling includes both grant recipients and non-recipients from Virginia's School Security Equipment Grant Program (2022-2023 cycles), enabling comparative analysis across funding outcomes.

To address methodological challenges in practitioner research, we will implement multiple data collection modalities: video-conference interviews ($50 stipend, ~50 participants at each level) and structured online forms using open-ended questions ($25 stipend, up to 200 additional principals). This dual approach addresses time constraints and accessibility barriers while maintaining data quality through parallel question protocols.

Data Sources and Analysis Framework
Data sources include interview transcripts, online form responses, and institutional documents from approximately 50 district administrators and up to 250 school principals across Virginia. We employ systematic thematic analysis using Dedoose, with explicit attention to methodological rigor through: (1) collaborative codebook development with training for data coders, (2) dual-coder reliability assessment using Cohen's Kappa with two researchers independently coding initial interviews to determine final codes, (3) ongoing engagement with study partners to validate findings, and (4) transparent disagreement resolution through researcher collaboration to reach agreement on larger themes and sub-themes before completing remaining data coding. Our analysis will examine: decision-making hierarchies and processes, resource allocation rationales, perceived SSE effectiveness theories, implementation challenges, and intended outcome articulation. This systematic approach enables pattern identification across organizational levels while preserving contextual nuances.
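As an illustration of the dual-coder reliability step described above, the sketch below computes Cohen's Kappa for two coders who have independently labeled the same excerpts. The function itself is standard; the code names and data are hypothetical, not drawn from the study's codebook.

```python
# Minimal sketch of a dual-coder reliability check, assuming each coder
# assigns exactly one code per excerpt from a shared codebook.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's Kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b) and len(coder_a) > 0
    n = len(coder_a)
    # Observed agreement: proportion of excerpts where the coders match
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a = Counter(coder_a)
    counts_b = Counter(coder_b)
    # Chance agreement: sum over codes of P(coder A uses code) * P(coder B uses code)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical example: two coders labeling ten interview excerpts
a = ["budget", "safety", "budget", "process", "safety",
     "budget", "process", "safety", "budget", "process"]
b = ["budget", "safety", "process", "process", "safety",
     "budget", "process", "budget", "budget", "process"]
print(round(cohens_kappa(a, b), 2))  # prints 0.7
```

In practice the computed value would guide whether the coders refine the codebook and re-code before proceeding, consistent with the disagreement-resolution process outlined above.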

Anticipated Contributions and Scholarly Significance
This methodological approach addresses gaps in SSE policy research by leveraging qualitative process evaluation to illuminate policy implementation's "black box." Our design, including the dual-modality data collection, protocols, and codebooks, offers replicable frameworks for studying practitioner decision-making in resource-constrained environments. Specifically, these contributions include: (1) modeling accessible qualitative data collection with hard-to-reach populations, (2) demonstrating process evaluation integration within larger mixed-methods impact studies, and (3) providing frameworks for comparative analysis across policy implementation contexts. By centering practitioner perspectives through rigorous qualitative methods, this research exemplifies how process evaluation can bridge the research-practice gap while generating actionable evidence for policy refinement.
