Paper Summary

Matching Education Researchers with Validated & Usable Digital Learning Platform Infrastructure (Poster 6)

Thu, April 24, 1:45 to 3:15pm MDT, The Colorado Convention Center, Floor: Terrace Level, Bluebird Ballroom Room 2A

Abstract

Overcoming the gap between researchers and education technology providers

Educational researchers spend a substantial amount of time collecting data for research and navigating platforms to set up the technical infrastructure for their studies (Newman, Bavik, Mount, & Shao, 2021). Meanwhile, a growing number of efforts in the United States (e.g., Advancing Innovative Math Solutions (AIMS) and the IES-funded SeerNET, among others) seek to create research platforms for post-hoc and experimental research studies. For these efforts to succeed, it is critical that the infrastructure is described in ways that map to researcher interests and is consistent across efforts. This poster describes a research project to collect requirements from key stakeholders and share the results in a “Digital Learning Platform (DLP) Catalog”.

Codesign Process and Use-Case Analyses

We developed the DLP Catalog through a rigorous co-design process, involving stakeholders from the education community to address diverse research needs. Involving end users in design significantly enhances a tool’s effectiveness (Sanders & Stappers, 2008). We began by creating a set of user personas and corresponding features, adopting an “agile” development framework, and building a prototype of those features.
We recruited a steering committee of data scientists, professors, and educators, collected their feedback, and clustered it into themes. This process revealed common requests, such as advanced search options that align research goals with theoretical frameworks and a wider range of datasets. The committee highlighted the need for options to specify theoretical frameworks, enabling researchers to situate their work within existing literature. We will next conduct focus groups with graduate students and postdoctoral researchers to ensure the catalog remains relevant and user-friendly for early-career researchers.

Focus on Functional Implementation to Address Challenges

The DLP Catalog currently addresses several concerns identified by stakeholders. While researchers may be familiar with software like MATHia or MathNation, navigating different websites and access protocols can be cumbersome. We focused on simplifying this process by aggregating key information essential for starting a study, such as sampling populations, types of data collected, and privacy or security considerations.
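
As a rough illustration of the kind of information aggregated for each platform, the sketch below models a single catalog entry; the field names are hypothetical and do not reflect the catalog's actual schema.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical sketch of one DLP Catalog entry; the fields mirror the
    # categories of information described above (sampling populations, data
    # types, privacy considerations), but the names are illustrative only.
    @dataclass
    class PlatformEntry:
        name: str                           # e.g., "MATHia" or "MathNation"
        sample_population: str              # learner groups or grade bands served
        data_types: List[str] = field(default_factory=list)  # e.g., log data, assessments
        privacy_notes: str = ""             # privacy or security considerations
        supports_experiments: bool = False  # built-in infrastructure for controlled studies
        access_protocol: str = ""           # how researchers request access

    entry = PlatformEntry(
        name="ExamplePlatform",
        sample_population="U.S. middle school mathematics students",
        data_types=["clickstream logs", "assessment scores"],
        privacy_notes="De-identified exports; data sharing agreement required",
        supports_experiments=True,
        access_protocol="Submit an expression of interest through the catalog",
    )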

Repositories such as the National Center for Education Statistics (NCES) or datasets on Kaggle may lack the actionable tools and immediate support needed for experimental research. To assist researchers, the DLP Catalog lists both potential datasets and platforms equipped with built-in experimental capabilities and infrastructure for conducting controlled studies. This reduces the need for researchers to manage complex technical systems, allowing them to focus more on pedagogical and analytical work.
Researchers emphasized a desire to communicate complex needs more directly to data providers. We built a pipeline for interested parties to submit expressions of interest and for platforms to respond to and track these requests in a timely manner. The implementation is open-source and currently available online.
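
A minimal sketch of how such a request pipeline could be modeled is shown below, assuming a simple submit/respond/track cycle; the status values and function names are illustrative and are not drawn from the actual open-source implementation.

    from dataclasses import dataclass
    from enum import Enum

    # Hypothetical expression-of-interest pipeline: a researcher submits a
    # request, the platform responds, and both parties can track its status.
    class RequestStatus(Enum):
        SUBMITTED = "submitted"
        UNDER_REVIEW = "under_review"
        RESPONDED = "responded"
        CLOSED = "closed"

    @dataclass
    class ExpressionOfInterest:
        researcher: str
        platform: str
        research_question: str
        status: RequestStatus = RequestStatus.SUBMITTED

    def respond(request: ExpressionOfInterest, message: str) -> str:
        """Record a platform response and advance the request's status."""
        request.status = RequestStatus.RESPONDED
        return f"{request.platform} -> {request.researcher}: {message}"

    eoi = ExpressionOfInterest(
        researcher="Dr. Example",
        platform="ExamplePlatform",
        research_question="Does spaced practice improve retention in algebra?",
    )
    print(respond(eoi, "Data export available; let's schedule a scoping call."))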

Next Steps

We will update the application based on future stakeholder feedback, adding capabilities such as advanced search filters and sample research questions to further bridge the gap between platform providers and researchers.
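
As an illustration of the kind of advanced filtering under consideration, the sketch below filters the hypothetical catalog entries from the earlier example by data type and experimental support; it is a possible design, not a committed feature.

    # Hypothetical advanced-search filter over catalog entries (reusing the
    # PlatformEntry sketch above); the criteria names are illustrative only.
    def search_entries(entries, data_type=None, needs_experiments=False):
        matches = []
        for entry in entries:
            if data_type and data_type not in entry.data_types:
                continue
            if needs_experiments and not entry.supports_experiments:
                continue
            matches.append(entry)
        return matches

    # e.g., platforms with clickstream logs that support embedded experiments:
    # search_entries([entry], data_type="clickstream logs", needs_experiments=True)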
