Key Takeaways from SREE 2024 on Effective Education Practices
ExploreLearning's Senior Researcher, Dr. Megan Conrad, shares her takeaways from the 2024 SREE Conference.
Can educational research be both rigorous and relevant?
I recently attended the annual conference of the Society for Research on Educational Effectiveness (SREE), an interdisciplinary association “dedicated to advancing the generation and use of effectiveness research to solve pressing challenges in education.” This year’s conference theme, “Navigating the Future of Education Research: Impact Evaluation in a Transforming Landscape,” invited education researchers to grapple with the challenges that arise when the education landscape changes rapidly.
The impact of COVID-19 and AI on educational research
Virtual instruction during COVID-19 fundamentally changed the way educators engage with technology. In just the past year, generative AI has become the “topic du jour,” with nearly every webinar or education-related conference including some reference to AI’s potential to completely reshape and reimagine education, for better or for worse.
Challenges of keeping research relevant in a rapidly changing landscape
Unfortunately for all involved, research has struggled to keep pace. “Slow” research can be a sign of careful, well-thought-out work. However, it can also fail to meet the needs of practitioners, who need quick access to data and insights to make informed decisions to support their students. Educators and decision-makers often don’t know where to go for answers to questions like “What programs will work best for my student population?” and so they default to incomplete sources of evidence, such as teacher testimonials or their own lived experience.
Balancing qualitative feedback with rigorous scientific evaluation
Without discounting the valuable role of qualitative feedback for specific questions (e.g., Will my students enjoy this program? Is it easy to implement? Does it meet my classroom teaching goals?), program efficacy is best addressed using the scientific method and rigorous statistical evaluations. So, how does the field move forward? How do we balance the need for methodological rigor with the demand for responsive research that accurately reflects the rapidly evolving technological landscape of today’s schools?
Key takeaways from SREE: Bridging research and practice
Many of the panels and talks I attended at this year’s SREE reflected on this topic, trying to find ways to bridge the divergent perspectives of academic researchers, non-profit research institutes, decision-makers within school systems, and funding and policy organizations. Below are some of the common themes and takeaways I found particularly interesting.
The importance of connecting research to practice
In a session titled “From Evidence to Action: Navigating the Dual Challenges of Rigor and Relevance in Educational Research,” speakers and attendees discussed the need for research that is relevant and useful to practitioners. One way to prioritize the research questions and assessment outcomes most relevant to schools and districts right now is to co-create studies with practitioners, involving them in the research design process or adding them to organizational “advisory boards.”
The role of instructional design in edtech effectiveness
Outcomes data alone cannot be relied on without a deep understanding of the instructional features and pedagogical theories behind a program’s design. Multiple speakers emphasized that it is the instructional design, rather than the technology itself, that drives positive outcomes.
Research on adaptive technology that focuses on only one group of students or one application may not generalize to all students. For instance, adaptive programs that focus solely on remediation can potentially hinder students’ progress.
The need for nuanced understandings of efficacy
For academic researchers, the “gold standard” of research methodologies for determining program efficacy has been causal evidence from randomized controlled trials (RCTs). This is often at odds with the needs of school systems, which struggle with equity issues related to withholding a potentially impactful program from students just because they were not randomly selected.
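To make that contrast concrete for readers less familiar with the methodology, the sketch below shows the core comparison an RCT produces, assuming a simple two-arm design with a single post-test score per student. This is a hypothetical illustration, not an analysis from any study discussed at the conference; the group sizes, scores, and effect-size summary are invented.

```python
# Hypothetical sketch of the core RCT comparison: students are randomly assigned
# to a program ("treatment") or business-as-usual ("control"), and the difference
# in mean post-test scores is summarized as a standardized effect size.
# All numbers below are invented for illustration.

import random
import statistics


def cohens_d(treatment: list[float], control: list[float]) -> float:
    """Standardized mean difference (Cohen's d) using a pooled standard deviation."""
    n_t, n_c = len(treatment), len(control)
    var_t, var_c = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd


if __name__ == "__main__":
    random.seed(0)
    students = list(range(60))
    random.shuffle(students)  # random assignment is what supports a causal interpretation
    treatment_group, control_group = students[:30], students[30:]

    # Invented post-test scores; in a real study these would come from assessments.
    treatment_scores = [random.gauss(75, 10) for _ in treatment_group]
    control_scores = [random.gauss(70, 10) for _ in control_group]

    print(f"Treatment mean: {statistics.mean(treatment_scores):.1f}")
    print(f"Control mean:   {statistics.mean(control_scores):.1f}")
    print(f"Effect size (Cohen's d): {cohens_d(treatment_scores, control_scores):.2f}")
```

The random-assignment step is what allows the difference in means to be read causally, and it is also the step that raises the equity concern districts voice about withholding a program from the control group.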
In a session titled “Defining the Efficacy of Education Technology Products: From Prototyping to Scaling Up,” the Senior Research Analyst for the Detroit Public Schools Community District said, “If your intervention is worth doing, it is worth doing it for all students.” Other research questions that may matter more from a district perspective include:
- Implementation fidelity: A program might be great in theory, but if implementation fidelity is low, its promised improvements will not be observed. Traditional RCTs don’t always account for the situational factors that may have impeded implementation, and focused studies of what leads to high implementation fidelity can be just as important for a district. Incorporating an understanding of context means asking research questions like: Is this a tool that can be implemented in our classrooms with our students, particularly those students who need the most support?
- Feasibility and use: Teachers may have already started using a product because of a district-wide decision to adopt it. In that case, the research findings they are most interested in address what happens when the program is implemented one way versus another.
- Cost-effectiveness: The return on investment (ROI) of edtech is a crucial factor, particularly in the context of post-ESSER funding. The Deputy State Superintendent for North Carolina described an ROI calculator for district leaders that quantifies outcomes and weights implementation fidelity; it has fundamentally shifted procurement decision-making and made quantifiable, measurable outcomes a central piece of the conversation (a minimal, hypothetical sketch of this kind of calculation follows this list).
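The session did not detail how North Carolina’s calculator works, so the sketch below is only a hypothetical illustration of the general idea: an index that puts per-student cost, a measured outcome gain, and implementation fidelity into a single comparable number. The field names, weighting scheme, and figures are assumptions made up for this example.

```python
# Hypothetical sketch of a fidelity-weighted ROI index for comparing edtech programs.
# This is NOT the North Carolina calculator; the fields, weighting, and numbers
# are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class ProgramEvaluation:
    name: str
    cost_per_student: float         # annual license and training cost, in dollars
    outcome_gain: float             # e.g., effect size or score-point gain from studies
    implementation_fidelity: float  # 0.0-1.0, share of intended usage actually delivered


def fidelity_weighted_roi(ev: ProgramEvaluation) -> float:
    """Fidelity-adjusted outcome gain per dollar spent: higher means more measured
    benefit per dollar after discounting for how faithfully the program was used."""
    return (ev.outcome_gain * ev.implementation_fidelity) / ev.cost_per_student


if __name__ == "__main__":
    candidates = [
        ProgramEvaluation("Program A", cost_per_student=40.0, outcome_gain=0.25,
                          implementation_fidelity=0.9),
        ProgramEvaluation("Program B", cost_per_student=25.0, outcome_gain=0.30,
                          implementation_fidelity=0.5),
    ]
    # Rank candidates by the index, highest first.
    for ev in sorted(candidates, key=fidelity_weighted_roi, reverse=True):
        print(f"{ev.name}: ROI index = {fidelity_weighted_roi(ev):.4f}")
```

However the real calculator is built, the point the session emphasized is the same: making cost, measured outcomes, and fidelity explicit in one number moves procurement conversations toward quantifiable evidence.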
Insights from a longitudinal study on Frax effectiveness
In my own talk at SREE this year, I shared how we partnered with a school district for a 2-year longitudinal study of our Frax program. This partnership was vital to exploring the effectiveness of the program in real classrooms, including all of the potential hiccups that can go along with implementing a brand-new technology program.
Our partnership uncovered a number of important findings, including the benefit of Frax Sector 1 for both grade-level content and remediation; the ability of Frax to help students struggling in math move toward grade-level standards; and the continued benefits of Frax for math achievement over a year after using the program.
Bridging edtech research and school leadership for better outcomes
I appreciated the number of talks at SREE this year dedicated to bringing together the voices of the edtech research community and the school leaders who decide which programs to use in their communities. The need for practical, context-specific, and nuanced approaches to educational research and evaluation is an important and relevant topic, given the potential of edtech to alter the academic experience of millions of students and shape their future career interests and opportunities. I was excited to spend time with this community and look forward to next year in Chicago!
Interested in exploring the difference Frax can make for your students? Take a free trial today!
About the Author
ExploreLearning Senior Researcher Megan Conrad, Ph.D.
Dr. Megan Conrad, Senior Researcher for ExploreLearning, shares her insights in our Measuring Impact Series. In her current role, she works with district administrators, curriculum coordinators, and teachers to uncover evidence of student success from product usage and helps districts make evidence-based decisions regarding product implementation.
She earned her Ph.D. in Psychology (specializing in Developmental and Cognitive Science) from Rutgers University and has 15 years of experience designing and conducting research in child development, learning environments, and youth & technology. She previously worked in higher education as an Assistant Professor of Psychology and founded a research lab that explored children’s STEM learning in informal learning settings.