“1.1 million students used the program across the US last year.”
“70% of students in the intervention condition improved in test scores.”
“This is the best product ever!” – Teacher from District A
Understanding impact: Key metrics and data sources
When you are looking to adopt an edtech program to improve student outcomes, you may see statistics and quotes like these highlighted on a product’s website as evidence of program effectiveness. There are a number of different ways that programs can demonstrate they are making a real, tangible difference for the students and teachers using them.
Depending on your own goals, some outcomes and data sources may be more useful than others for measuring impact. The following are some things to look for in a well-designed program:
Capturing user experiences and feedback with qualitative data
Qualitative Data
- Qualitative or descriptive data may come from surveys, interviews, and case studies. These sources offer rich, in-depth information that can capture user experiences and sentiments in more detail than a single numerical data point.
- Qualitative data analysis involves organizing, coding, interpreting, and synthesizing data to identify patterns, themes, and meanings (a simple illustration follows this list).
- A well-designed program will conduct frequent qualitative engagements with a diverse range of users to learn more about how and why the product is being used in the classroom and the outcomes that students and teachers are experiencing as a result.
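To make the idea of qualitative coding a little more concrete, here is a minimal sketch in Python. It is purely illustrative, not ExploreLearning’s analysis workflow; the theme labels and responses are invented for demonstration.

```python
# Illustrative sketch only: tallying researcher-assigned theme codes from
# hypothetical open-ended survey responses to surface recurring patterns.
from collections import Counter

# Hypothetical coded responses: each open-ended answer has already been
# tagged by a researcher with one or more theme labels (all invented here).
coded_responses = [
    ["engagement", "confidence"],
    ["ease_of_use"],
    ["engagement", "time_savings"],
    ["confidence", "engagement"],
]

# Count how often each theme appears across all responses.
theme_counts = Counter(theme for codes in coded_responses for theme in codes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```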
Measuring reach and engagement with quantitative data
Quantitative Research
- Qualitative research can be great for rich and genuine feedback, but it may not tell the full story. Edtech programs should also engage in quantitative or numerical impact research using high-quality and rigorous statistical analyses.
- At a very basic level, quantitative research in education can tell us about reach (the number of people using the program) and engagement (the amount of usage), which can be important for understanding a product’s potential for supporting all kinds of students.
- Well-designed studies with statistical hypothesis testing (e.g., comparing academic gains in users vs. non-users) are critically important for demonstrating the learning gains and academic achievement associated with product usage (see the illustrative sketch after this list).
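As a rough illustration of what a user vs. non-user comparison can look like, here is a minimal sketch in Python. It is a demonstration under stated assumptions, not ExploreLearning’s actual analysis: the score gains are invented and a conventional 0.05 significance threshold is assumed.

```python
# Illustrative sketch only: comparing assessment score gains for program
# users vs. non-users with a two-sample t-test. All numbers are invented.
from scipy import stats

# Hypothetical score gains (post-test minus pre-test) for each group.
user_gains = [12.5, 9.0, 15.2, 11.8, 10.4, 14.1, 8.7, 13.3]
nonuser_gains = [6.1, 7.4, 5.9, 9.2, 4.8, 7.0, 6.5, 8.1]

# Welch's t-test: is the mean gain different between the two groups?
t_stat, p_value = stats.ttest_ind(user_gains, nonuser_gains, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # conventional significance threshold (an assumption here)
    print("The difference in gains is statistically significant.")
else:
    print("No statistically significant difference was detected.")
```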
Comparative analyses: Benchmarking against industry standards
Comparative Analyses
- Both quantitative and qualitative research can take advantage of comparative analyses, benchmarking study findings against industry standards or other similar initiatives to provide even more context for evaluating a product’s real-world impact.
- Longitudinal studies, conducted via repeated engagements with the same students or teachers, can also use control groups, providing information about change over time for those who used the product vs. those who did not.
- For instance, a recent ESSA Tier 2 study of Frax found that the program was as impactful as a leading competitor product but in ¼ the time. When we compared the gains produced by Frax to other interventions, the standardized effect sizes revealed that Frax was 3x more effective than the average educational intervention for 3rd graders and 5x more effective for 4th graders (a simplified effect-size calculation is sketched below).
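For readers curious about what a “standardized effect size” means in practice, below is a minimal Python sketch of Cohen’s d, one common standardized effect size. The group means, standard deviations, and sample sizes are hypothetical and are not drawn from the Frax study.

```python
# Illustrative sketch only: Cohen's d, a common standardized effect size,
# computed from hypothetical summary statistics (not Frax study data).
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    return (mean_a - mean_b) / math.sqrt(pooled_var)

# Hypothetical intervention group vs. comparison group.
d = cohens_d(mean_a=78.0, mean_b=71.0, sd_a=10.0, sd_b=11.0, n_a=120, n_b=115)
print(f"Cohen's d = {d:.2f}")

# The resulting effect size can then be benchmarked against averages
# reported for similar interventions (e.g., published meta-analytic norms).
```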
Feedback loops and iteration for continuous improvement
Feedback Loops and Iteration
- Impact research should not be an isolated endeavor undertaken merely to check a box. Well-designed programs continually engage in a feedback loop of product improvement and research. Longitudinal research helps confirm that the outcomes observed reflect real, lasting change and improvement, even years after exposure.
- ExploreLearning engages a user feedback group called the Collab Crew to continually gather feedback from real users and improve our program based on insights gained.
Examples of high-quality research by ExploreLearning
Educators often wonder how to measure the impact of a program. Educational research, specifically edtech research, is the answer. High-quality research brings the effectiveness of a program into focus.
Take a look at these examples of high-quality research studies conducted by ExploreLearning’s Research team to quantify the impact of our products on teachers and students:
Academic Performance
Gizmos and STEM Cases are virtual simulations designed to help teachers implement applied, real-world activities that allow students to develop their critical thinking skills and boost science proficiency. A recent quantitative study in a diverse school district found that students who used Gizmos more frequently were more likely to meet or exceed proficiency on an NGSS-aligned assessment compared to students who used Gizmos less frequently.
A study found that students in schools using Reflex increased their math proficiency on state assessments compared to students in schools that did not use the program, showing that targeted support can lead to meaningful gains in learning.
Learning Gains
Research examined the impact of high-fidelity Reflex usage on students’ math fact fluency and achievement on NWEA MAP Growth assessments. Students who achieved the “Green Light” in the majority of usage sessions were significantly more likely to reach 100% fluency levels and more likely to meet or exceed individual growth goals.
New research conducted by ExploreLearning found that elementary students using Frax (an adaptive, game-based program that helps students build mastery of fractions) showed significantly larger academic growth in math compared to non-users.
Student Engagement
One survey of teachers using Frax for the first time found that nearly all (99.7%) said they saw improvement in student learning and engagement because of Frax. The improvements teachers observed most frequently included increased enjoyment in math learning and increased confidence in math abilities.
In an end-of-year study, elementary school teachers shared overwhelmingly favorable feedback on the benefits of using Reflex in their classrooms.
Teacher Support
ExploreLearning regularly conducts surveys, interviews, and focus groups with teachers using our products. In addition to student outcomes, our products also aim to support teachers’ classroom needs (e.g., saving time, streamlining prep, and ease of use). A survey administered to teachers awarded Science Success Grants to trial Gizmos found that 81% of teachers agreed that Gizmos supported opportunities for phenomena-based inquiry learning, and 83% agreed that Gizmos supported the achievement of their school and/or department goals. Another recent qualitative survey found that teachers often noted that a key benefit of using Gizmos in their classroom was the support the simulations provide for their pedagogical needs (e.g., adaptable materials aligned to curriculum standards).
Are you ready for more? Take a deep dive into the research behind ExploreLearning math solutions and science solutions.
About the Author
ExploreLearning Senior Researcher Megan Conrad, Ph.D.
Dr. Megan Conrad, Senior Researcher for ExploreLearning, shares her insights in our Measuring Impact Series. In her current role, she works with district administrators, curriculum coordinators, and teachers to uncover evidence of student success from product usage and helps districts make evidence-based decisions regarding product implementation.
She earned her Ph.D. in Psychology (specializing in Developmental and Cognitive Science) from Rutgers University and has 15 years of experience designing and conducting research in child development, learning environments, and youth & technology. She previously worked in higher education as an Assistant Professor of Psychology and founded a research lab that explored children’s STEM learning in informal learning settings.