Understanding the Varied Effects of Mathematics Interventions Through Meta-Analysis
Mathematics is foundational to learning and employment in other technical disciplines, such as science and engineering. Yet U.S. students' math achievement often lags behind that of students in many other industrialized nations. Math education researchers and reformers have tackled this problem for decades through new curricula, instructional approaches, and out-of-school-time programs.
Researchers at AIR’s Methods for Synthesis and Integration Center (MOSAIC) conducted the largest-ever quantitative synthesis of these math interventions’ effects on U.S. preK-12 student math achievement, aiming to understand what works and in what contexts. This meta-analytic research was published online in the Journal of Research on Educational Effectiveness in January 2022.
The Synthesis
The Synthesis Included 191 Experimental Studies Spanning More Than 25 Years
The synthesis included studies that randomly assigned students either to receive a mathematics intervention (e.g., a new curriculum) or to continue with “business as usual” instruction. Random assignment is the gold standard for ensuring that the groups of students are comparable before the intervention begins.
The team found 191 studies meeting inclusion criteria, including more than 250,000 U.S. preK-12 students and spanning more than 25 years of educational research. An interactive application hosted on MOSAIC’s website allows users to explore the evidence and conduct their own analyses.
Key Findings
The Interventions Improved Math Achievement Overall, but Effects Varied Widely
- Overall, the studied research-based interventions were effective at improving student mathematics achievement. In the settings tested, a randomly chosen intervention yielded better outcomes than standard instruction in 75% of cases.
- However, these intervention effects varied widely, with typical effects ranging from strongly positive to moderately negative.
- One factor explaining these varied effects was who delivered the intervention. Interventions delivered by teachers or other professionals yielded larger average effects than interventions delivered by technology (e.g., computer-delivered instruction).
- Another explanatory factor was the outcome measure type. Standardized tests demonstrated much weaker average intervention effects than measures developed by the study authors or other educational researchers.
- Analyses suggested other potentially important factors, such as intervention length and student grade level. Still, much of the variation in intervention effects remained unexplained.
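To see how a "75% of cases" figure can arise from a distribution of intervention effects, here is a minimal sketch under a normal random-effects model: if true effects are normally distributed with mean mu and between-study standard deviation tau, the share of interventions with a positive effect is Phi(mu / tau). The mu and tau values below are illustrative assumptions, not numbers taken from the study.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def prob_positive_effect(mu, tau):
    """Share of true effects above zero under a normal
    random-effects model: P(effect > 0) = Phi(mu / tau)."""
    return phi(mu / tau)

# Illustrative (assumed) values: mean effect of 0.20 SD,
# between-study standard deviation of 0.30 SD.
p = prob_positive_effect(0.20, 0.30)
print(f"Share of interventions beating control: {p:.2f}")
```

With these assumed inputs the computed share is close to 0.75, illustrating how an overall positive average can coexist with wide variation that includes some negative effects.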
Practitioners Need to Know What Will Work for Them and Their Students
These findings underscore the importance of knowing which math interventions work, for whom, and under what conditions. The overall results show that research-based interventions are promising, but teachers and practitioners must consider their students' context and setting. The weaker average effects for technology-delivered interventions are especially notable as schools seek ways to address pandemic-related learning loss.