Meet the Expert: Markus Broer

Illustration of Markus Broer

Markus Broer provides expert advice and research support to the National Assessment of Educational Progress (NAEP), which benchmarks the educational achievements of K-12 students in the United States. Previously, he provided technical assistance for student assessments on a variety of AIR’s international projects.

POSITION: Managing Researcher

AREAS OF EXPERTISE: Large-scale educational assessments, Education research, Psychometrics

YEARS OF EXPERIENCE: 20+
 

Q: NAEP is known as "the Nation’s Report Card." What does it measure, and how have the results been used?

Markus: NAEP evaluates students’ knowledge and skills in mathematics, reading, science, and other subject areas. The goal is not to provide feedback on individual students’ skills or knowledge, but rather to get a sense of how U.S. students are doing overall, as well as how subgroups of students are doing. A sample of students from different schools and districts take the test, and their combined results give us a sense of average achievement levels and how scores evolve over time. The samples can be very large for grade 4 and 8 math and reading assessments since they are representative at both the national and state levels—between 140,000 and 150,000 students per subject area/grade level pair (for example, grade 4 mathematics).  
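
Conceptually, this boils down to taking weighted averages over the sampled students, with each student's sampling weight reflecting how many students in the population they represent. The sketch below is a toy illustration with simulated scores, weights, and state labels; it is not NAEP's actual estimation machinery, which relies on plausible values and replicate weights.

```python
import numpy as np

# Toy illustration: estimate average scale scores from a weighted sample.
# All numbers below are simulated, not real NAEP data.
rng = np.random.default_rng(0)
n = 150_000                                      # roughly the sample size mentioned above
scores = rng.normal(loc=240, scale=30, size=n)   # NAEP-like scale scores
weights = rng.uniform(1.0, 5.0, size=n)          # made-up sampling weights
states = rng.choice(["CA", "NY", "TX"], size=n)  # made-up state labels

# Weighted mean of sampled scores = estimate of the national average
print("National:", round(np.average(scores, weights=weights), 1))

# The same calculation within each state gives state-level estimates
for state in ["CA", "NY", "TX"]:
    mask = states == state
    print(state, round(np.average(scores[mask], weights=weights[mask]), 1))
```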

States set their own standards and administer their own state assessments, meaning that state results cannot be compared with one another. With NAEP, one can compare state performance across the entire United States, allowing an "apples-to-apples" comparison. As a result, educational policymakers and the public follow the results very closely.
 

Q: Your team provides research support to the NAEP program. What does that involve?

Markus: We provide support across a number of areas, but I’d like to focus on one: validity. Validity refers to how accurately a test or questionnaire reflects or assesses the concept that it intends to measure. Validity is something that assessments continuously strive for; it’s not a box that you can tick off once and then be done with. The ongoing validity of the assessment is essential to getting results that educators and policymakers can rely on. As the assessments are constantly evolving, validity issues also need to be continually reviewed and investigated for NAEP to maintain its status as the “gold standard” in educational assessment.

AIR supports this continuous review through its management of an independent panel of experts, the NAEP Validity Studies Panel, and through research studies conducted by an AIR team that I lead. For example, our NAEP research team worked on validity research connected to the transition of NAEP from a paper-and-pencil format to a digital format. Prior to that, we also examined how students’ access to and familiarity with technological devices, such as computers, related to performance on three early NAEP digitally based assessments. We also conducted a series of studies focused on the "predictive validity" of NAEP, i.e., how its assessment results are related to future outcomes. For instance, we conducted a study that looked at the relationship between results from the NAEP grade 12 mathematics test and postsecondary enrollment and outcomes.
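
To make the idea of predictive validity concrete, here is a minimal, hypothetical sketch: it simulates grade 12 math scores and a later enrollment indicator, then checks whether the score predicts the outcome. The simulated data and the simple logistic regression are illustrative assumptions only, not the design or findings of the study mentioned above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated data: higher math scores raise the probability of enrollment.
rng = np.random.default_rng(1)
n = 10_000
math_score = rng.normal(150, 30, n)                       # NAEP-like scale scores
p_enroll = 1 / (1 + np.exp(-(math_score - 150) / 20))     # built-in positive relationship
enrolled = rng.binomial(1, p_enroll)                      # 1 = enrolled, 0 = not enrolled

# Does the assessment score predict the later outcome?
model = LogisticRegression(max_iter=1000).fit(math_score.reshape(-1, 1), enrolled)
print(f"Estimated log-odds change per scale-score point: {model.coef_[0][0]:.3f}")
# A clearly positive coefficient is one piece of evidence in a
# predictive-validity argument: scores relate to future outcomes.
```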

Equity is an important topic in validity. This involves both ensuring fair measurement and using NAEP to document and track inequitable outcomes. For example, we studied trends in achievement gaps between high- and low-SES students between 2003 and 2017, at both the national and state levels.
 

Q: What major trends does NAEP show in the outcomes of American students?

Markus: For quite some time, we have observed the widening of score gaps. For example, scores at the 90th percentile of the NAEP score distribution and those at the 10th percentile have been diverging over time, and that gap has grown more pronounced over the last decade or so. National Center for Education Statistics and AIR researchers have conducted an in-depth analysis of this divergence, drawing not only on NAEP data but also on international assessments that include U.S. data.
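
As a rough illustration of how that spread is quantified, the sketch below computes the gap between the 90th and 10th percentiles of simulated score distributions across a few assessment years. The widening in the output is built into the simulation; it shows only the calculation, not actual NAEP trends.

```python
import numpy as np

# Toy illustration of the 90th/10th percentile gap over time (simulated data).
rng = np.random.default_rng(2)
for i, year in enumerate([2009, 2013, 2017, 2022]):   # illustrative assessment years
    # Widen the spread a little each year purely for demonstration.
    scores = rng.normal(loc=240, scale=30 + 3 * i, size=50_000)
    p10, p90 = np.percentile(scores, [10, 90])
    print(f"{year}: P10={p10:.0f}  P90={p90:.0f}  gap={p90 - p10:.0f}")
```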

Even before the pandemic, NAEP performance was already stagnating or declining overall. In many grades and subject areas, a large share of students perform below NAEP’s Basic achievement level, which represents partial mastery of knowledge and skills.
 

Q: How do you think policymakers and stakeholders will react to the latest NAEP results?

Markus: I hope the NAEP results will act as a wake-up call and that the 2022 results will spur policymakers to develop multi-year plans to recover from the pandemic learning lags. NAEP will also play an important role in tracking that eventual recovery over time. NAEP is a cross-sectional assessment, meaning it does not follow a particular cohort of students as they progress through school. Instead, for example, it consistently tracks how 4th graders are doing in mathematics.

If, as I hope, counteracting measures are implemented to combat pandemic learning loss, we will be able to observe their effects in the contrasting performance of successive groups of 4th graders over time. I think it is unlikely, however, that the current declines can be fully reversed in the next two to four years with a “continue-as-usual” or “one-off intervention” approach.

The NAEP 2022 results showed us how important school-based learning is, especially in the early years. One of our NAEP researchers, Jasmine Park, led an excellent study (to be released as a working paper soon) showing the steep growth in reading skills that normally happens between kindergarten and 1st grade for kids who go on to become proficient readers. If pandemic learning disruptions blocked that growth, how can we help students make it up in the following years?

I hope the pandemic-era drop in NAEP scores across almost every student group will start a bigger debate about how to improve student learning outcomes. Even before COVID, the U.S. education system was not working well for a large share of students: For example, in grade 4 reading, 34% of students performed below NAEP's Basic level in 2019 and that figure rose to 37% in 2022. That percentage was even higher for disadvantaged subgroups.
 

Q: Where can we find you on a typical Saturday?

Markus: On Saturday morning, I take my twins to German school. In the afternoon we often go to a nearby park. And if the opportunity presents itself, I will try to take a nap.
 

Q: What book would you suggest everyone read?

Markus: One excellent book related to my field is “Our Kids: The American Dream in Crisis” by Robert Putnam. His work inspired some of our analysis on socioeconomic status. We also replicated some of his work with newer data sets, and we found similar, depressing results on college access: that low-performing, wealthier students are likelier to attend college than their high-performing, economically disadvantaged counterparts.

I also recommend “The Black Swan: The Impact of the Highly Improbable,” by Nassim Nicholas Taleb. It describes events that are unpredictable but carry a huge impact. There seem to be a lot of those lately—the pandemic, war.