Study assesses modified evaluation method for students placed at risk

Center for Research on Children, Youth, Families and Schools

Chuck Green, December 3, 2021

HyeonJin Yoon, research assistant professor, Nebraska Academy for Methodology, Analytics and Psychometrics, is assessing the validity of a new evaluation method for targeted educational interventions for students placed at risk.

Regression discontinuity design (RDD) is an evaluation method that assesses the impact of a need-based, targeted intervention. It relies on a cutoff point, or threshold, on an assignment measure that typically gauges participants’ need; the intervention is assigned to those who fall on one side of the cutoff. By comparing post-test scores of participants just above and just below the cutoff, researchers can identify the intervention’s impact on a given outcome.

For example, in schools, students could be invited to an after-school reading program based on a reading score cutoff. Because students who score just above and just below the cutoff are considered statistically comparable, regression discontinuity design can measure the difference in outcomes between those two groups within the narrow band around the cutoff.
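For readers who want to see the mechanics, the short Python sketch below simulates a setup like this and estimates the effect at the cutoff with a simple local linear regression. The cut score, bandwidth and simulated scores are illustrative assumptions, not data or parameters from Yoon’s study.

    # Minimal sharp RDD sketch on simulated data (hypothetical values, not study data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    cutoff = 30                      # hypothetical cut score on the assignment measure
    score = rng.uniform(0, 60, n)    # assignment measure (e.g., a reading screener)

    # Students scoring below the cutoff are invited to the intervention.
    treated = (score < cutoff).astype(int)

    # Simulated post-test: smooth in the assignment score plus a true effect of 5 points.
    posttest = 40 + 0.5 * score + 5 * treated + rng.normal(0, 5, n)

    # Local linear regression within a bandwidth around the cutoff.
    bandwidth = 5
    near = np.abs(score - cutoff) <= bandwidth
    centered = score[near] - cutoff
    X = sm.add_constant(np.column_stack([treated[near], centered, treated[near] * centered]))
    fit = sm.OLS(posttest[near], X).fit()

    print(f"Estimated effect at the cutoff: {fit.params[1]:.2f}")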

Yoon notes that although regression discontinuity design is effective, it has its limitations.

“For example, when using RDD, you could determine the intervention impact on students who score just around the cut score of 30, but not on students who score far below, such as 10,” she said. “Therefore, we don’t know whether the intervention has an impact on those students who are most in need.”

For this study, which is funded by an Office of Research and Economic Development Layman Seed Grant, Yoon is using evidence-based kindergarten math intervention data collected in 2016 and 2018 from Massachusetts and Oregon. She is working to demonstrate the application, analysis and interpretation of regression discontinuity design with covariate matching (RDD-CM).

By matching covariates (independent variables that can influence the outcome of a statistical trial), Yoon aims to balance the distribution of covariates between the treatment and control groups. She will add covariate measures such as pretest scores and demographic information, data that schools already collect.

“That’s an advantage of this method,” said Yoon, a CYFS research affiliate. “The covariates are already there. We just need to include them in the analysis.”
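The sketch below illustrates the general idea of covariate matching on simulated data: each treated student is paired with the control student whose pretest score and age are closest. The data, column names and one-to-one matching rule are assumptions for illustration and a simplification, not a reproduction of Yoon’s RDD-CM procedure.

    # Illustrative covariate matching sketch (hypothetical data and column names).
    import numpy as np
    import pandas as pd
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(1)
    n = 1000
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),
        "pretest": rng.normal(50, 10, n),    # covariates schools already collect
        "age_mos": rng.normal(66, 4, n),
        "outcome": rng.normal(55, 12, n),
    })

    covs = ["pretest", "age_mos"]
    z = (df[covs] - df[covs].mean()) / df[covs].std()   # standardize covariates

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # For each treated student, find the control student with the closest covariates.
    nn = NearestNeighbors(n_neighbors=1).fit(z.loc[control.index])
    _, idx = nn.kneighbors(z.loc[treated.index])
    matched_control = control.iloc[idx.ravel()]

    # Simple matched-pair difference in outcomes.
    effect = (treated["outcome"].values - matched_control["outcome"].values).mean()
    print(f"Matched difference in outcomes: {effect:.2f}")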

Matching covariates will help identify causal effects beyond the treatment cutoff and will enable Yoon to assess the extent to which the method generates precise, unbiased estimates comparable to those from a randomized controlled trial (RCT) design.
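One common way to express such a comparison is bias relative to the RCT benchmark and the relative precision of the two estimates. The snippet below shows the arithmetic with placeholder numbers; the values are made up for illustration and are not study results.

    # Hypothetical comparison of a quasi-experimental estimate to an RCT benchmark.
    rct_estimate, rct_se = 4.8, 0.6          # benchmark from a randomized design (made up)
    rdd_cm_estimate, rdd_cm_se = 5.1, 0.9    # estimate from the matched RDD (made up)

    bias = rdd_cm_estimate - rct_estimate
    relative_precision = (rct_se / rdd_cm_se) ** 2   # precision of RDD-CM relative to RCT

    print(f"Bias relative to the RCT benchmark: {bias:+.2f}")
    print(f"Precision of RDD-CM relative to RCT: {relative_precision:.2f}")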

In education settings, students with or at risk for learning difficulties in reading or math are invited to targeted interventions based on their standardized test scores. Because classrooms are not research settings, children are not randomly assigned to either the intervention or control condition.

“When you use RCT for program evaluation, you randomize students,” Yoon said. “But with randomization, students assigned to a control group don’t receive the intervention. If the RDD-CM method is found to be methodologically valid, and works as well as RCT, that solves much of the problem.”

If regression discontinuity design with covariate matching is found to be sound, Yoon would then develop simulations to determine the design conditions under which the method works well, such as the necessary sample size and the best ways to handle missing covariate data.
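A simulation study of that kind typically generates many artificial data sets under a known true effect, varies a design condition such as sample size, and records how accurate and variable the estimates are. The sketch below is a hypothetical example of that workflow; the data-generating process, sample sizes and number of replications are assumptions, not Yoon’s planned design.

    # Hypothetical Monte Carlo sketch for probing design conditions such as sample size.
    import numpy as np
    import statsmodels.api as sm

    def one_replication(n, cutoff=30, bandwidth=5, true_effect=5, rng=None):
        """Simulate one data set and return the covariate-adjusted local RDD estimate."""
        score = rng.uniform(0, 60, n)
        treated = (score < cutoff).astype(int)
        pretest = 0.8 * score + rng.normal(0, 5, n)    # covariate correlated with the score
        posttest = (20 + 0.4 * score + 0.3 * pretest
                    + true_effect * treated + rng.normal(0, 5, n))

        near = np.abs(score - cutoff) <= bandwidth
        centered = score[near] - cutoff
        X = sm.add_constant(np.column_stack([treated[near], centered, pretest[near]]))
        return sm.OLS(posttest[near], X).fit().params[1]

    rng = np.random.default_rng(42)
    for n in (500, 1000, 2000):                        # candidate sample sizes
        estimates = np.array([one_replication(n, rng=rng) for _ in range(200)])
        print(f"n={n}: bias={estimates.mean() - 5:+.2f}, SD={estimates.std():.2f}")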

“Educators and policymakers need to know whether an intervention works for everyone — especially those most in need,” Yoon said.

Learn more about this project in the CYFS Research Network. This study aligns with the UNL Grand Challenges of early childhood education and development.

