Title: Asymdystopia: The threat of small biases in evaluations of education interventions that need to be powered to detect small impacts
Description: Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen (1988) characterized as "small." While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts may create a new challenge for researchers: the need to guard against smaller inaccuracies (or "biases"). The purpose of this report is twofold. First, the report examines the potential for small biases to increase the risk of making false inferences as studies are powered to detect smaller impacts, a phenomenon the report calls asymdystopia. The report examines this potential for both randomized controlled trials (RCTs) and studies using regression discontinuity designs (RDDs). Second, the report recommends strategies researchers can use to avoid or mitigate these biases. For RCTs, the report recommends that evaluators either substantially limit attrition rates or offer a strong justification for why attrition is unlikely to be related to study outcomes. For RDDs, new statistical methods can protect against bias from incorrect regression models, but these methods often require larger sample sizes to detect small effects.
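The scaling behind this concern can be illustrated with a standard normal-approximation power calculation (a sketch for intuition, not a method from the report itself): halving the minimum detectable effect roughly quadruples the required sample size, so studies powered for very small impacts become very large, and correspondingly small biases can rival the impacts being estimated.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means (effect size in standard deviation units), using the standard
    normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / delta)^2."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = z(power)            # quantile corresponding to desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Halving the detectable effect roughly quadruples the required sample.
for mde in (0.20, 0.10, 0.05):
    print(f"MDE = {mde:.2f} SD -> about {n_per_group(mde)} per group")
```

Under these conventional assumptions (two-sided alpha of 0.05, 80 percent power, equal group sizes, no clustering), detecting 0.05 SD requires on the order of sixteen times the sample needed for 0.20 SD.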
Cover Date: October 2017
Web Release: October 3, 2017
Print Release: October 3, 2017
Publication #: NCEE 20184002
Center/Program: NCEE
Authors: John Deke, Thomas Wei, and Tim Kautz
Type of Product: Research Report
Questions: For questions about the content of this Research Report, please contact Amy Johnson.