Asymdystopia: The Threat of Small Biases in Evaluations of Education Interventions That Need to Be Powered to Detect Small Impacts
Journal of Research on Educational Effectiveness (IF 2.217), Pub Date: 2021-04-16, DOI: 10.1080/19345747.2020.1849480
John Deke, Thomas Wei, Tim Kautz

Abstract

Evaluators of education interventions are increasingly designing studies to detect impacts much smaller than the 0.20 standard deviations that Cohen characterized as “small.” While the need to detect smaller impacts is based on compelling arguments that such impacts are substantively meaningful, the drive to detect smaller impacts may create a new challenge for researchers: the need to guard against smaller biases. The purpose of this article is twofold. First, we examine the potential for small biases to increase the risk of making false inferences as studies are powered to detect smaller impacts, a phenomenon we refer to as asymdystopia. We examine this potential for two of the most rigorous designs commonly used in education research—randomized controlled trials and regression discontinuity designs. Second, we recommend strategies researchers can use to avoid or mitigate these biases.
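The core argument can be made concrete with a back-of-the-envelope calculation (illustrative only, not taken from the paper): a study sized so its minimum detectable effect size (MDES) is reached at 80% power with a two-sided 5% test has a standard error of roughly MDES/2.8, so a fixed bias of, say, 0.02 standard deviations grows relative to the standard error as the MDES shrinks, inflating the false-positive rate when the true impact is zero. A minimal Python sketch under those assumed design constants:

from scipy.stats import norm

ALPHA = 0.05   # two-sided significance level (assumed for illustration)
POWER = 0.80   # power at which the MDES is defined (assumed)
MULT = norm.ppf(1 - ALPHA / 2) + norm.ppf(POWER)   # approx. 2.80

def false_positive_rate(mdes, bias):
    """Chance of a statistically significant finding when the true impact
    is zero but the estimator carries a fixed bias, for a study whose
    standard error was set so that `mdes` is detectable at the assumed power."""
    se = mdes / MULT                   # implied standard error of the impact estimate
    crit = norm.ppf(1 - ALPHA / 2)     # two-sided critical value
    shift = bias / se                  # bias measured in standard-error units
    return norm.cdf(-crit - shift) + (1 - norm.cdf(crit - shift))

for mdes in (0.20, 0.10, 0.05):
    rate = false_positive_rate(mdes, bias=0.02)
    print(f"MDES = {mdes:.2f}: false-positive rate with a 0.02 SD bias = {rate:.3f}")

With these illustrative numbers the false-positive rate climbs from about 6% at an MDES of 0.20 to roughly 20% at an MDES of 0.05, which is the kind of inferential degradation the abstract refers to as asymdystopia.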



Updated: 2021-04-16