What's Wrong with Machine Bias
Ergo, an Open Access Journal of Philosophy (IF 0.5). Pub Date: 2019-07-11. DOI: 10.3998/ergo.12405314.0006.015
Clinton Castro

Data-driven decision-making technologies used in the justice system to inform decisions about bail, parole, and prison sentencing are biased against historically marginalized groups (Angwin, Larson, Mattu, & Kirchner 2016). But these technologies' judgments—which reproduce patterns of wrongful discrimination embedded in the historical datasets that they are trained on—are well-evidenced. This presents a puzzle: how can we account for the wrong these judgments engender without also indicting morally permissible statistical inferences about persons? I motivate this puzzle and attempt an answer.

Updated: 2019-07-11