The Solution to the Pervasive Bias and Discrimination in the Criminal Justice System: Transparent and Fair Artificial Intelligence
American Criminal Law Review ( IF 3.455 ) Pub Date : 2022-12-01
Mirko Bagaric, Jennifer Svilar, Melissa Bull, Dan Hunter, Nigel Stobbs

Algorithms are increasingly used in the criminal justice system for a range of important matters, including determining the sentence that should be imposed on offenders; whether offenders should be released early from prison; and the locations where police should patrol. The use of algorithms in this domain has been severely criticized on a number of grounds, including that they are inaccurate and discriminate against minority groups. Algorithms are used widely in relation to many other social endeavors, including flying planes and assessing eligibility for loans and insurance. In fact, most people regularly use algorithms in their day-to-day lives. Google Maps is an algorithm, as are Siri, weather forecasts, and automatic pilots. The criminal justice system is one of the few human activities that has not substantially embraced the use of algorithms. This Article explains why the criticisms that have been leveled against the use of algorithms in the criminal justice domain are flawed. The manner in which algorithms operate is generally misunderstood. Algorithms are not autonomous machine applications or processes. Instead, they are developed and programmed by people, and their efficacy is determined by the quality of the design process. Intelligently designed algorithms can replicate human cognitive processing, but they have a number of advantages, including the speed at which they process information. Also, because they do not have feelings, they are more objective and predictable than people in their decision-making. They are a core component of overcoming the pervasive bias and discrimination that exists in the criminal justice system.

Updated: 2022-02-03