The Three Ghosts of Medical AI: Can the Black-Box Present Deliver?
Artificial Intelligence in Medicine (IF 7.5). Pub Date: 2021-08-28. DOI: 10.1016/j.artmed.2021.102158
Thomas P. Quinn, Stephan Jacobs, Manisha Senadeera, Vuong Le, Simon Coghlan

Our title alludes to the three Christmas ghosts encountered by Ebenezer Scrooge in A Christmas Carol, who guide Ebenezer through the past, present, and future of Christmas holiday events. Similarly, our article takes readers on a journey through the past, present, and future of medical AI. In doing so, we focus on the crux of modern machine learning: the reliance on powerful but intrinsically opaque models. When applied to the healthcare domain, these models fail to meet the transparency needs of their clinician and patient end-users. We review the implications of this failure, and argue that opaque models (1) lack quality assurance, (2) fail to elicit trust, and (3) restrict physician-patient dialogue. We then discuss how upholding transparency in all aspects of model design and model validation can help ensure the reliability and success of medical AI.


Updated: 2021-08-29