Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction
Engaging Science, Technology, and Society (IF 1.0) Pub Date: 2019-03-23, DOI: 10.17351/ests2019.260
Madeleine Clare Elish

As debates about the policy and ethical implications of AI systems grow, it will be increasingly important to accurately locate who is responsible when agency is distributed in a system and control over an action is mediated through time and space. Analyzing several high-profile accidents involving complex and automated socio-technical systems and the media coverage that surrounded them, I introduce the concept of a moral crumple zone to describe how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system. Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system may become simply a component—accidentally or intentionally—that bears the brunt of the moral and legal responsibilities when the overall system malfunctions. While the crumple zone in a car is meant to protect the human driver, the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator. The concept is both a challenge to and an opportunity for the design and regulation of human-robot systems. At stake in articulating moral crumple zones is not only the misattribution of responsibility but also the ways in which new forms of consumer and worker harm may develop in new complex, automated, or purportedly autonomous technologies.

Updated: 2019-03-23