Consumers Object to Algorithms Making Morally Relevant Tradeoffs Because of Algorithms’ Consequentialist Decision Strategies
Journal of Consumer Psychology (IF 4.551), Pub Date: 2021-07-07, DOI: 10.1002/jcpy.1266
Berkeley J. Dietvorst, Daniel M. Bartels

Why do consumers embrace some algorithms and find others objectionable? The moral relevance of the domain in which an algorithm operates plays a role. The authors find that consumers believe that algorithms are more likely than human decision makers to use maximization (i.e., attempting to maximize some measured outcome) as a decision-making strategy (Study 1). Consumers find this consequentialist decision strategy objectionable in morally relevant tradeoffs and, as a result, disapprove of algorithms making morally relevant tradeoffs (Studies 2, 3a, & 3b). Consumers also object to human employees making morally relevant tradeoffs when those employees are trained to make decisions by maximizing outcomes, consistent with the notion that objections to algorithmic decision makers stem from concerns about maximization (Study 4). The results provide insight into why consumers object to some consumer-relevant algorithms while adopting others.
