Consumer-Driven Explanations for Machine Learning Decisions: An Empirical Study of Robustness
arXiv - CS - Machine Learning. Pub Date: 2020-01-13, DOI: arxiv-2001.05573
Michael Hind, Dennis Wei, Yunfeng Zhang

Many proposed methods for explaining machine learning predictions are in fact challenging for nontechnical consumers to understand. This paper builds upon an alternative consumer-driven approach called TED that asks for explanations to be provided with the training data, along with target labels. Using semi-synthetic data from credit approval and employee retention applications, experiments are conducted to investigate some practical considerations with TED, including its performance with different classification algorithms, varying numbers of explanations, and variability in explanations. A new algorithm is proposed to handle the case where some training examples do not have explanations. Our results show that TED is robust to increasing numbers of explanations, noisy explanations, and large fractions of missing explanations, thus making advances toward its practical deployment.
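As a rough illustration only (not the authors' implementation or experimental setup), the simplest instantiation of the TED idea can be sketched as follows: each training example carries both a target label and a consumer-provided explanation, the (label, explanation) pair is encoded as a single composite class, an off-the-shelf classifier is trained on these composite classes, and its predictions are decoded back into a label plus an explanation. The toy data, feature values, and classifier choice below are illustrative assumptions.

```python
# Minimal sketch of the TED "composite class" idea: treat (label, explanation)
# pairs as single classes for an off-the-shelf classifier.
# The data and classifier choice here are illustrative assumptions, not the
# setup used in the paper's experiments.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy training data: features X, target labels y, and consumer-provided
# explanation IDs e (e.g., reasons attached to a credit approval decision).
X = np.array([[0.9, 0.1], [0.8, 0.3], [0.2, 0.7], [0.1, 0.9]])
y = np.array([1, 1, 0, 0])          # approve / deny
e = np.array([0, 1, 2, 3])          # explanation ID for each example

# Encode each (label, explanation) pair as one composite class.
pairs = list(zip(y, e))
pair_to_class = {p: i for i, p in enumerate(sorted(set(pairs)))}
class_to_pair = {i: p for p, i in pair_to_class.items()}
yc = np.array([pair_to_class[p] for p in pairs])

# Train any standard classifier on the composite classes.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, yc)

# Decode predictions back into (label, explanation) pairs.
pred_label, pred_expl = zip(*(class_to_pair[c] for c in clf.predict(X)))
print(pred_label, pred_expl)
```

A usage note on this sketch: because the classifier is entirely standard, the robustness questions the paper studies (more explanation classes, noisy explanations, missing explanations) reduce to how well that classifier copes with a larger, noisier, or partially unlabeled composite-class space.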

Last updated: 2020-01-17