ALICE: Active Learning with Contrastive Natural Language Explanations
arXiv - CS - Computation and Language. Pub Date: 2020-09-22, DOI: arxiv-2009.10259
Weixin Liang, James Zou, Zhou Yu

Training a supervised neural network classifier typically requires many annotated training samples. Collecting and annotating a large number of data points is costly and sometimes even infeasible. The traditional annotation process uses a low-bandwidth human-machine communication interface: classification labels, each of which provides only a few bits of information. We propose Active Learning with Contrastive Explanations (ALICE), an expert-in-the-loop training framework that uses contrastive natural language explanations to improve data efficiency in learning. ALICE first uses active learning to select the most informative pairs of label classes and elicits contrastive natural language explanations about them from experts. It then extracts knowledge from these explanations using a semantic parser. Finally, it incorporates the extracted knowledge by dynamically changing the learning model's structure. We applied ALICE to two visual recognition tasks: bird species classification and social relationship classification. We found that, by incorporating contrastive explanations, our models outperform baseline models trained with 40-100% more training data. Adding one explanation leads to a performance gain similar to adding 13-30 labeled training data points.
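To make the described pipeline concrete, below is a minimal, hypothetical Python sketch of one ALICE round: pick the most-confused class pair from a confusion matrix, elicit a contrastive explanation for that pair, extract attribute phrases with a toy stand-in for the semantic parser, and attach them to the model as a local discriminator for the pair. All function names (select_confusing_pair, parse_explanation, attach_local_head) and the attribute-extraction heuristic are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one ALICE round, under the assumptions stated above.
import numpy as np

def select_confusing_pair(confusion: np.ndarray) -> tuple:
    """Pick the most-confused pair of classes (largest off-diagonal mass),
    i.e. the pair for which a contrastive explanation is most informative."""
    off_diag = confusion + confusion.T
    np.fill_diagonal(off_diag, 0)
    i, j = np.unravel_index(np.argmax(off_diag), off_diag.shape)
    return int(i), int(j)

def parse_explanation(text: str) -> list:
    """Toy stand-in for the semantic parser: split a contrastive sentence of the
    form 'X has ... while Y has ...' into discriminative attribute phrases."""
    parts = text.lower().replace("while", "|").split("|")
    return [p.strip() for p in parts if p.strip()]

def attach_local_head(model: dict, class_pair: tuple, attributes: list) -> None:
    """Stand-in for 'dynamically changing the model structure': record an
    attribute-based local discriminator for the confusing class pair."""
    model.setdefault("local_heads", {})[class_pair] = attributes

# One expert-in-the-loop round on a toy 3-class confusion matrix.
confusion = np.array([[50, 8, 1],
                      [9, 48, 2],
                      [0, 3, 55]])
model = {"local_heads": {}}
pair = select_confusing_pair(confusion)  # here: classes 0 and 1
explanation = "class 0 has a red crown while class 1 has a striped breast"
attach_local_head(model, pair, parse_explanation(explanation))
print(pair, model["local_heads"][pair])
```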

Updated: 2020-10-02