Zero-shot domain adaptation for natural language inference by projecting superficial words out
Knowledge-Based Systems ( IF 7.2 ) Pub Date : 2021-06-14 , DOI: 10.1016/j.knosys.2021.107193
Wanyun Cui , Guangyu Zheng , Wei Wang

In natural language inference, the semantics of some words do not affect the inference. Such information is considered superficial and leads to overfitting. In this paper, we project the superficial information out to learn a more general representation, where generality refers to adaptation to different domains. The projection is built on the dropout framework. To project out all the information of recurring words, we propose a parameter-free model (HardDrop). Noticing further that the information of some recurring words needs to be preserved, we propose SoftDrop, which learns to cautiously project the information out. Our approaches outperform the competitors both on the source domain and on zero-shot target domains. For some target domains (e.g., from RTE to MRPC), the accuracy/F1-score increases from 47.6%/0.501 to 68.1%/0.771.
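The abstract does not give implementation details, but the core idea can be illustrated with a minimal sketch. The code below is a hypothetical interpretation, not the authors' actual model: it assumes "recurring words" are tokens shared between premise and hypothesis, that HardDrop zeroes their embeddings entirely (parameter-free), and that SoftDrop instead scales them by a gate value that would be learned in practice. Function names and signatures are invented for illustration.

```python
import numpy as np

def hard_drop(premise_tokens, hypothesis_tokens, premise_emb):
    """Hypothetical HardDrop sketch: project out ALL information of
    recurring words by zeroing their embedding rows. Parameter-free."""
    recurring = set(premise_tokens) & set(hypothesis_tokens)
    mask = np.array([0.0 if tok in recurring else 1.0
                     for tok in premise_tokens])[:, None]
    return premise_emb * mask

def soft_drop(premise_tokens, hypothesis_tokens, premise_emb, gate):
    """Hypothetical SoftDrop sketch: instead of dropping recurring
    words outright, scale them by a gate in [0, 1] (learned in the
    real model) so that useful information can be preserved."""
    recurring = set(premise_tokens) & set(hypothesis_tokens)
    mask = np.array([gate if tok in recurring else 1.0
                     for tok in premise_tokens])[:, None]
    return premise_emb * mask

# Toy usage: "the" recurs in both sentences, so its row is dropped
# (HardDrop) or attenuated (SoftDrop); other rows pass unchanged.
premise = ["the", "cat", "sleeps"]
hypothesis = ["the", "dog", "runs"]
emb = np.ones((3, 4))
hard_out = hard_drop(premise, hypothesis, emb)
soft_out = soft_drop(premise, hypothesis, emb, gate=0.5)
```

In this reading, HardDrop is a deterministic, structured form of dropout keyed to word recurrence rather than random masking, while SoftDrop relaxes the hard mask into a learnable interpolation.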




Updated: 2021-06-14