Inferential Text Generation with Multiple Knowledge Sources and Meta-Learning
arXiv - CS - Artificial Intelligence. Pub Date: 2020-04-07, DOI: arxiv-2004.03070
Daya Guo, Akari Asai, Duyu Tang, Nan Duan, Ming Gong, Linjun Shou, Daxin Jiang, Jian Yin and Ming Zhou

We study the problem of generating inferential texts about events for a variety of commonsense relations such as \textit{if-else}. Existing approaches typically use limited evidence from training examples and learn each relation individually. In this work, we use multiple knowledge sources as fuel for the model. Existing commonsense knowledge bases such as ConceptNet are dominated by taxonomic knowledge (e.g., the \textit{isA} and \textit{relatedTo} relations) and contain only a limited amount of inferential knowledge. We use not only structured commonsense knowledge bases but also natural language snippets from search-engine results. These sources are incorporated into a generative base model via a key-value memory network. In addition, we introduce a meta-learning based multi-task learning algorithm: for each targeted commonsense relation, we regard learning from examples of the other relations as the meta-training process, and evaluation on examples of the targeted relation as the meta-test process. We conduct experiments on the Event2Mind and ATOMIC datasets. Results show that both the integration of multiple knowledge sources and the use of the meta-learning algorithm improve performance.
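The relation-level meta-learning setup described above — adapting from the other relations (meta-training) before evaluating on the held-out target relation (meta-test) — can be sketched in miniature. The toy linear model, squared-error loss, and Reptile-style first-order update below are illustrative assumptions for exposition, not the paper's actual architecture or algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=32):
    """One 'relation' = one task: inputs x, targets y = x @ w_true + noise."""
    x = rng.normal(size=(n, 2))
    y = x @ w_true + 0.01 * rng.normal(size=n)
    return x, y

def inner_update(w, x, y, lr=0.1, steps=5):
    """A few gradient steps of squared-error loss on a single task."""
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w

def meta_train(tasks, w0, meta_lr=0.5, epochs=50):
    """Reptile-style outer loop: pull the shared init toward each
    task's adapted weights, so it transfers across relations."""
    w = w0
    for _ in range(epochs):
        for x, y in tasks:
            w_adapted = inner_update(w, x, y)
            w = w + meta_lr * (w_adapted - w)
    return w

# Meta-training: the relations other than the target (similar true weights).
other_relations = [
    make_task(np.array([1.0, -1.0]) + 0.1 * rng.normal(size=2))
    for _ in range(4)
]
# Meta-test: the targeted relation, held out of meta-training.
target_x, target_y = make_task(np.array([1.0, -1.0]))

w_init = np.zeros(2)
w_meta = meta_train(other_relations, w_init)

# Adapt the meta-learned init on the target relation, vs. training from scratch.
w_final = inner_update(w_meta, target_x, target_y)
mse_meta = np.mean((target_x @ w_final - target_y) ** 2)
mse_scratch = np.mean(
    (target_x @ inner_update(w_init, target_x, target_y) - target_y) ** 2
)
```

Because the relations share structure, the meta-learned initialization adapts to the target relation with the same few inner steps far better than a cold start — the transfer effect the abstract attributes to treating other relations as meta-training.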

Updated: 2020-04-16