A Multi-Attention Matching Model for Multiple-Choice Reading Comprehension
International Journal of Cooperative Information Systems (IF 1.5), Pub Date: 2020-01-31, DOI: 10.1142/s0218843019500102
Liguo Duan, Jianying Gao, Aiping Li

Multiple-choice machine reading comprehension, which requires selecting the correct answer from a set of candidates, depends on capturing the interaction semantics between the given passage and the question. In this paper, we propose an end-to-end deep learning model. It employs a Bi-GRU to contextually encode the passage and the question, and models the complex interactions between them with six kinds of attention functions: concatenated attention, bilinear attention, element-wise dot attention, minus attention, and the bi-directional attentions Query2Context and Context2Query. We then apply a multi-level attention transfer reasoning mechanism to obtain more accurate comprehensive semantics. To demonstrate the validity of our model, we conducted experiments on the large reading comprehension dataset RACE. The experimental results show that our model surpasses many state-of-the-art systems on RACE and exhibits good reasoning ability.
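The attention functions named in the abstract can be illustrated with a small NumPy sketch. The function names, weight shapes, and scoring forms below are assumptions based on common formulations in the matching literature (the paper's exact parameterization may differ): each function scores every passage state against every question state, then attends over the question.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# P: passage states (m, d); Q: question states (n, d).
# Each function returns question-aware passage representations (m, d).

def concat_attention(P, Q, W, v):
    # score_ij = v^T tanh(W [p_i ; q_j])  -- concatenated attention
    m, n = P.shape[0], Q.shape[0]
    cat = np.concatenate(
        [np.repeat(P[:, None, :], n, axis=1),
         np.repeat(Q[None, :, :], m, axis=0)], axis=-1)   # (m, n, 2d)
    scores = np.tanh(cat @ W) @ v                          # (m, n)
    return softmax(scores, axis=1) @ Q

def bilinear_attention(P, Q, W):
    # score_ij = p_i^T W q_j  -- bilinear attention
    scores = P @ W @ Q.T                                   # (m, n)
    return softmax(scores, axis=1) @ Q

def dot_attention(P, Q, v):
    # score_ij = v^T (p_i * q_j)  -- element-wise dot attention
    scores = np.einsum('md,nd,d->mn', P, Q, v)             # (m, n)
    return softmax(scores, axis=1) @ Q

def minus_attention(P, Q, v):
    # score_ij = v^T (p_i - q_j)  -- minus attention
    scores = (P[:, None, :] - Q[None, :, :]) @ v           # (m, n)
    return softmax(scores, axis=1) @ Q
```

A matching model would typically concatenate or fuse the outputs of these functions (together with the Query2Context/Context2Query bi-directional attentions of BiDAF-style models) before further reasoning layers.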

Updated: 2020-01-31