Multi‐layered attentional peephole convolutional LSTM for abstractive text summarization
ETRI Journal ( IF 1.4 ) Pub Date : 2020-12-18 , DOI: 10.4218/etrij.2019-0016
Md. Motiur Rahman, Fazlul Hasan Siddiqui

Abstractive text summarization produces a summary of a given text by paraphrasing its facts while keeping the meaning intact. Manual summary generation is laborious and time-consuming. We present here a summary generation model based on a multilayered attentional peephole convolutional long short-term memory (MAPCoL) network that extracts abstractive summaries of long texts automatically. We add attention to a peephole convolutional LSTM to improve the overall quality of a summary by weighting the important parts of the source text during training. We evaluated the semantic coherence of our MAPCoL model on the popular CNN/Daily Mail dataset and found that MAPCoL outperformed other traditional LSTM-based models. We also observed performance improvements of MAPCoL under different internal settings when compared with state-of-the-art abstractive text summarization models.
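The two ingredients the abstract combines can be sketched in a few lines: a peephole LSTM cell, whose gates also read the cell state through the peephole weights, and an attention step that weights encoder states by their relevance to the current decoder state. The sketch below is a minimal NumPy illustration of these standard building blocks, not the authors' MAPCoL implementation; all names, dimensions, and the toy random inputs are illustrative assumptions (the paper's model is also convolutional and multilayered, which is omitted here).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, p):
    """One peephole LSTM step: unlike a vanilla LSTM, the input, forget,
    and output gates also see the cell state via peephole weights w_c*."""
    i = sigmoid(p["Wxi"] @ x + p["Whi"] @ h_prev + p["wci"] * c_prev + p["bi"])
    f = sigmoid(p["Wxf"] @ x + p["Whf"] @ h_prev + p["wcf"] * c_prev + p["bf"])
    c = f * c_prev + i * np.tanh(p["Wxc"] @ x + p["Whc"] @ h_prev + p["bc"])
    o = sigmoid(p["Wxo"] @ x + p["Who"] @ h_prev + p["wco"] * c + p["bo"])
    h = o * np.tanh(c)
    return h, c

def attention_context(h_dec, enc_states):
    """Dot-product attention: softmax over encoder states' similarity to the
    decoder state yields weights; the context is the weighted sum."""
    scores = enc_states @ h_dec
    w = np.exp(scores - scores.max())
    w = w / w.sum()
    return w @ enc_states

# Toy setup: random parameters for a 4-dim input, 3-dim hidden state.
rng = np.random.default_rng(0)
d_x, d_h = 4, 3
p = {k: 0.1 * rng.standard_normal((d_h, d_x)) for k in ("Wxi", "Wxf", "Wxc", "Wxo")}
p.update({k: 0.1 * rng.standard_normal((d_h, d_h)) for k in ("Whi", "Whf", "Whc", "Who")})
p.update({k: 0.1 * rng.standard_normal(d_h)
          for k in ("wci", "wcf", "wco", "bi", "bf", "bc", "bo")})

# Encode a toy five-token "document", then attend over the encoder states.
h, c = np.zeros(d_h), np.zeros(d_h)
enc_states = []
for _ in range(5):
    h, c = peephole_lstm_step(rng.standard_normal(d_x), h, c, p)
    enc_states.append(h)
ctx = attention_context(h, np.stack(enc_states))
```

During training of an attentional summarizer, such context vectors are fed to the decoder at each step, so gradients push the attention weights toward the parts of the source text that matter for the summary.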

Updated: 2020-12-18