Dual Encoding for Abstractive Text Summarization
IEEE Transactions on Cybernetics (IF 11.8). Pub Date: 2020-03-01. DOI: 10.1109/tcyb.2018.2876317
Kaichun Yao, Libo Zhang, Dawei Du, Tiejian Luo, Lili Tao, Yanjun Wu

Recurrent neural network-based sequence-to-sequence attentional models have proven effective in abstractive text summarization. In this paper, we model abstractive text summarization with a dual encoding model. Unlike previous works that use only a single encoder, the proposed method employs a dual encoder consisting of a primary and a secondary encoder. Specifically, the primary encoder conducts coarse encoding in a regular way, while the secondary encoder models the importance of words and generates a finer encoding based on the raw input text and the previously generated output summary. The two levels of encoding are combined and fed into the decoder to generate more diverse summaries, which reduces the repetition that plagues long-sequence generation. Experimental results on two challenging datasets (i.e., CNN/DailyMail and DUC 2004) demonstrate that our dual encoding model performs favorably against existing methods.
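The two-pass encoding idea can be pictured with a short sketch. The PyTorch code below is a minimal illustration under our own assumptions: the class name `DualEncoder`, the dimensions, and the sigmoid importance gate are hypothetical choices, not the authors' implementation, and the sketch omits the paper's conditioning of the secondary encoder on the partially generated summary at decode time.

```python
# Minimal sketch of a dual encoder, assuming GRU encoders and a
# sigmoid importance gate; see the DOI above for the paper's details.
import torch
import torch.nn as nn

class DualEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Primary encoder: coarse, regular bidirectional pass over the source.
        self.primary = nn.GRU(emb_dim, hid_dim,
                              bidirectional=True, batch_first=True)
        # Secondary encoder: re-reads the source conditioned on the coarse
        # states to produce a finer encoding.
        self.secondary = nn.GRU(emb_dim + 2 * hid_dim, hid_dim,
                                bidirectional=True, batch_first=True)
        # Per-word importance gate (a stand-in for the paper's importance model).
        self.importance = nn.Linear(2 * hid_dim, 1)

    def forward(self, src_ids):
        emb = self.embed(src_ids)                    # (B, T, E)
        coarse, _ = self.primary(emb)                # (B, T, 2H)
        fine_in = torch.cat([emb, coarse], dim=-1)   # condition on coarse pass
        fine, _ = self.secondary(fine_in)            # (B, T, 2H)
        gate = torch.sigmoid(self.importance(fine))  # (B, T, 1) word importance
        # Combine the two encodings; a decoder would attend over `combined`.
        combined = coarse + gate * fine              # (B, T, 2H)
        return combined

if __name__ == "__main__":
    enc = DualEncoder(vocab_size=10000)
    tokens = torch.randint(0, 10000, (2, 20))        # a toy batch
    print(enc(tokens).shape)                         # torch.Size([2, 20, 512])
```

In the paper's setting, the fine encoding is additionally recomputed using the decoder's previously generated output, so word importance can shift as the summary is produced; the combined representation is then what the attentional decoder reads from.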

Updated: 2020-03-01