Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation
ACM Transactions on Intelligent Systems and Technology (IF 5) Pub Date: 2020-05-29, DOI: 10.1145/3384675
Yingying Zhang, Quan Fang, Shengsheng Qian, Changsheng Xu

Natural language generation has become a fundamental task in dialogue systems. RNN-based response generation methods encode the dialogue context and decode it into a response, but they tend to generate dull, generic responses. In this article, we propose a novel framework, KAWA-DRG (Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation), which models conversation-specific external knowledge and the varying importance of the dialogue context in a unified adversarial encoder-decoder learning framework. In KAWA-DRG, a co-attention mechanism attends to the important parts within and among context utterances through word-utterance-level attention. Prior knowledge is integrated into a conditional Wasserstein auto-encoder for learning the latent variable space, and the posterior and prior distributions of the latent variables are trained through adversarial learning. We evaluate our model on Switchboard, DailyDialog, In-Car Assistant, and the Ubuntu Dialogue Corpus. Experimental results show that KAWA-DRG outperforms existing methods.
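The abstract names two core mechanisms: a word-utterance-level attention over the dialogue context, and an adversarially trained latent space in which a posterior distribution is matched to a knowledge-conditioned prior. The following is a minimal PyTorch sketch of those two ideas, not the paper's implementation; all names and sizes (WordUtteranceAttention, LatentCritic, hidden_dim, latent_dim, the 64-unit critic) are illustrative assumptions.

import torch
import torch.nn as nn

class WordUtteranceAttention(nn.Module):
    # Hypothetical two-level attention: word-level attention pools each
    # utterance into a vector; utterance-level attention pools the context.
    def __init__(self, hidden_dim):
        super().__init__()
        self.word_scorer = nn.Linear(hidden_dim, 1)
        self.utt_scorer = nn.Linear(hidden_dim, 1)

    def forward(self, word_states):
        # word_states: (num_utterances, num_words, hidden_dim) encoder outputs
        word_weights = torch.softmax(self.word_scorer(word_states), dim=1)
        utt_vectors = (word_weights * word_states).sum(dim=1)   # (U, H)
        utt_weights = torch.softmax(self.utt_scorer(utt_vectors), dim=0)
        return (utt_weights * utt_vectors).sum(dim=0)           # (H,) context vector

class LatentCritic(nn.Module):
    # Critic scoring latent samples; trained adversarially so that the
    # posterior over z is pushed toward the knowledge-conditioned prior.
    def __init__(self, latent_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, z):
        return self.net(z)

# Wasserstein-style critic objective: minimizing this widens the score gap
# between prior and posterior samples; the encoder then trains against it.
def critic_loss(critic, z_prior, z_posterior):
    return critic(z_posterior).mean() - critic(z_prior).mean()

As a usage example, WordUtteranceAttention(256)(torch.randn(3, 12, 256)) pools a 3-utterance, 12-word context into a single 256-dimensional vector.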

Updated: 2020-05-29