Multi-turn dialogue-oriented pretrained question generation model
Complex & Intelligent Systems (IF 5.0) Pub Date: 2020-05-16, DOI: 10.1007/s40747-020-00147-2
Yanmeng Wang, Wenge Rong, Jianfei Zhang, Shijie Zhou, Zhang Xiong

In recent years, teaching machines to ask meaningful and coherent questions has attracted considerable attention in natural language processing. Question generation has found wide applications in areas such as education (testing knowledge) and chatbots (enhancing interaction). Following previous studies on conversational question generation, we propose a pretrained encoder–decoder model that incorporates semantic information from both the passage and hidden conversation representations. We adopt BERT as the encoder to combine the external text with the dialogue history, and we design a multi-head attention-based decoder that draws on the semantic information from both the text and the hidden dialogue representations during decoding, thereby generating coherent questions. Experiments on conversational question generation and document-grounded dialogue response generation tasks show that the proposed model outperforms baseline models on both standard metrics and human evaluation.
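
As a concrete illustration of the architecture the abstract describes, below is a minimal sketch in PyTorch with Hugging Face transformers: a BERT encoder over the concatenated passage and dialogue history, and a multi-head-attention Transformer decoder that cross-attends to the encoder's hidden states while generating the question. The class name, hyperparameters, and sample inputs are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of a BERT-encoder / multi-head-attention-decoder question generator.
# Assumes the passage and dialogue history are packed into one BERT input
# ("[CLS] passage [SEP] history [SEP]"); all names here are hypothetical.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class QuestionGenerator(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", num_layers=6, num_heads=8):
        super().__init__()
        self.encoder = BertModel.from_pretrained(bert_name)
        hidden = self.encoder.config.hidden_size
        layer = nn.TransformerDecoderLayer(
            d_model=hidden, nhead=num_heads, batch_first=True
        )
        # Each decoder layer carries multi-head self-attention over the
        # partial question plus cross-attention over the encoder states.
        self.decoder = nn.TransformerDecoder(layer, num_layers=num_layers)
        self.embed = nn.Embedding(self.encoder.config.vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, self.encoder.config.vocab_size)

    def forward(self, input_ids, attention_mask, decoder_input_ids):
        # BERT jointly encodes the passage and the dialogue history.
        memory = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        tgt = self.embed(decoder_input_ids)
        # Causal mask: each target position sees only earlier question tokens.
        n = tgt.size(1)
        causal = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        out = self.decoder(
            tgt, memory, tgt_mask=causal,
            memory_key_padding_mask=attention_mask.eq(0),
        )
        return self.lm_head(out)  # per-token vocabulary logits

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = QuestionGenerator()
# Passage and dialogue history share one input sequence, split by [SEP].
enc = tokenizer("The Eiffel Tower is in Paris.",
                "Who built it? Gustave Eiffel.", return_tensors="pt")
dec_ids = tokenizer("where is it", return_tensors="pt").input_ids
logits = model(enc.input_ids, enc.attention_mask, dec_ids)
print(logits.shape)  # (1, target_len, vocab_size)
```

In this sketch a single BERT pass encodes passage and history together; the hidden dialogue representations the abstract mentions could equally be produced by a separate encoder and attended to through an additional cross-attention stream, which is one plausible reading of the described design.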



