Multi-turn intent determination and slot filling with neural networks and regular expressions
Knowledge-Based Systems ( IF 8.8 ) Pub Date : 2020-09-18 , DOI: 10.1016/j.knosys.2020.106428
Waheed Ahmed Abro , Guilin Qi , Zafar Ali , Yansong Feng , Muhammad Aamir

Intent determination and slot filling are two prominent research areas related to natural language understanding (NLU). In a multi-turn NLU system, contextual information from the dialogue history is exploited to mitigate the ambiguity of user utterances. State-of-the-art models employ memory networks to encode the dialogue context, which is used by neural networks for determining user intent and the associated slots. However, these methods rely on a large amount of labelled data, whereas in practice only limited labelled data is often available. To address this problem, we propose a multi-task learning model based on neural networks and regular expressions (REs) that jointly performs the intent determination and slot filling tasks. The proposed model integrates neural networks with REs to encode domain knowledge and handle cases with a limited amount of labelled data in an end-to-end trainable manner. More specifically, the model employs a pre-trained BERT model to obtain contextual word representations of user utterances. These representations are utilized by a memory network to encode multi-turn information, which is shared by the two tasks. Furthermore, a convolutional neural network (CNN) and a recurrent neural network (RNN) are applied to the contextual word representations and the dialogue context for the intent determination and slot filling tasks, respectively. These neural networks are then combined with REs that encode domain knowledge about a particular intent or slot value. Finally, the two neural networks are trained simultaneously by minimizing the joint loss. Extensive experiments on the Key-Value Retrieval and Frames datasets show that the proposed model outperforms baseline methods on both tasks while requiring modest human effort.
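The abstract describes combining neural network predictions with REs that encode domain knowledge about intents. A minimal sketch of one common NN+RE fusion scheme is shown below: RE indicator features are added, with a learnable or fixed weight, to the network's output logits before the softmax, so domain knowledge can compensate for a weakly trained classifier when labelled data is scarce. The intent names, RE patterns, and the `weight` parameter are illustrative assumptions, not details taken from the paper.

```python
import math
import re

# Hypothetical intent REs: each pattern votes for one intent class.
# (Intent names and patterns are illustrative, not from the paper.)
INTENT_RES = {
    "find_restaurant": re.compile(r"\b(restaurant|food|eat)\b", re.I),
    "get_directions": re.compile(r"\b(route|directions|navigate)\b", re.I),
    "check_weather": re.compile(r"\b(weather|rain|forecast)\b", re.I),
}
INTENTS = list(INTENT_RES)

def re_features(utterance):
    """1.0 for each intent whose RE fires on the utterance, else 0.0."""
    return [1.0 if INTENT_RES[i].search(utterance) else 0.0 for i in INTENTS]

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(nn_logits, utterance, weight=2.0):
    """Add weighted RE indicator features to the network's output logits
    before the softmax (one common NN+RE fusion scheme; the paper's exact
    fusion may differ)."""
    feats = re_features(utterance)
    return [l + weight * f for l, f in zip(nn_logits, feats)]

# Usage: an (assumed) barely trained network emits near-uniform logits,
# but the RE for "check_weather" fires on "rain" and tips the decision.
nn_logits = [0.1, 0.0, 0.05]
probs = softmax(fuse(nn_logits, "will it rain tomorrow?"))
pred = INTENTS[probs.index(max(probs))]
print(pred)  # check_weather
```

The same idea extends to slot filling by attaching RE features to per-token logits rather than to a single utterance-level prediction.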
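The joint training step, minimizing a combined loss over both tasks, can be sketched as a weighted sum of the intent cross-entropy and the per-token slot cross-entropy. The interpolation weight `alpha` and the averaging over slot tokens are assumptions for illustration; the paper may weight and aggregate the two terms differently.

```python
import math

def cross_entropy(probs, gold):
    """Negative log-likelihood of the gold class under a probability vector."""
    return -math.log(probs[gold])

def joint_loss(intent_probs, intent_gold, slot_probs_seq, slot_gold_seq, alpha=0.5):
    """Multi-task objective: interpolate the utterance-level intent loss with
    the mean per-token slot loss (alpha is an assumed weight)."""
    l_intent = cross_entropy(intent_probs, intent_gold)
    l_slot = sum(
        cross_entropy(p, g) for p, g in zip(slot_probs_seq, slot_gold_seq)
    ) / len(slot_gold_seq)
    return alpha * l_intent + (1 - alpha) * l_slot

# Usage: one utterance with a single slot token; both predictions assign
# probability 0.5 to the gold label, so the joint loss equals ln(2).
loss = joint_loss([0.25, 0.5, 0.25], 1, [[0.5, 0.5]], [0])
```

Minimizing this single scalar with gradient descent updates the shared BERT/memory-network encoder and both task heads simultaneously, which is what makes the model end-to-end trainable.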




Updated: 2020-09-20