Self-Attention Networks for Intent Detection
arXiv - CS - Computation and Language Pub Date : 2020-06-28 , DOI: arxiv-2006.15585
Sevinj Yolchuyeva, Géza Németh, Bálint Gyires-Tóth

Self-attention networks (SANs) have shown promising performance in various Natural Language Processing (NLP) tasks, especially machine translation. A key strength of SANs is their ability to capture long-range and multi-scale dependencies in the data. In this paper, we present a novel intent detection system based on a self-attention network and a Bi-LSTM. Compared with previous solutions, our approach improves results by using a transformer model and a deep averaging network-based universal sentence encoder. We evaluate the system on the Snips, Smart Speaker, Smart Lights, and ATIS datasets using several evaluation metrics, and compare the performance of the proposed model with an LSTM on the same datasets.
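The core mechanism the abstract relies on, self-attention relating every token to every other token regardless of distance, can be sketched in plain NumPy. This is a minimal illustration of scaled dot-product self-attention, not the paper's actual model; the sequence length, embedding size, and random weights are all hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # (T, T) score matrix: every token attends to every other token,
    # which is what lets SANs capture long-range dependencies directly
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
T, d = 5, 8                        # 5 tokens, 8-dim embeddings (illustrative sizes)
X = rng.normal(size=(T, d))        # stand-in for token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

out, w = self_attention(X, Wq, Wk, Wv)
print(out.shape, w.shape)          # contextualized tokens and the attention map
```

In an intent detection pipeline like the one described, such contextualized token representations would then be pooled (or combined with a Bi-LSTM's states) and fed to a classifier over the intent labels.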

Updated: 2020-06-30