Convolutional multi-head self-attention on memory for aspect sentiment classification
IEEE/CAA Journal of Automatica Sinica (IF 15.3) Pub Date: 2020-06-29, DOI: 10.1109/jas.2020.1003243
Yaojie Zhang, Bing Xu, Tiejun Zhao

This paper presents a method for aspect-based sentiment classification tasks, named the convolutional multi-head self-attention memory network (CMA-MemNet). This is an improved model based on memory networks, and it makes it possible to extract richer and more complex semantic information from sequences and aspects. To fix the memory network's inability to capture context-related information at the word level, we propose utilizing convolution to capture n-gram grammatical information. We use multi-head self-attention to make up for the memory network's tendency to ignore the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with several popular baseline methods, our model performs excellently.
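To make the two core components of the abstract concrete (convolution for word-level n-gram context, multi-head self-attention over the resulting memory), the following is a minimal PyTorch sketch. The module name, dimensions, and hyperparameters are hypothetical illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ConvMultiHeadSelfAttention(nn.Module):
    """Illustrative sketch (not the authors' code): a 1-D convolution
    extracts n-gram features from word embeddings, then multi-head
    self-attention relates positions across the whole sequence."""

    def __init__(self, embed_dim=300, num_heads=6, kernel_size=3):
        super().__init__()
        # Same-padding convolution captures word-level n-gram context
        # while keeping the sequence length unchanged.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          batch_first=True)

    def forward(self, x):  # x: (batch, seq_len, embed_dim)
        # Conv1d expects (batch, channels, seq_len).
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        # Queries, keys, and values all come from the convolved memory,
        # so there is no recurrence and the computation stays parallel
        # across positions, unlike RNN/LSTM/GRU models.
        out, _ = self.attn(h, h, h)
        return out

# Toy usage: a batch of 2 sentences, 10 tokens, 300-d embeddings.
x = torch.randn(2, 10, 300)
print(ConvMultiHeadSelfAttention()(x).shape)  # torch.Size([2, 10, 300])
```

Note the parallelism claim from the abstract: both the convolution and the attention operate on all positions at once, so no sequential state needs to be unrolled as in RNN-family models.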
