S2SAN: A sentence-to-sentence attention network for sentiment analysis of online reviews
Decision Support Systems (IF 6.7), Pub Date: 2021-05-21, DOI: 10.1016/j.dss.2021.113603
Ping Wang, Jiangnan Li, Jingrui Hou

Many existing attention-based deep learning approaches to sentiment analysis have focused on words and represent an entire review text as a word sequence. However, these approaches overlook the differences in the importance of each sentence to the complete text. Some work has addressed this by computing sentence-level attention, but these studies reuse the same approach applied at the word level, which introduces unnecessary sequential structures and increases the complexity of sentence representation. Therefore, in this paper, we propose a sentence-to-sentence attention network (S2SAN) using multihead self-attention. We conducted several domain-specific, cross-domain, and multidomain sentiment analysis experiments on real-world datasets. The experimental results show that S2SAN outperforms other state-of-the-art models. Some classical sentiment classifiers [e.g., convolutional neural network (CNN), recurrent neural network (RNN), and long short-term memory (LSTM) models] achieve better accuracies when they are reconfigured to include sentence-to-sentence attention.
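To make the core idea concrete, the following is a minimal numpy sketch (not the authors' implementation) of multihead self-attention applied directly to sentence embeddings, so attention weights relate sentences to sentences rather than words to words. All names (`multihead_self_attention`, the random projection matrices) are illustrative assumptions; in the actual model the projections would be learned.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multihead_self_attention(S, num_heads=2, seed=0):
    """Multihead self-attention over sentence embeddings.

    S: array of shape (n_sentences, d) — one embedding per sentence.
    Returns an array of the same shape, where each output row mixes
    information from every sentence, weighted by sentence-to-sentence
    attention scores.
    """
    n, d = S.shape
    assert d % num_heads == 0, "embedding dim must divide evenly across heads"
    dh = d // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned weight matrices W_Q, W_K, W_V, W_O.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Wo = rng.standard_normal((d, d)) / np.sqrt(d)
    Q, K, V = S @ Wq, S @ Wk, S @ Wv
    heads = []
    for h in range(num_heads):
        q = Q[:, h * dh:(h + 1) * dh]
        k = K[:, h * dh:(h + 1) * dh]
        v = V[:, h * dh:(h + 1) * dh]
        # (n, n) matrix: attention of each sentence to every other sentence.
        attn = softmax(q @ k.T / np.sqrt(dh))
        heads.append(attn @ v)
    # Concatenate the heads and apply the output projection.
    return np.concatenate(heads, axis=1) @ Wo

# Toy example: 3 sentences, 4-dimensional embeddings.
S = np.arange(12, dtype=float).reshape(3, 4)
out = multihead_self_attention(S)
print(out.shape)  # (3, 4)
```

Because the sequence being attended over is the list of sentences (typically tens of items) rather than the full word sequence (hundreds of tokens), the attention matrix stays small and no recurrent structure over sentences is needed.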




Updated: 2021-05-21