A Novel Deep Learning Method for Textual Sentiment Analysis
arXiv - CS - Computation and Language. Pub Date: 2021-02-23. DOI: arxiv-2102.11651
Hossein Sadr, Mozhdeh Nazari Solimandarabi, Mir Mohsen Pedram, Mohammad Teshnehlab

Sentiment analysis is one of the most crucial tasks in natural language processing, and the Convolutional Neural Network (CNN) is among the prominent models commonly used for it. Although convolutional neural networks have achieved remarkable results in recent years, they still face some limitations. First, they treat all words in a sentence as contributing equally to the sentence's meaning representation and cannot single out informative words. Second, they require a large amount of training data to achieve considerable results, since they have many parameters that must be tuned accurately. To this end, a convolutional neural network integrated with a hierarchical attention layer is proposed, which can extract informative words and assign them higher weights. Moreover, the effect of transfer learning, which transfers knowledge learned in a source domain to a target domain with the aim of improving performance, is also explored. Based on the empirical results, the proposed model not only achieves higher classification accuracy and extracts informative words, but applying incremental transfer learning also significantly enhances classification performance.
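To make the idea of combining convolutional feature extraction with word-level attention concrete, here is a minimal PyTorch sketch of such a model for sentence-level sentiment classification. The class name, layer sizes, the single attention layer, and all hyperparameters are illustrative assumptions, not the authors' exact architecture.

```python
# Sketch only: a CNN whose convolutional features are pooled by a learned
# attention layer, so informative words receive higher weights. Hypothetical
# sizes and a single (non-hierarchical) attention layer for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_filters=128,
                 kernel_size=3, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Convolution over the word dimension extracts local n-gram features.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # Attention scores each position; softmax turns scores into word weights.
        self.attn = nn.Linear(num_filters, 1)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                # (batch, embed_dim, seq_len)
        h = torch.relu(self.conv(x))         # (batch, num_filters, seq_len)
        h = h.transpose(1, 2)                # (batch, seq_len, num_filters)
        scores = self.attn(h).squeeze(-1)    # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)  # higher weight = more informative word
        # Weighted sum of positional features gives the sentence representation.
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.classifier(context), weights


# Example: classify a batch of two padded sentences of length 8.
model = AttentiveCNN(vocab_size=5000)
tokens = torch.randint(1, 5000, (2, 8))
logits, attn_weights = model(tokens)
print(logits.shape, attn_weights.shape)  # torch.Size([2, 2]) torch.Size([2, 8])
```

In this sketch, transfer learning would amount to initializing the model (or its embedding and convolutional layers) from weights trained on a source-domain corpus and then fine-tuning on the target domain; the paper's incremental transfer-learning procedure is not reproduced here.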

Updated: 2021-02-24