Combine Convolution with Recurrent Networks for Text Classification
arXiv - CS - Computation and Language. Pub Date: 2020-06-29. arXiv:2006.15795
Shengfei Lyu, Jiaqi Liu

Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are two popular architectures for text classification. Traditional methods for combining the strengths of the two networks rely on chaining them in sequence or concatenating the features extracted from each. In this paper, we propose a novel method that retains the strengths of both networks to a large extent. In the proposed model, a convolutional neural network learns a 2D weight matrix in which each row reflects the importance of each word from a different aspect. Meanwhile, a bi-directional RNN processes each word, and a neural tensor layer fuses the forward and backward hidden states to produce word representations. Finally, the weight matrix and word representations are combined to obtain a 2D-matrix representation of the text. We carry out experiments on a number of text classification datasets, and the results confirm the effectiveness of the proposed method.
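The combination step described above can be sketched in numpy. This is a minimal illustration, not the authors' implementation: the bi-directional hidden states and the CNN aspect scores are random placeholders, and the neural tensor layer follows the standard bilinear-plus-linear form; all dimensions and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k, r = 6, 4, 4, 3  # words, RNN hidden size, tensor slices, aspects

# Placeholders for bi-directional RNN hidden states (forward / backward).
h_fwd = rng.standard_normal((n, d))
h_bwd = rng.standard_normal((n, d))

# Neural tensor layer parameters: fuse the two states into a k-dim word vector.
W = rng.standard_normal((k, d, d))    # bilinear tensor slices
V = rng.standard_normal((k, 2 * d))   # standard linear term
b = rng.standard_normal(k)

def neural_tensor_fuse(hf, hb):
    # hf^T W^[1:k] hb for each slice, plus a linear term over [hf; hb].
    bilinear = np.einsum('kij,i,j->k', W, hf, hb)
    linear = V @ np.concatenate([hf, hb])
    return np.tanh(bilinear + linear + b)

word_repr = np.stack([neural_tensor_fuse(h_fwd[t], h_bwd[t])
                      for t in range(n)])              # shape (n, k)

# CNN-derived 2D weight matrix: each of r rows scores every word from one aspect.
scores = rng.standard_normal((r, n))                   # stand-in for CNN outputs
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # row-wise softmax

# Combine: one row of the text representation per aspect.
text_repr = A @ word_repr                              # shape (r, k)
print(text_repr.shape)
```

Each row of `text_repr` is an importance-weighted average of the fused word vectors under one aspect, which is what gives the text a 2D-matrix representation rather than a single vector.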

Updated: 2020-06-30