Chinese Short Text Classification with Mutual-Attention Convolutional Neural Networks
ACM Transactions on Asian and Low-Resource Language Information Processing ( IF 1.8 ) Pub Date : 2020-07-07 , DOI: 10.1145/3388970
Ming Hao, Bo Xu, Jing-Yi Liang, Bo-Wen Zhang, Xu-Cheng Yin

Methods that combine word-level and character-level features can effectively boost performance on Chinese short text classification. However, many existing works concatenate the two levels of features with little further processing, which loses feature information. In this work, we propose a novel framework called Mutual-Attention Convolutional Neural Networks, which integrates word-level and character-level features without losing much feature information. We first generate two matrices carrying aligned information across the two feature levels by multiplying the word and character features with a trainable matrix. We then stack them into a three-dimensional tensor. Finally, we produce the integrated features with a convolutional neural network. Extensive experiments on six public datasets demonstrate that our framework improves over current methods.
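The pipeline the abstract describes (score word–character pairs through a trainable matrix, align the features, stack them into a tensor for a CNN) can be sketched in a few lines. This is a toy illustration under assumptions: the variable names, sizes, and the exact alignment scheme (softmax-weighted character features per word position) are my reading of the abstract, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, d = 6, 10, 8                  # words, characters, embedding dim (toy sizes)
W = rng.standard_normal((n, d))     # word-level features
C = rng.standard_normal((m, d))     # character-level features
M = rng.standard_normal((d, d))     # trainable interaction matrix

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Attention scores between every word and every character via the
# trainable matrix: S[i, j] relates word i to character j.
S = W @ M @ C.T                     # shape (n, m)

# Character information aligned to word positions.
W_aligned = softmax(S, axis=1) @ C  # shape (n, d)

# Stack the word features and the aligned character features into a
# two-channel tensor, the kind of input a convolutional layer consumes.
tensor = np.stack([W, W_aligned])   # shape (2, n, d)
print(tensor.shape)
```

In a real model, `M` would be a learned parameter and the two-channel tensor would feed a `Conv2d`-style layer that produces the integrated features.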

Updated: 2020-07-07