Multi-Label Text Classification using Attention-based Graph Neural Network
arXiv - CS - Computation and Language. Pub Date: 2020-03-22, DOI: arxiv-2003.11644
Ankit Pal, Muru Selvakumar and Malaikannan Sankarasubbu

In Multi-Label Text Classification (MLTC), one sample can belong to more than one class. It is observed that, in most MLTC tasks, there are dependencies or correlations among the labels, yet existing methods tend to ignore these relationships. In this paper, a graph attention network-based model is proposed to capture the attentive dependency structure among the labels. The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels and to generate classifiers for the task. The generated classifiers are applied to sentence feature vectors obtained from the text feature extraction network (BiLSTM), enabling end-to-end training. Attention allows the system to assign different weights to the neighbor nodes of each label, so that it learns the dependencies among labels implicitly. The proposed model is evaluated on five real-world MLTC datasets and achieves performance similar to or better than the previous state-of-the-art models.
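The pipeline described in the abstract can be summarised in a short sketch: a BiLSTM encodes the text into a sentence feature vector, while a graph-attention layer over the label graph (taking a label feature matrix and a label correlation matrix) produces one classifier vector per label; the dot product of the two gives the multi-label logits. The layer sizes, the mean-pooled sentence representation, and the learned label feature matrix below are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of the approach described above, with assumed dimensions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelGraphAttention(nn.Module):
    """One simplified GAT-style attention layer over the label graph."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared linear map
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scorer

    def forward(self, label_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # label_feats: (L, in_dim) label feature matrix; adj: (L, L) correlation matrix.
        # adj is expected to contain self-loops so every row has at least one edge.
        h = self.W(label_feats)                            # (L, out_dim)
        L = h.size(0)
        # Pairwise attention logits e_ij = a([h_i || h_j]).
        hi = h.unsqueeze(1).expand(L, L, -1)
        hj = h.unsqueeze(0).expand(L, L, -1)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1))).squeeze(-1)  # (L, L)
        # Restrict attention to correlated labels only.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                   # per-label neighbor weights
        return F.elu(alpha @ h)                            # updated label representations


class AttentionGraphMLTC(nn.Module):
    def __init__(self, vocab_size: int, num_labels: int,
                 emb_dim: int = 300, hidden: int = 256, label_dim: int = 300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.label_feats = nn.Parameter(torch.randn(num_labels, label_dim))  # feature matrix
        self.gat = LabelGraphAttention(label_dim, 2 * hidden)

    def forward(self, token_ids: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # token_ids: (B, T); adj: (L, L) label correlation matrix.
        out, _ = self.bilstm(self.embed(token_ids))        # (B, T, 2*hidden)
        sent = out.mean(dim=1)                             # (B, 2*hidden) sentence features
        classifiers = self.gat(self.label_feats, adj)      # (L, 2*hidden) generated classifiers
        return sent @ classifiers.t()                      # (B, L) multi-label logits


if __name__ == "__main__":
    # Tiny usage example with random data and hypothetical sizes.
    model = AttentionGraphMLTC(vocab_size=5000, num_labels=10)
    adj = torch.eye(10)                                    # self-loops only, for illustration
    logits = model(torch.randint(1, 5000, (4, 20)), adj)   # shape (4, 10)
```

In practice the correlation matrix is typically built from label co-occurrence statistics on the training set (binarised with a threshold and with self-loops added), and the logits are trained end-to-end with a binary cross-entropy objective such as `BCEWithLogitsLoss`.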

Updated: 2020-03-27