Hierarchical multi-attention networks for document classification
International Journal of Machine Learning and Cybernetics (IF 3.1) Pub Date: 2021-01-14, DOI: 10.1007/s13042-020-01260-x
Yingren Huang, Jiaojiao Chen, Shaomin Zheng, Yun Xue, Xiaohui Hu

Research on document classification increasingly employs attention-based deep learning algorithms and has achieved impressive results. Owing to the complexity of documents, classical models, as well as single-attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, which describe a document at the word-sentence level and the sentence-document level. Further, different attention strategies are applied at the different levels, enabling accurate assignment of attention weights. Specifically, the soft attention mechanism is applied at the word-sentence level, while CNN-attention is applied at the sentence-document level. Owing to the distinctiveness of the model, the proposed method achieves the highest accuracy among the compared state-of-the-art methods. In addition, visualizations of the attention weights demonstrate the effectiveness of the attention mechanism in distinguishing importance.
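To make the two-level design concrete, the following is a minimal sketch in PyTorch of an architecture matching the abstract's description: soft attention pools word encodings into sentence vectors, and a 1-D convolution scores sentences (CNN-attention) to form the document vector. The encoder types, layer sizes, and the exact CNN-attention formulation are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftAttention(nn.Module):
    """Soft attention pooling: score each timestep, return the weighted sum."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                # h: (batch, steps, dim)
        scores = self.context(torch.tanh(self.proj(h)))  # (batch, steps, 1)
        alpha = F.softmax(scores, dim=1)                 # attention weights
        return (alpha * h).sum(dim=1)                    # (batch, dim)

class HierarchicalMultiAttention(nn.Module):
    """Hypothetical instantiation: GRU encoders at both levels, soft attention
    over words, CNN-attention over sentences."""
    def __init__(self, vocab, emb=128, hid=64, classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.word_rnn = nn.GRU(emb, hid, batch_first=True, bidirectional=True)
        self.word_attn = SoftAttention(2 * hid)          # word-sentence level
        self.sent_rnn = nn.GRU(2 * hid, hid, batch_first=True, bidirectional=True)
        # Sentence-document level: a 1-D convolution over the sequence of
        # sentence representations produces one attention score per sentence.
        self.sent_conv = nn.Conv1d(2 * hid, 1, kernel_size=3, padding=1)
        self.fc = nn.Linear(2 * hid, classes)

    def forward(self, docs):                             # docs: (batch, sents, words)
        b, s, w = docs.shape
        x = self.embed(docs.view(b * s, w))              # (b*s, w, emb)
        h, _ = self.word_rnn(x)                          # (b*s, w, 2*hid)
        sent_vecs = self.word_attn(h).view(b, s, -1)     # (b, s, 2*hid)
        hs, _ = self.sent_rnn(sent_vecs)                 # (b, s, 2*hid)
        scores = self.sent_conv(hs.transpose(1, 2))      # (b, 1, s)
        beta = F.softmax(scores.transpose(1, 2), dim=1)  # (b, s, 1)
        doc_vec = (beta * hs).sum(dim=1)                 # (b, 2*hid)
        return self.fc(doc_vec)                          # class logits
```

A document batch here is a padded tensor of word indices of shape (batch, sentences, words), e.g. `HierarchicalMultiAttention(vocab=30000)(torch.randint(0, 30000, (2, 6, 20)))`; a practical implementation would also mask padded positions before each softmax.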

Updated: 2021-01-14