Few-shot text classification by leveraging bi-directional attention and cross-class knowledge
Science China Information Sciences (IF 8.8). Pub Date: 2021-02-07. DOI: 10.1007/s11432-020-3055-1
Ning Pang , Xiang Zhao , Wei Wang , Weidong Xiao , Deke Guo

Few-shot text classification targets the situation where a model is developed to classify newly incoming query instances after acquiring knowledge from only a few support instances. In this paper, we investigate few-shot text classification under a metric-based meta-learning framework. While the representations of the query and support instances are the key to the classification, existing studies handle them independently in the text encoding stage. To better describe the classification features, we propose to exploit their interaction with an adapted bi-directional attention mechanism. Moreover, distinct from previous approaches that encode different classes individually, we leverage the underlying cross-class knowledge for classification. To this end, we design the learning objective by incorporating a large margin loss, which is expected to shorten the intra-class distances while enlarging the inter-class distances. To validate the design, we conduct extensive experiments on three datasets, and the experimental results demonstrate that our solution outperforms its state-of-the-art competitors. Detailed analyses also reveal that the bi-directional attention and the cross-class knowledge both contribute to the overall performance.
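To make the abstract's main ingredients concrete, below is a minimal, hypothetical sketch (not the authors' released code) of a metric-based few-shot episode in PyTorch: query and support token embeddings attend to each other in both directions, attended support instances are pooled into class prototypes, and a large-margin term is added to the cross-entropy objective. All function names, shapes, and the margin formulation are illustrative assumptions; the paper's actual encoder and loss details are not specified here.

```python
# Hypothetical sketch, not the paper's implementation: metric-based few-shot
# classification with bi-directional query/support attention and a large-margin
# objective, assuming an N-way K-shot episode over pre-computed token embeddings.
import torch
import torch.nn.functional as F


def bidirectional_attention(query, support):
    """Let query and support token representations attend to each other.

    query:   [Lq, d] token embeddings of one query instance
    support: [Ls, d] token embeddings of one support instance
    Returns attention-fused representations of the same shapes.
    """
    scores = query @ support.t()                       # [Lq, Ls] similarity matrix
    q2s = F.softmax(scores, dim=1) @ support           # query attends to support
    s2q = F.softmax(scores.t(), dim=1) @ query         # support attends to query
    return query + q2s, support + s2q                  # residual fusion


def episode_logits(query_tokens, support_tokens, support_labels, n_way):
    """Score a query against class prototypes built from attended support sets."""
    pooled_q, pooled_s = [], []
    for s in support_tokens:                            # one support instance at a time
        q_f, s_f = bidirectional_attention(query_tokens, s)
        pooled_q.append(q_f.mean(dim=0))                # mean-pool tokens to a vector
        pooled_s.append(s_f.mean(dim=0))
    query_vec = torch.stack(pooled_q).mean(dim=0)       # [d] query representation
    support_vecs = torch.stack(pooled_s)                 # [n_way * k_shot, d]

    # Prototypes are built from support vectors that were all attended against the
    # same query, so information is shared across classes rather than per class.
    prototypes = torch.stack(
        [support_vecs[support_labels == c].mean(dim=0) for c in range(n_way)]
    )                                                     # [n_way, d]
    return -torch.cdist(query_vec[None], prototypes)[0]  # negative Euclidean distance


def large_margin_loss(logits, label, margin=1.0):
    """Cross-entropy plus a hinge pushing the true-class score above the rest."""
    ce = F.cross_entropy(logits[None], label[None])
    rivals = logits.clone()
    rivals[label] = float("-inf")
    margin_term = F.relu(margin - (logits[label] - rivals.max()))
    return ce + margin_term


# Toy 3-way 2-shot episode with random "token embeddings" (d = 16).
d, n_way = 16, 3
support = [torch.randn(5, d) for _ in range(6)]
labels = torch.tensor([0, 0, 1, 1, 2, 2])
query = torch.randn(7, d)

logits = episode_logits(query, support, labels, n_way)
loss = large_margin_loss(logits, torch.tensor(1))
print(logits, loss)
```

The margin term here is one simple way to "shorten intra-class distances while enlarging inter-class distances"; the paper's exact formulation may differ.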




Updated: 2021-02-15