A transductive learning method to leverage graph structure for few-shot learning
Pattern Recognition Letters ( IF 3.9 ) Pub Date : 2022-05-14 , DOI: 10.1016/j.patrec.2022.05.013
Yaning Wang , Zijian Liu , Yang Luo , Chunbo Luo

Few-shot learning has attracted extensive research attention for its ability to classify unseen data from only a few samples, potentially addressing the key issue of data scarcity common to many machine learning tasks. This paper proposes a new transductive learning method that integrates information propagation and prototype rectification for few-shot learning, achieving state-of-the-art classification performance on four popular datasets. We use first-order information propagation instead of an infinite-order method to avoid the over-smoothing caused by repeated information aggregation and node updating in graph neural networks. We further show that current transductive few-shot learning models often assume class-balanced datasets, which cannot be guaranteed in practice. We therefore propose to estimate the distribution of task samples and use it to optimize the number of iterations, enhancing the robustness of the model. Extensive experiments validate the proposed model and reveal a confirmation bias that is effectively mitigated by the proposed optimization strategy.
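To make the idea of first-order information propagation concrete, the minimal sketch below builds a k-nearest-neighbour similarity graph over the support and query embeddings of one task, applies a single normalized propagation step, and classifies queries by the nearest class prototype. It only illustrates the general technique; the function propagate_and_classify, the cosine-similarity graph, and the parameters k and alpha are assumptions made for this sketch and are not taken from the paper.

import numpy as np

def propagate_and_classify(support_x, support_y, query_x, k=5, alpha=0.5):
    """One first-order propagation step over a k-NN similarity graph,
    then nearest-prototype classification of the query samples.
    Illustrative sketch only, not the authors' exact model."""
    x = np.vstack([support_x, query_x])                  # all task embeddings
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)    # cosine similarity graph
    sim = xn @ xn.T
    np.fill_diagonal(sim, 0.0)
    # keep only the k strongest edges per node, then symmetrize
    adj = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, :k]
    rows = np.arange(sim.shape[0])[:, None]
    adj[rows, idx] = sim[rows, idx]
    adj = np.maximum(adj, adj.T)
    # symmetric normalization D^{-1/2} A D^{-1/2}
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    s = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # a single (first-order) propagation step instead of iterating to convergence
    z = alpha * x + (1 - alpha) * (s @ x)
    n_support = support_x.shape[0]
    z_support, z_query = z[:n_support], z[n_support:]
    # class prototypes from the propagated support features
    classes = np.unique(support_y)
    protos = np.stack([z_support[support_y == c].mean(axis=0) for c in classes])
    # assign each query to the nearest prototype (Euclidean distance)
    dists = ((z_query[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return classes[dists.argmin(axis=1)]

# toy usage: a 2-way 1-shot task with 4 query points in a 16-d embedding space
rng = np.random.default_rng(0)
support_x = rng.normal(size=(2, 16)); support_y = np.array([0, 1])
query_x = rng.normal(size=(4, 16))
print(propagate_and_classify(support_x, support_y, query_x, k=3))

Because the propagation here is a single matrix product rather than repeated rounds of aggregation, the over-smoothing discussed in the abstract cannot accumulate in this sketch, which is the motivation for preferring a first-order step.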




Updated: 2022-05-14