Dynamic Support Network for Few-Shot Class Incremental Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 23.6), Pub Date: 2022-05-19, DOI: 10.1109/tpami.2022.3175849
Boyu Yang, Mingbao Lin, Yunxiao Zhang, Binghao Liu, Xiaodan Liang, Rongrong Ji, Qixiang Ye

Few-shot class-incremental learning (FSCIL) is challenged by catastrophic forgetting of old classes and over-fitting to new classes. Our analyses reveal that these problems are caused by feature distribution crumbling, which leads to class confusion when few samples are continuously embedded into a fixed feature space. In this study, we propose a Dynamic Support Network (DSN), an adaptively updating network with compressive node expansion that "supports" the feature space. In each training session, DSN tentatively expands network nodes to enlarge the feature representation capacity for incremental classes. It then dynamically compresses the expanded network through node self-activation to pursue a compact feature representation, which alleviates over-fitting. Simultaneously, DSN selectively recalls old class distributions during incremental learning to support the feature distributions and avoid confusion between classes. With compressive node expansion and class distribution recall, DSN provides a systematic solution to catastrophic forgetting and over-fitting. Experiments on the CUB, CIFAR-100, and miniImageNet datasets show that DSN significantly improves upon the baseline approach and achieves new state-of-the-art results.
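The abstract names three mechanisms (tentative node expansion, compression by node self-activation, and old-class distribution recall) without giving implementation details. The sketch below is a minimal, hypothetical PyTorch illustration of those three ideas only: the `ExpandableHead` module, the `recall_loss` helper, and the Gaussian form of the stored class statistics are assumptions made for exposition, not the authors' released code.

```python
# Hypothetical sketch of the three DSN ingredients described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExpandableHead(nn.Module):
    """Feature head whose hidden nodes can be tentatively expanded, then compressed."""

    def __init__(self, in_dim: int, hidden: int, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(in_dim, hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, x):
        h = F.relu(self.fc(x))          # hidden node activations
        return self.classifier(h), h

    @torch.no_grad()
    def expand(self, extra_nodes: int, extra_classes: int):
        """Tentatively add hidden nodes and class outputs for a new session."""
        old_fc, old_cls = self.fc, self.classifier
        hidden = old_fc.out_features + extra_nodes
        self.fc = nn.Linear(old_fc.in_features, hidden)
        self.fc.weight[: old_fc.out_features] = old_fc.weight
        self.fc.bias[: old_fc.out_features] = old_fc.bias
        self.classifier = nn.Linear(hidden, old_cls.out_features + extra_classes)
        self.classifier.weight[: old_cls.out_features, : old_fc.out_features] = old_cls.weight
        self.classifier.bias[: old_cls.out_features] = old_cls.bias

    @torch.no_grad()
    def compress(self, activations: torch.Tensor, keep: int):
        """Keep only the `keep` most strongly self-activated hidden nodes."""
        score = activations.abs().mean(dim=0)           # mean activation per node
        idx = score.topk(keep).indices.sort().values    # surviving node indices
        fc = nn.Linear(self.fc.in_features, keep)
        fc.weight.copy_(self.fc.weight[idx])
        fc.bias.copy_(self.fc.bias[idx])
        cls = nn.Linear(keep, self.classifier.out_features)
        cls.weight.copy_(self.classifier.weight[:, idx])
        cls.bias.copy_(self.classifier.bias)
        self.fc, self.classifier = fc, cls


def recall_loss(head: ExpandableHead, class_stats: dict, n: int = 8) -> torch.Tensor:
    """Replay pseudo-features drawn from stored per-class Gaussians (distribution recall)."""
    feats, labels = [], []
    for c, (mu, std) in class_stats.items():
        feats.append(mu + std * torch.randn(n, mu.numel()))
        labels.append(torch.full((n,), c, dtype=torch.long))
    logits, _ = head(torch.cat(feats))
    return F.cross_entropy(logits, torch.cat(labels))
```

Under these assumptions, a session would call `expand` before fitting the few new-class samples, add `recall_loss` to the classification loss to replay old-class distributions, and call `compress` afterwards using activations collected on the session's data.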
