FoCL: Feature-oriented continual learning for generative models
Pattern Recognition (IF 7.5) Pub Date: 2021-07-01, DOI: 10.1016/j.patcog.2021.108127
Qicheng Lao, Mehrzad Mortazavi, Marzieh Tahaei, Francis Dutil, Thomas Fevens, Mohammad Havaei

In this paper, we propose a general framework for continual learning in generative models: Feature-oriented Continual Learning (FoCL). Unlike previous works that aim to solve the catastrophic forgetting problem by introducing regularization in the parameter space or image space, FoCL imposes regularization in the feature space. We show in our experiments that FoCL adapts faster to distributional changes in sequentially arriving tasks, and achieves state-of-the-art performance for generative models in task incremental learning. We discuss how regularization spaces can be combined for different use-case scenarios to boost performance, e.g., tasks with high variability in the background. Finally, we introduce a forgetfulness measure that fairly evaluates the degree to which a model suffers from forgetting. Interestingly, the analysis of our proposed forgetfulness score also suggests that FoCL tends to mitigate forgetting on future tasks.
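To make the feature-space idea concrete, below is a minimal, hypothetical PyTorch sketch of what such a regularizer could look like: a frozen feature extractor compares the features of current generator samples against a snapshot taken before the new task, and the feature distance is added to the task loss. The extractor choice, the MSE feature-matching penalty, the fixed noise batch, and the weight `lambda_feat` are illustrative assumptions, not the exact FoCL objective from the paper.

```python
# Illustrative sketch only: feature-space regularization for continual
# learning of a generative model. The concrete penalty (MSE feature
# matching against a pre-task snapshot) is an assumption for exposition.
import torch
import torch.nn as nn

class FeatureSpaceRegularizer:
    """Penalizes drift of generated-sample features away from a snapshot
    taken before training on the new task (hypothetical formulation)."""

    def __init__(self, feature_extractor: nn.Module, lambda_feat: float = 1.0):
        self.phi = feature_extractor.eval()   # frozen feature network
        for p in self.phi.parameters():
            p.requires_grad_(False)
        self.lambda_feat = lambda_feat
        self.ref_features = None              # snapshot from previous task

    @torch.no_grad()
    def snapshot(self, generator: nn.Module, z: torch.Tensor) -> None:
        """Store reference features of the generator on a fixed noise batch."""
        self.ref_features = self.phi(generator(z)).detach()

    def penalty(self, generator: nn.Module, z: torch.Tensor) -> torch.Tensor:
        """Feature-matching penalty added to the task loss on the new task."""
        if self.ref_features is None:
            return torch.zeros((), device=z.device)
        cur = self.phi(generator(z))
        return self.lambda_feat * nn.functional.mse_loss(cur, self.ref_features)
```

Under these assumptions, a training step on task t+1 would use something like `loss = task_loss + reg.penalty(generator, z_fixed)`, with `reg.snapshot(generator, z_fixed)` called once at the end of task t; gradients still flow through the frozen extractor to the generator, so only the generator is pulled back toward the old feature statistics.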
Updated: 2021-07-15