Energy-Based Models for Continual Learning
arXiv - CS - Artificial Intelligence. Pub Date: 2020-11-24. DOI: arXiv:2011.12216
Shuang Li, Yilun Du, Gido M. van de Ven, Antonio Torralba, Igor Mordatch

We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via the use of external memory, growing models, or regularization, EBMs have a natural way to support a dynamically-growing number of tasks or classes that causes less interference with previously learned information. We find that EBMs outperform the baseline methods by a large margin on several continual learning benchmarks. We also show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a class of models naturally inclined towards the continual learning regime.
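To make the core idea concrete, here is a minimal, illustrative sketch (not the paper's actual architecture) of why an energy-based classifier accommodates a growing label set: each class owns an embedding, the energy of a pair (x, y) scores their compatibility, and prediction is an argmin over the classes seen so far. Adding a class only appends an embedding, so there is no fixed-size softmax head to resize and retrain. All names (`ToyEBMClassifier`, `add_class`, `energy`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyEBMClassifier:
    """Toy energy-based classifier (illustrative sketch only).

    The energy of (x, y) is the negative dot product between the input x
    and the embedding of class y; lower energy means a better match.
    """

    def __init__(self, dim):
        self.dim = dim
        self.class_embeddings = {}  # label -> embedding vector

    def add_class(self, label):
        # Growing the label set only appends one embedding vector;
        # nothing learned for earlier classes is modified.
        self.class_embeddings[label] = rng.normal(size=self.dim)

    def energy(self, x, label):
        # Compatibility score of the pair (x, label); lower is better.
        return -float(x @ self.class_embeddings[label])

    def predict(self, x):
        # Inference: argmin of the energy over the classes seen so far.
        return min(self.class_embeddings, key=lambda y: self.energy(x, y))

model = ToyEBMClassifier(dim=4)
model.add_class("cat")
model.add_class("dog")
x = rng.normal(size=4)
print(model.predict(x))   # one of the registered labels

model.add_class("bird")   # task set grows; existing embeddings untouched
print(model.predict(x))   # new class competes via the same energy argmin
```

In a real EBM the energy would be produced by a trained network conditioned on a label representation, but the structural point is the same: inference ranges over whatever classes currently exist, so the model's output space grows without architectural surgery.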

Updated: 2020-11-25