Learning without Forgetting
IEEE Transactions on Pattern Analysis and Machine Intelligence ( IF 23.6 ) Pub Date : 2017-11-14 , DOI: 10.1109/tpami.2017.2773081
Zhizhong Li , Derek Hoiem

When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem then arises: we want to add new capabilities to a Convolutional Neural Network (CNN), but the training data for its existing capabilities is unavailable. We propose our Learning without Forgetting method, which uses only new-task data to train the network while preserving its original capabilities. Our method performs favorably compared to commonly used feature-extraction and fine-tuning adaptation techniques, and performs similarly to multitask learning that uses the original-task data we assume unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning when the old and new task datasets are similar, for improved new-task performance.
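The idea of training on new-task data alone while preserving old-task behavior can be sketched as a combined objective: cross-entropy on the new task plus a distillation term that keeps the network's old-task outputs close to those recorded before training began. Below is a minimal, framework-free sketch of that per-example objective; the temperature `T` and weight `lam` are illustrative hyperparameters, and the single-example NumPy formulation is an assumption for clarity, not the paper's implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    e = np.exp(z / T - np.max(z / T))
    return e / e.sum()

def lwf_loss(new_logits, new_label, old_logits, old_logits_recorded,
             T=2.0, lam=1.0):
    """Learning-without-Forgetting-style objective for one example:
    new-task cross-entropy plus a distillation term tying the current
    old-task outputs to those recorded before training on the new task."""
    # Standard cross-entropy on the new task's label.
    p_new = softmax(new_logits)
    ce = -np.log(p_new[new_label])

    # Distillation: soft cross-entropy between the recorded (fixed) and
    # current old-task output distributions, both softened by T.
    q_old = softmax(old_logits_recorded, T)  # soft targets, held fixed
    p_old = softmax(old_logits, T)           # current old-task outputs
    distill = -np.sum(q_old * np.log(p_old))

    return ce + lam * distill
```

Because soft cross-entropy is minimized when the current old-task distribution matches the recorded one, any drift of the old-task outputs during new-task training raises the loss, which is what discourages forgetting.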

Updated: 2018-11-05