Progressive learning: A deep learning framework for continual learning.
Neural Networks (IF 7.8) Pub Date: 2020-05-18, DOI: 10.1016/j.neunet.2020.05.011
Haytham M. Fayek, Lawrence Cavedon, Hong Ren Wu

Continual learning is the ability of a learning system to solve new tasks by utilizing previously acquired knowledge from learning and performing prior tasks without having significant adverse effects on the acquired prior knowledge. Continual learning is key to advancing machine learning and artificial intelligence. Progressive learning is a deep learning framework for continual learning that comprises three procedures: curriculum, progression, and pruning. The curriculum procedure is used to actively select a task to learn from a set of candidate tasks. The progression procedure is used to grow the capacity of the model by adding new parameters that leverage parameters learned in prior tasks, while learning from data available for the new task at hand, without being susceptible to catastrophic forgetting. The pruning procedure is used to counteract the growth in the number of parameters as further tasks are learned, as well as to mitigate negative forward transfer, in which prior knowledge unrelated to the task at hand may interfere and worsen performance. Progressive learning is evaluated on a number of supervised classification tasks in the image recognition and speech recognition domains to demonstrate its advantages compared with baseline methods. It is shown that, when tasks are related, progressive learning leads to faster learning that converges to better generalization performance using a smaller number of dedicated parameters.
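The progression procedure described above can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal, hedged example assuming a PyTorch-style model in which each new task adds a fresh column of parameters with lateral connections to the frozen columns of prior tasks (in the spirit of progressive networks). All names (ProgressiveNet, add_task, etc.) are illustrative, and the curriculum and pruning procedures are omitted.

```python
# Minimal sketch (illustrative, not the paper's code) of the "progression" step:
# each new task gets a new column of parameters; prior columns are frozen, and
# lateral connections let the new column reuse previously learned features.
import torch
import torch.nn as nn

class ProgressiveNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.in_dim, self.hidden_dim = in_dim, hidden_dim
        self.columns = nn.ModuleList()   # one hidden layer (column) per task
        self.laterals = nn.ModuleList()  # connections from prior columns
        self.heads = nn.ModuleList()     # one classifier head per task
        self.add_task(num_classes)

    def add_task(self, num_classes):
        """Progression: freeze existing parameters, then grow capacity."""
        for p in self.parameters():
            p.requires_grad = False  # prior knowledge is left untouched
        self.columns.append(nn.Linear(self.in_dim, self.hidden_dim))
        # lateral connections from each previously learned column
        self.laterals.append(nn.ModuleList(
            [nn.Linear(self.hidden_dim, self.hidden_dim)
             for _ in range(len(self.columns) - 1)]
        ))
        self.heads.append(nn.Linear(self.hidden_dim, num_classes))

    def forward(self, x, task_id):
        hs = []
        for t in range(task_id + 1):
            h = self.columns[t](x)
            # add lateral input from earlier columns (forward transfer)
            for j, lat in enumerate(self.laterals[t]):
                h = h + lat(hs[j])
            h = torch.relu(h)
            hs.append(h)
        return self.heads[task_id](hs[task_id])

# Usage: learn task 0, then grow the model for task 1 and
# optimize only the newly added parameters.
net = ProgressiveNet(in_dim=784, hidden_dim=128, num_classes=10)
# ... train on task 0 ...
net.add_task(num_classes=5)
opt = torch.optim.Adam([p for p in net.parameters() if p.requires_grad], lr=1e-3)
```

Freezing prior columns prevents catastrophic forgetting, while the pruning procedure (not shown) would subsequently remove redundant new parameters to limit growth and curb negative forward transfer.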




Updated: 2020-05-18