Model-agnostic multi-stage loss optimization meta learning
International Journal of Machine Learning and Cybernetics (IF 3.1) Pub Date: 2021-04-26, DOI: 10.1007/s13042-021-01316-6
Xiao Yao , Jianlong Zhu , Guanying Huo , Ning Xu , Xiaofeng Liu , Ce Zhang

Model-Agnostic Meta-Learning (MAML) has become the most representative meta-learning algorithm for few-shot learning. This paper discusses the MAML framework, focusing on the key problem of solving few-shot learning through meta-learning. However, MAML is sensitive to the choice of base model in the inner loop, and training instability arises during optimization, which increases the difficulty of training and validation and degrades model performance. To address these problems, we propose a multi-stage loss optimization meta-learning algorithm. By introducing a learning mechanism for the inner and outer loops, it improves training stability and accelerates convergence, thereby enhancing the generalization ability of MAML.
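The mechanism the abstract describes can be illustrated with a minimal first-order sketch: instead of computing the outer-loop (query-set) loss only after the final inner-loop step, a multi-stage scheme accumulates a weighted query loss after every inner step, which is one way to stabilize meta-training. The linear model, the annealed per-step weights, and the function names below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    # Squared-error loss of a linear base model and its gradient w.r.t. w.
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def multi_stage_meta_step(w, tasks, inner_lr=0.1, outer_lr=0.05, inner_steps=3):
    # Assumed per-step loss weights: later inner-loop steps contribute more,
    # so early in adaptation the outer loss is spread across all stages.
    v = np.arange(1, inner_steps + 1, dtype=float)
    weights = v / v.sum()
    meta_grad = np.zeros_like(w)
    for X_tr, y_tr, X_val, y_val in tasks:
        w_task = w.copy()
        for k in range(inner_steps):
            # Inner loop: adapt to the task's support set.
            _, g = loss_and_grad(w_task, X_tr, y_tr)
            w_task = w_task - inner_lr * g
            # Outer loop: accumulate the weighted query-set gradient after
            # EVERY inner step (first-order approximation), not only the last.
            _, g_val = loss_and_grad(w_task, X_val, y_val)
            meta_grad += weights[k] * g_val
    return w - outer_lr * meta_grad / len(tasks)
```

A usage pattern would sample a batch of tasks (support/query splits) per meta-iteration and call `multi_stage_meta_step` repeatedly; with only the final-step loss (standard MAML) the early inner steps receive no direct training signal, which is the instability the multi-stage weighting targets.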


Updated: 2021-04-27