Meta Learning Backpropagation And Improving It
arXiv - CS - Neural and Evolutionary Computing, Pub Date: 2020-12-29, DOI: arxiv-2012.14905
Louis Kirsch, Jürgen Schmidhuber

Many concepts have been proposed for meta learning with neural networks (NNs), e.g., NNs that learn to control fast weights, hyper networks, learned learning rules, and meta recurrent neural networks (Meta RNNs). Our Variable Shared Meta Learning (VS-ML) unifies the above and demonstrates that simple weight-sharing and sparsity in an NN are sufficient to express powerful learning algorithms. A simple implementation of VS-ML, the Variable Shared Meta RNN, can implement the backpropagation learning algorithm solely by running an RNN in forward mode. It can even meta-learn new learning algorithms that improve upon backpropagation, generalizing to different datasets without explicit gradient calculation.
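To make the core idea concrete, here is a minimal sketch (not the authors' code) of what "weight-sharing and sparsity expressing a learning algorithm" can look like: every weight of a feedforward layer is replaced by a small state vector that is updated by one shared RNN cell. The state sizes, message-passing scheme, and all names below are illustrative assumptions, not the paper's exact formulation; meta-learning would optimize the shared cell parameters, which here stay random.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 4, 3   # layer being "emulated"
STATE = 8            # per-connection RNN state size (assumption)

# Shared RNN-cell parameters: ONE set, reused at every (i, j) position.
# Meta-learning would tune these so that the cell's dynamics implement
# a learning algorithm; they are left random in this sketch.
W_h = rng.normal(scale=0.1, size=(STATE, STATE))
W_x = rng.normal(scale=0.1, size=(STATE, 2))  # inputs: (forward msg, feedback msg)
w_out = rng.normal(scale=0.1, size=STATE)     # read out a forward message

# One state vector per connection; this replaces the scalar weight w_ij.
state = np.zeros((N_IN, N_OUT, STATE))

def forward(x, err):
    """One tick: each connection's shared cell sees the activation of its
    input unit and a feedback signal for its output unit, updates its
    state, and emits a message. Any "weight change" is just state change,
    computed in forward mode with no explicit gradient calculation."""
    global state
    msgs = np.zeros((N_IN, N_OUT))
    for i in range(N_IN):
        for j in range(N_OUT):
            inp = np.array([x[i], err[j]])
            state[i, j] = np.tanh(W_h @ state[i, j] + W_x @ inp)
            msgs[i, j] = w_out @ state[i, j]
    return msgs.sum(axis=0)  # output unit j sums its incoming messages

x = rng.normal(size=N_IN)
err = np.zeros(N_OUT)
for _ in range(3):              # run purely in forward mode
    y = forward(x, err)
    err = y - np.ones(N_OUT)    # toy feedback signal
print(y)
```

Note the sparsity: each cell only sees local signals at its own connection, and sharing one cell everywhere keeps the meta-learned parameter count tiny and independent of the emulated layer's size.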

Updated: 2021-01-01