Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks
IEEE Transactions on Cybernetics (IF 11.8). Pub Date: 2021-09-08. DOI: 10.1109/tcyb.2021.3107415
Hideaki Iiduka

This article deals with nonconvex stochastic optimization problems in deep learning. Appropriate learning rates, based on theory, are provided for adaptive-learning-rate optimization algorithms (e.g., Adam and AMSGrad) to approximate the stationary points of such problems. These rates are shown to allow faster convergence than previously reported for these algorithms. Specifically, the algorithms are examined in numerical experiments on text and image classification, where they are shown to perform better with constant learning rates than with diminishing learning rates.
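To make the comparison in the abstract concrete, the following is a minimal sketch (not the paper's code) of the two learning-rate regimes it contrasts: Adam with a constant learning rate versus Adam with a diminishing learning rate of the form alpha_k = alpha_0 / sqrt(k + 1). The model, the synthetic data, and the base rate 1e-3 are illustrative assumptions, not values from the paper; the paper's contribution is the theoretical derivation of appropriate constant rates, which this snippet does not reproduce.

# Illustrative sketch in PyTorch: constant vs. diminishing learning rate for Adam.
# All model, data, and hyperparameter choices below are assumptions for demonstration.
import torch
import torch.nn as nn

model_const = nn.Linear(10, 2)   # trained with a constant learning rate
model_dimin = nn.Linear(10, 2)   # trained with a diminishing learning rate

# Constant regime: alpha_k = 1e-3 for every step k.
opt_const = torch.optim.Adam(model_const.parameters(), lr=1e-3)

# Diminishing regime: alpha_k = 1e-3 / sqrt(k + 1), via a LambdaLR scheduler.
opt_dimin = torch.optim.Adam(model_dimin.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.LambdaLR(
    opt_dimin, lr_lambda=lambda k: 1.0 / (k + 1) ** 0.5
)

loss_fn = nn.CrossEntropyLoss()
for step in range(100):
    x = torch.randn(32, 10)             # synthetic mini-batch of features
    y = torch.randint(0, 2, (32,))      # synthetic class labels

    for opt, model in ((opt_const, model_const), (opt_dimin, model_dimin)):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    scheduler.step()                    # shrinks only the diminishing-rate optimizer

In practice, the constant-rate optimizer applies the same step size throughout training, while the scheduled optimizer's step size decays like 1/sqrt(k); the paper's experiments on text and image classification favor the former when the constant rate is chosen according to its theory.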

Updated: 2021-09-08