Stochastic Gradient Method with Barzilai–Borwein Step for Unconstrained Nonlinear Optimization
Journal of Computer and Systems Sciences International (IF 0.6), Pub Date: 2021-02-19, DOI: 10.1134/s106423072101010x
L. Wang, H. Wu, I. A. Matveev

Abstract

Stochastic gradient algorithms for nonlinear optimization are of considerable interest, especially in high-dimensional problems, where the choice of the step size is of key importance for the convergence rate. In this paper, we propose two new stochastic gradient algorithms that use an improved Barzilai–Borwein step size formula. Convergence analysis shows that these algorithms achieve linear convergence in probability for strongly convex objective functions. Our computational experiments confirm that the proposed algorithms compare favorably with two-point gradient algorithms and well-known stochastic gradient methods.
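The paper's improved step size formula is not reproduced in this abstract, so the following is only a minimal sketch of the general idea: stochastic gradient descent where the step size is recomputed once per epoch from the classical BB1 formula, alpha = s'terms/s'y with s the iterate difference and y the gradient difference (here scaled by the inner loop length, in the spirit of SGD-BB-type methods). All names and parameters below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def sgd_bb(grad_full, grad_stoch, x0, n_samples, epochs=20, m=None,
           alpha0=0.05, rng=None):
    """SGD with a Barzilai-Borwein (BB1) step recomputed per epoch.

    A generic sketch (assumed structure, not the paper's exact method):
    at each epoch start the full gradient is evaluated and a BB step
    alpha = ||s||^2 / (m * |s^T y|) is formed, then m stochastic
    gradient steps are taken with that fixed step size.
    """
    rng = np.random.default_rng(rng)
    m = m or n_samples                 # inner iterations per epoch
    x = x0.astype(float).copy()
    alpha = alpha0                     # fallback step for the first epoch
    x_prev = g_prev = None
    for _ in range(epochs):
        g = grad_full(x)               # full gradient at the epoch start
        if x_prev is not None:
            s, y = x - x_prev, g - g_prev
            denom = m * abs(s @ y)     # scale by inner loop length
            if denom > 1e-12:
                alpha = (s @ s) / denom   # BB1 step
        x_prev, g_prev = x.copy(), g
        for _ in range(m):
            i = rng.integers(n_samples)    # sample one component gradient
            x -= alpha * grad_stoch(x, i)
    return x
```

On a consistent least-squares problem (strongly convex, zero residual at the solution) this sketch exhibits the fast per-epoch contraction that motivates BB-type step sizes, without any hand-tuned step schedule.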




Updated: 2021-02-21