Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
SIAM Journal on Optimization (IF 2.6). Pub Date: 2021-03-01. DOI: 10.1137/20m1320651
Anton Rodomanov , Yurii Nesterov

SIAM Journal on Optimization, Volume 31, Issue 1, Page 785-811, January 2021.
In this paper, we study greedy variants of quasi-Newton methods. They are based on the updating formulas from a certain subclass of the Broyden family. In particular, this subclass includes the well-known DFP, BFGS, and SR1 updates. However, in contrast to the classical quasi-Newton methods, which use the difference of successive iterates for updating the Hessian approximations, our methods apply basis vectors, greedily selected so as to maximize a certain measure of progress. For greedy quasi-Newton methods, we establish an explicit nonasymptotic bound on their rate of local superlinear convergence, as applied to minimizing strongly convex and strongly self-concordant functions (and, in particular, to strongly convex functions with Lipschitz continuous Hessian). The established superlinear convergence rate contains a contraction factor, which depends on the square of the iteration counter. We also show that greedy quasi-Newton methods produce Hessian approximations whose deviation from the exact Hessians linearly converges to zero.
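The abstract's core idea can be illustrated numerically. The sketch below applies a BFGS-type update not along the difference of successive iterates but along a greedily selected coordinate basis vector, and checks that the Hessian approximation's deviation from the exact Hessian shrinks. This is a minimal illustration on a quadratic with constant Hessian; the greedy selection rule (maximizing the ratio of diagonal entries of the approximation and the true Hessian) and the function names are assumptions for illustration, not the paper's exact algorithm or notation.

```python
import numpy as np

def greedy_bfgs_update(G, A):
    """One BFGS-type update of the approximation G toward the true Hessian A.

    Greedy direction (assumed selection rule, sketched from the abstract):
    the coordinate basis vector e_i maximizing e_i^T G e_i / e_i^T A e_i,
    i.e. the largest ratio of diagonal entries. After the update, G maps
    that basis vector exactly as A does.
    """
    i = np.argmax(np.diag(G) / np.diag(A))
    u = np.zeros(G.shape[0])
    u[i] = 1.0
    Gu, Au = G @ u, A @ u
    # Classical BFGS formula applied with direction u.
    return G - np.outer(Gu, Gu) / (u @ Gu) + np.outer(Au, Au) / (u @ Au)

# Quadratic test problem f(x) = 0.5 x^T A x: the Hessian is the constant
# strongly positive-definite matrix A, so the update can be iterated directly.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)            # strongly convex Hessian
L = np.linalg.eigvalsh(A)[-1]      # largest eigenvalue
G = L * np.eye(5)                  # initialization G0 = L*I, so G0 >= A

errors = []
for _ in range(50):
    G = greedy_bfgs_update(G, A)
    errors.append(np.linalg.norm(G - A))
print(errors[0], errors[-1])       # the deviation from A shrinks
```

On this quadratic the deviation of `G` from `A` decays toward zero across iterations, consistent with the paper's claim that greedy quasi-Newton methods produce Hessian approximations converging linearly to the exact Hessian.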


Updated: 2021-03-21