Limited-memory BFGS with displacement aggregation
Mathematical Programming (IF 2.2), Pub Date: 2021-01-29, DOI: 10.1007/s10107-021-01621-6
Albert S. Berahas, Frank E. Curtis, Baoyu Zhou

A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored and employed, such as a local superlinear rate of convergence under assumptions that are common for attaining such guarantees. To the best of our knowledge, this is the first work in which a local superlinear convergence rate guarantee is offered by a quasi-Newton scheme that neither stores all curvature pairs throughout the entire run of the optimization algorithm nor stores an explicit (inverse) Hessian approximation. Numerical results are presented to show that displacement aggregation within an adaptive L-BFGS scheme can lead to better performance than standard L-BFGS.
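To make the role of the stored curvature pairs concrete, below is a minimal Python/NumPy sketch of the standard L-BFGS two-loop recursion, which applies the inverse Hessian approximation to a gradient using only the stored pairs (s_i, y_i). This sketch shows what the limited-memory method computes from its history; the displacement aggregation strategy itself, which modifies the stored pairs so that this computation matches the full-memory BFGS approximation, is defined in the paper and not reproduced here. The function name and the initial scaling choice are illustrative, not taken from the paper.

import numpy as np

def lbfgs_two_loop(grad, s_list, y_list):
    # Computes r ~= H_k @ grad via the standard L-BFGS two-loop recursion,
    # where the curvature pairs are s_i = x_{i+1} - x_i and
    # y_i = grad_{i+1} - grad_i, stored oldest to newest.
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)
    # Initial approximation H_0 = gamma * I; a common scaling choice.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest (alphas was filled newest-first).
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r  # -r is the quasi-Newton search direction

In a full-memory BFGS method these lists would grow without bound; standard L-BFGS discards the oldest pair once a memory limit is reached, whereas the aggregation strategy proposed in the paper combines displacements so that the limited history still reproduces the full-memory approximation.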


