Explicit stabilised gradient descent for faster strongly convex optimisation
BIT Numerical Mathematics (IF 1.5), Pub Date: 2020-07-04, DOI: 10.1007/s10543-020-00819-y
Armin Eftekhari, Bart Vandereycken, Gilles Vilmart, Konstantinos C. Zygalakis

This paper introduces the Runge-Kutta Chebyshev descent method (RKCD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, this paper proves that RKCD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that the suboptimality of RKCD diminishes as the condition number of the quadratic function worsens. It is established that this optimal rate is also obtained for a partitioned variant of RKCD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that RKCD outperforms Nesterov's accelerated gradient descent.
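As a rough illustration of the idea behind such explicit stabilised integrators (not the authors' exact RKCD algorithm or parameter choices), the sketch below applies one step of the classical first-order Chebyshev method to the gradient flow x'(t) = -∇F(x(t)). The function name chebyshev_stabilised_step, the damping value eta = 0.05, and the test quadratic are hypothetical choices made for this example.

```python
import numpy as np


def chebyshev_stabilised_step(grad, x0, h, s, eta=0.05):
    """One explicit stabilised (Chebyshev-type) step of length h with s internal
    stages applied to the gradient flow x'(t) = -grad(x(t)).

    Illustrative sketch only: coefficients follow the classical first-order
    Chebyshev method with damping eta, not necessarily the parameter choices
    of the paper's RKCD scheme.
    """
    assert s >= 1
    w0 = 1.0 + eta / s**2
    # Chebyshev polynomials of the first (T) and second (U) kind at w0, via
    # T_j = 2*w0*T_{j-1} - T_{j-2}; the derivative satisfies T_s'(w0) = s*U_{s-1}(w0).
    T = np.ones(s + 1)
    U = np.ones(s + 1)
    T[1], U[1] = w0, 2.0 * w0
    for j in range(2, s + 1):
        T[j] = 2.0 * w0 * T[j - 1] - T[j - 2]
        U[j] = 2.0 * w0 * U[j - 1] - U[j - 2]
    w1 = T[s] / (s * U[s - 1])  # = T_s(w0) / T_s'(w0)

    # Three-term stage recurrence of the Chebyshev method, with f(x) = -grad(x).
    k_prev2 = x0
    k_prev1 = x0 - h * (w1 / w0) * grad(x0)
    for j in range(2, s + 1):
        mu = 2.0 * w1 * T[j - 1] / T[j]
        nu = 2.0 * w0 * T[j - 1] / T[j]
        kappa = -T[j - 2] / T[j]
        k = -mu * h * grad(k_prev1) + nu * k_prev1 + kappa * k_prev2
        k_prev2, k_prev1 = k_prev1, k
    return k_prev1


# Hypothetical usage on an ill-conditioned quadratic F(x) = 0.5 * x @ A @ x.
# For small damping, stability requires roughly h * L <= 2 * s**2, where L is
# the largest eigenvalue of A.
L = 1000.0
A = np.diag(np.linspace(1.0, L, 50))
grad = lambda x: A @ x
x = np.ones(50)
h, s = 0.18, 10  # h * L = 180 fits inside the stability interval
for _ in range(200):
    x = chebyshev_stabilised_step(grad, x, h, s)
print(np.linalg.norm(x))  # distance to the minimiser (the origin) decreases
```

The key property exploited here is that the stability interval of an s-stage Chebyshev method grows like s², so the number of gradient evaluations needed to cover stiff (ill-conditioned) directions grows only like the square root of h·L, in contrast to a plain explicit gradient step.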

Updated: 2020-07-04