Multiscale Analysis of Accelerated Gradient Methods
SIAM Journal on Optimization (IF 3.1) Pub Date: 2020-08-25, DOI: 10.1137/18m1203997
Mohammad Farazmand

SIAM Journal on Optimization, Volume 30, Issue 3, Page 2337-2354, January 2020.
Accelerated gradient descent iterations are widely used in optimization. It is known that, in the continuous-time limit, these iterations converge to a second-order differential equation, which we refer to as the accelerated gradient flow. Using geometric singular perturbation theory, we show that, under certain conditions, the accelerated gradient flow possesses an attracting invariant slow manifold to which the trajectories of the flow converge asymptotically. We obtain a general explicit expression, in the form of a functional series expansion, that approximates the slow manifold to any desired order of accuracy. To leading order, the accelerated gradient flow reduced to this slow manifold coincides with the usual gradient descent. We illustrate the implications of our results with three examples.
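The second-order ODE limit mentioned above can be illustrated numerically. A widely cited continuous-time limit of Nesterov's accelerated method is the Su–Boyd–Candès equation $\ddot{x} + (3/t)\dot{x} + \nabla f(x) = 0$; the paper's singularly perturbed formulation may differ in parametrization, so the sketch below is only a hedged illustration of the general idea, using a simple quadratic objective and an explicit time-stepping scheme:

```python
# Sketch (assumption: Su-Boyd-Candes form of the accelerated gradient flow,
#   x'' + (3/t) x' + grad f(x) = 0,
# which is one standard continuous-time limit of accelerated gradient descent;
# the paper's own formulation may be parametrized differently).
# We integrate it for f(x) = x^2 / 2 with a semi-implicit Euler scheme and
# check that the trajectory settles toward the minimizer x* = 0.

def grad_f(x):
    # Gradient of the illustrative objective f(x) = x^2 / 2.
    return x

def accelerated_flow(x0, t0=1.0, t_end=50.0, dt=1e-3):
    # Integrate the second-order ODE as a first-order system in (x, v).
    # We start at t0 > 0 to avoid the singular 3/t damping term at t = 0.
    x, v, t = x0, 0.0, t0
    while t < t_end:
        a = -(3.0 / t) * v - grad_f(x)  # acceleration from the ODE
        v += dt * a                     # update velocity first (semi-implicit)
        x += dt * v
        t += dt
    return x

def gradient_flow(x0, t_end=50.0, dt=1e-3):
    # The usual gradient flow x' = -grad f(x), which the abstract says the
    # leading-order reduction to the slow manifold recovers.
    x = x0
    for _ in range(int(t_end / dt)):
        x -= dt * grad_f(x)
    return x

print(abs(accelerated_flow(1.0)))  # small: the damped oscillation decays
print(abs(gradient_flow(1.0)))     # small: monotone decay to x* = 0
```

Both trajectories approach the same minimizer; the accelerated flow does so through damped oscillations (its solutions for this quadratic are Bessel-type, decaying like $t^{-3/2}$), while the gradient flow decays monotonically.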

