On the Convergence Analysis of the Optimized Gradient Method
Journal of Optimization Theory and Applications (IF 1.6), Pub Date: 2016-10-05, DOI: 10.1007/s10957-016-1018-7
Donghwan Kim, Jeffrey A. Fessler

This paper considers the problem of unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that it has a worst-case convergence bound for the cost function decrease that is half that of Nesterov’s fast gradient method, yet has a similarly efficient practical implementation. Drori showed recently that the optimized gradient method has optimal complexity for the cost function decrease over the general class of first-order methods. This optimality makes it important to fully study the convergence properties of the optimized gradient method. The previous worst-case convergence bound for the optimized gradient method was derived for only the last iterate of a secondary sequence. This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method. We then discuss additional convergence properties of the optimized gradient method, including the interesting fact that the optimized gradient method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function. These results help complete the theory of an optimal first-order method for smooth convex minimization.
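To make the abstract concrete, the following is a rough sketch of the optimized gradient method (OGM) iteration as described in the authors' earlier work, written in Python for a 1-D objective. The update rules (a gradient step producing the secondary sequence, a theta recursion, and a momentum-plus-correction step producing the primary sequence, with a modified final theta) are reproduced from memory of Kim and Fessler's OGM; the function name `ogm`, the example objective, and the variable naming are illustrative assumptions, not the paper's exact notation.

```python
import math

def ogm(grad, L, x0, N):
    """Sketch of the optimized gradient method (OGM) iteration.

    grad : gradient of the L-smooth convex objective f
    L    : known Lipschitz constant of the gradient
    x0   : starting point (a scalar here, for simplicity)
    N    : total number of iterations (the last step factor depends on N)

    Returns the final primary iterate x_N and secondary iterate y_N.
    """
    x, y = x0, x0
    theta = 1.0
    for i in range(N):
        # Secondary sequence: a plain gradient step with step size 1/L.
        y_new = x - grad(x) / L
        # Theta recursion; the final iteration uses a larger factor (8 vs 4).
        if i < N - 1:
            theta_new = (1.0 + math.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
        else:
            theta_new = (1.0 + math.sqrt(1.0 + 8.0 * theta ** 2)) / 2.0
        # Primary sequence: Nesterov-style momentum plus an extra OGM
        # correction term (the second parenthesized difference below).
        x = (y_new
             + (theta - 1.0) / theta_new * (y_new - y)
             + theta / theta_new * (y_new - x))
        y, theta = y_new, theta_new
    return x, y

# Usage: minimize f(x) = 0.05 * x^2 (gradient 0.1*x; L = 1 is a valid,
# deliberately loose Lipschitz constant), starting from x0 = 10.
xN, yN = ogm(lambda x: 0.1 * x, L=1.0, x0=10.0, N=150)
```

Note that the extra correction term `theta / theta_new * (y_new - x)` is what distinguishes OGM from Nesterov's fast gradient method and yields the factor-of-two improvement in the worst-case bound mentioned above.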
