Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions
Journal of Optimization Theory and Applications (IF 1.6), Pub Date: 2020-10-30, DOI: 10.1007/s10957-020-01770-2
Donghwan Kim, Jeffrey A. Fessler

This paper optimizes the step coefficients of first-order methods for smooth convex minimization in terms of the worst-case convergence bound (i.e., efficiency) of the decrease in the gradient norm. This work is based on the performance estimation problem approach. The worst-case gradient bound of the resulting method is optimal up to a constant for large-dimensional smooth convex minimization problems, under the initial bounded condition on the cost function value. This paper then illustrates that the proposed method has a computationally efficient form that is similar to the optimized gradient method.
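The abstract does not reproduce the method itself, but its structure can be illustrated. Below is a minimal Python sketch of an OGM-G-style iteration of the kind described: a plain gradient step followed by a momentum-like correction using step coefficients precomputed by a backward recursion. The coefficient recursion and update weights here are an assumption based on the published form of OGM-G, not something stated in this abstract; consult the paper (DOI above) for the exact method.

```python
import numpy as np

def ogm_g(grad, x0, L, N):
    """OGM-G-style sketch: N fixed iterations aimed at decreasing ||grad f(x)||.

    grad : callable returning the gradient of an L-smooth convex f
    x0   : starting point (numpy array)
    L    : Lipschitz constant of the gradient of f
    N    : number of iterations, fixed in advance
    """
    # Backward recursion for the step coefficients (assumed form).
    theta = np.empty(N + 1)
    theta[N] = 1.0
    for i in range(N - 1, 0, -1):
        theta[i] = (1.0 + np.sqrt(1.0 + 4.0 * theta[i + 1] ** 2)) / 2.0
    theta[0] = (1.0 + np.sqrt(1.0 + 8.0 * theta[1] ** 2)) / 2.0

    x, y = x0.astype(float), x0.astype(float)
    for i in range(N):
        y_next = x - grad(x) / L  # plain gradient step
        # Momentum-like weights built from the precomputed coefficients.
        beta = ((theta[i] - 1.0) * (2.0 * theta[i + 1] - 1.0)) / (
            theta[i] * (2.0 * theta[i] - 1.0)
        )
        gamma = (2.0 * theta[i + 1] - 1.0) / (2.0 * theta[i] - 1.0)
        x = y_next + beta * (y_next - y) + gamma * (y_next - x)
        y = y_next
    return x

# Usage on a toy quadratic f(x) = 0.5 x^T A x, with grad f(x) = A x and L = max eig(A).
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    xN = ogm_g(lambda x: A @ x, np.ones(3), L=100.0, N=50)
    print(np.linalg.norm(A @ xN))  # gradient norm after N steps
```

Like the optimized gradient method, such an iteration needs only one gradient evaluation and O(d) extra arithmetic per step. For context, plain gradient descent guarantees only min_i ||grad f(x_i)||^2 = O(L(f(x_0) - f_*)/N) under the same initial condition on the cost function value, whereas the method analyzed in the paper achieves the O(1/N^2) rate in the squared gradient norm.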
