Contracting Proximal Methods for Smooth Convex Optimization
SIAM Journal on Optimization (IF 2.6) Pub Date: 2020-11-10, DOI: 10.1137/19m130769x
Nikita Doikov , Yurii Nesterov

SIAM Journal on Optimization, Volume 30, Issue 4, Page 3146-3169, January 2020.
In this paper, we propose new accelerated methods for smooth convex optimization, called contracting proximal methods. At every step of these methods, we minimize a contracted version of the objective function augmented by a regularization term in the form of a Bregman divergence. We provide a global convergence analysis for a general scheme that admits inexactness in solving the auxiliary subproblem. When high-order tensor methods are used for this purpose, we demonstrate an acceleration effect for both convex and uniformly convex composite objective functions. Thus, our construction explains acceleration for methods of any order, starting from one. The increase in the number of oracle calls due to computing the contracted proximal steps is bounded by a logarithmic factor in the worst-case complexity bound.
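To make the scheme concrete, the loop below is a minimal illustrative sketch of a contracting proximal iteration, not the paper's exact algorithm: it uses the Euclidean Bregman divergence, a simple coefficient choice a_{k+1} = k + 1, and plain gradient descent as the inexact inner solver, applied to a 1-D smooth convex test function. All of these choices are assumptions made for demonstration.

```python
# Illustrative sketch (not the paper's exact scheme): a contracting proximal
# loop with Euclidean Bregman divergence on f(x) = (x - 3)^2. The coefficient
# rule a_{k+1} = k + 1 and the inner solver settings are simple demo choices.

def contracting_proximal(f_grad, x0, steps=100, inner_iters=200, inner_lr=0.05):
    x, v, A = x0, x0, 0.0
    for k in range(steps):
        a = k + 1.0                # a_{k+1}; A_{k+1} = A_k + a_{k+1} grows like k^2/2
        A_next = A + a
        # Inner subproblem: min_v  A_{k+1} f((a*v + A*x)/A_{k+1}) + (v - v_k)^2 / 2,
        # solved inexactly by gradient descent (the analysis permits inexact solves).
        v_new = v
        for _ in range(inner_iters):
            y = (a * v_new + A * x) / A_next   # contracted point fed to f
            g = a * f_grad(y) + (v_new - v)    # gradient of the subproblem in v
            v_new -= inner_lr * g
        v = v_new
        x = (a * v + A * x) / A_next           # convex-combination update
        A = A_next
    return x

# f(x) = (x - 3)^2 has gradient 2*(x - 3) and minimizer x* = 3.
x_star = contracting_proximal(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(x_star)
```

Note that each outer step only queries the gradient of f at contracted points, which is why the oracle-call overhead relative to the inner solver is what the complexity bound controls.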


Updated: 2020-11-13