Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Optimization Methods & Software (IF 1.4) Pub Date: 2020-03-02, DOI: 10.1080/10556788.2020.1731747
Yurii Nesterov, Alexander Gasnikov, Sergey Guminov, Pavel Dvurechensky

In this paper, a new variant of accelerated gradient descent is proposed. The proposed method does not require any information about the objective function, uses exact line search for practical acceleration of convergence, converges according to the well-known lower bounds for both convex and non-convex objective functions, possesses primal–dual properties, and can be applied in the non-Euclidean set-up. To the best of our knowledge, this is the first method possessing all of the above properties at the same time. We also present a universal version of the method, which is applicable to non-smooth problems. We demonstrate how, in practice, one can efficiently combine line search and primal–duality by considering a convex optimization problem with a simple structure (for example, a linearly constrained one).
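To make the idea concrete, the following is a minimal, hedged sketch (in the Euclidean set-up only) of an accelerated gradient scheme in which the usual fixed step sizes are replaced by one-dimensional exact line searches, in the spirit of the small-dimensional relaxation oracle described in the abstract. It is an illustration under simplifying assumptions, not the authors' exact algorithm: the momentum coefficient `a` is a crude surrogate for the precise choice derived in the paper, and the line-search bounds are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def accelerated_gd_line_search(f, grad, x0, n_iters=100):
    """Simplified Euclidean sketch of an accelerated gradient method that
    uses exact 1-D line searches instead of fixed step sizes.  Hypothetical
    illustration only; not the paper's exact primal-dual AGMsDR method."""
    x = np.asarray(x0, dtype=float)
    v = x.copy()  # "dual" (momentum) sequence
    for _ in range(n_iters):
        # 1) small-dimensional relaxation: 1-D line search on the segment [x, v]
        beta = minimize_scalar(lambda b: f(x + b * (v - x)),
                               bounds=(0.0, 1.0), method='bounded').x
        y = x + beta * (v - x)
        g = grad(y)
        # 2) exact line search along the steepest-descent direction from y
        h = minimize_scalar(lambda t: f(y - t * g),
                            bounds=(0.0, 10.0), method='bounded').x
        x = y - h * g
        # 3) momentum/dual update; using a = h is a crude surrogate for the
        #    coefficient obtained from the quadratic equation in the paper
        a = h
        v = v - a * g
    return x


# Example: ill-conditioned quadratic f(x) = 0.5 * x^T diag(10, 1) x
D = np.array([10.0, 1.0])
f = lambda x: 0.5 * np.dot(D * x, x)
grad = lambda x: D * x
x_star = accelerated_gd_line_search(f, grad, np.array([1.0, 1.0]), n_iters=50)
```

By construction each iterate can only decrease the objective (step 1 allows `beta = 0` and step 2 is a descent line search), so on this quadratic the sketch converges at least as fast as steepest descent with exact line search.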



