Universal intermediate gradient method for convex problems with inexact oracle
Optimization Methods & Software (IF 1.4), Pub Date: 2020-01-17, DOI: 10.1080/10556788.2019.1711079
Dmitry Kamzolov, Pavel Dvurechensky, Alexander V. Gasnikov

In this paper, we propose new first-order methods for minimizing a convex function over a simple convex set. We assume that the objective is a composite function given as the sum of a simple convex function and a convex function with an inexact Hölder-continuous subgradient. We propose the Universal Intermediate Gradient Method, which enjoys both the universality and the intermediateness properties. Following the ideas of Nesterov (Math. Program. 152 (2015), pp. 381–404) on Universal Gradient Methods, our method requires no information about the Hölder parameter or constant and adjusts itself automatically to the local level of smoothness. On the other hand, in the spirit of the Intermediate Gradient Method proposed by Devolder et al. (CORE Discussion Paper 2013/17, 2013), our method is intermediate in the sense that it interpolates between the Universal Gradient Method and the Universal Fast Gradient Method. This allows one to balance the convergence rate of the method against the rate at which the oracle error accumulates. Under the additional assumption of strong convexity of the objective, we show how the restart technique can be used to obtain an algorithm with a faster rate of convergence.
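The universality property the abstract refers to comes from Nesterov's backtracking idea: instead of being given a Hölder exponent and constant, the method doubles a local smoothness estimate L until an inexact quadratic upper bound (with ε/2 slack) holds, then halves it for the next step. The following is a minimal sketch of that idea only (the primal universal gradient method, not the paper's intermediate variant); the function name and the test objective are illustrative choices, not taken from the paper.

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=200):
    """Sketch of a primal universal gradient method with backtracking.

    No Hölder parameters are supplied: the local smoothness estimate L
    is doubled until an inexact quadratic upper bound (slack eps/2)
    holds, then halved before the next iteration.
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        g, fx = grad(x), f(x)
        while True:
            x_new = x - g / L
            d = x_new - x
            # accept the step once the eps/2-inexact descent condition holds
            if f(x_new) <= fx + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0  # local smoothness was underestimated: tighten the model
        x, L = x_new, L / 2.0  # optimistic halving keeps L adaptive
    return x

# Test objective with a Hölder-continuous (but not Lipschitz) gradient:
# f(x) = (2/3) * ||x||^{3/2}, minimized at the origin; its gradient is
# 1/2-Hölder, so a fixed-step gradient method has no valid global L.
f = lambda x: (2.0 / 3.0) * np.linalg.norm(x) ** 1.5

def grad(x):
    r = np.linalg.norm(x)
    return x / np.sqrt(r) if r > 0 else np.zeros_like(x)

x_star = universal_gradient_method(f, grad, x0=np.array([2.0, -1.0]), eps=1e-6)
print(f(x_star))  # approaches 0 without knowing the Hölder constant
```

The ε/2 slack in the acceptance test is what lets the scheme handle an inexact oracle: each accepted step may violate the exact descent condition by at most ε/2, so the accumulated error stays controlled while L tracks the local smoothness.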




Updated: 2020-01-17