A Simple Nearly Optimal Restart Scheme For Speeding Up First-Order Methods
Foundations of Computational Mathematics (IF 2.5), Pub Date: 2021-03-26, DOI: 10.1007/s10208-021-09502-2
James Renegar, Benjamin Grimmer

We present a simple scheme for restarting first-order methods for convex optimization problems. Restarts are made based only on achieving specified decreases in objective values, the specified amounts being the same for all optimization problems. Unlike existing restart schemes, the scheme makes no attempt to learn parameter values characterizing the structure of an optimization problem, nor does it require any special information that would not be available in practice (unless the first-order method chosen to be employed in the scheme itself requires special information). As immediate corollaries to the main theorems, we show that when some well-known first-order methods are employed in the scheme, the resulting complexity bounds are nearly optimal for particular—yet quite general—classes of problems.
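To make the restart criterion concrete, here is a minimal illustrative sketch in Python. It is not the authors' exact scheme (which the paper develops and analyzes in full); it only shows the core idea the abstract describes: a generic first-order method has its internal state wiped each time the objective has dropped by a fixed, problem-independent amount. The wrapper restarted_fom, the step factory make_momentum_step, and all parameter values below are hypothetical names chosen for this sketch.

```python
import numpy as np

def restarted_fom(step, f, x0, decrease=1.0, max_iters=1000):
    """Sketch of a restart wrapper: rerun a first-order method from its
    current iterate, with fresh internal state, whenever the objective has
    dropped by the fixed amount `decrease` since the last restart."""
    x, state = x0, None
    anchor = f(x)                      # objective value at the last restart
    best_x, best_f = x, anchor
    for _ in range(max_iters):
        x, state = step(x, state)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
        if anchor - fx >= decrease:    # fixed decrease triggers a restart
            state, anchor = None, fx   # wipe momentum/averaging history
    return best_x, best_f

# Example: a heavy-ball style step on the quadratic f(x) = 0.5 * ||x||^2.
def make_momentum_step(grad, lr=0.1, beta=0.9):
    def step(x, state):
        v = np.zeros_like(x) if state is None else state
        v = beta * v - lr * grad(x)
        return x + v, v
    return step

f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x_best, f_best = restarted_fom(make_momentum_step(grad), f,
                               x0=10.0 * np.ones(5), decrease=5.0)
print(f_best)
```

Note that the restart test uses only observed objective values and a prescribed decrease amount, exactly the kind of information the abstract says is always available in practice; no problem parameters (e.g., strong-convexity or error-bound constants) are estimated. In the paper the prescribed amounts are the same for all optimization problems, whereas this sketch exposes a single `decrease` parameter for simplicity.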



Updated: 2021-03-27