Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
Computational Optimization and Applications (IF 1.6), Pub Date: 2021-04-08, DOI: 10.1007/s10589-021-00269-4
Majid Jahani , Naga Venkata C. Gudapati , Chenxin Ma , Rachael Tappenden , Martin Takáč

In this work we introduce the concept of an Underestimate Sequence (UES), which is motivated by Nesterov’s estimate sequence. Our definition of a UES utilizes three sequences, one of which is a lower bound (or under-estimator) of the objective function. The question of how to construct an appropriate sequence of lower bounds is addressed, and we present lower bounds for strongly convex smooth functions and for strongly convex composite functions, which adhere to the UES framework. Further, we propose several first order methods for minimizing strongly convex functions in both the smooth and composite cases. The algorithms, based on efficiently updating lower bounds on the objective functions, have natural stopping conditions that provide the user with a certificate of optimality. Convergence of all algorithms is guaranteed through the UES framework, and we show that all presented algorithms converge linearly, with the accelerated variants enjoying the optimal linear rate of convergence.
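To illustrate the kind of certificate the abstract describes, the sketch below is a minimal, hypothetical example and not the paper's UES construction or its accelerated variants: it runs plain gradient descent on an L-smooth, μ-strongly convex function and maintains a running lower bound on the optimal value using the standard strong-convexity minorant f(y) ≥ f(x) + ⟨∇f(x), y−x⟩ + (μ/2)‖y−x‖², whose minimum over y gives f(x) − ‖∇f(x)‖²/(2μ) ≤ f(x*). The function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def gradient_method_with_certificate(grad, f, x0, mu, L, eps=1e-8, max_iter=10_000):
    """Gradient descent with a gap-based stopping test.

    The strong-convexity minorant
        f(y) >= f(x) + <grad f(x), y - x> + (mu/2) * ||y - x||^2
    is minimized over y to obtain the global lower bound
        f(x) - ||grad f(x)||^2 / (2 * mu) <= f(x*),
    which serves as an optimality certificate once f(x) minus the best
    lower bound falls below eps.  (Illustrative sketch only; the paper's
    UES-based algorithms update their lower bounds differently.)
    """
    x = np.asarray(x0, dtype=float)
    phi = -np.inf                      # best lower bound on f(x*) so far
    for k in range(max_iter):
        g = grad(x)
        fx = f(x)
        phi = max(phi, fx - np.dot(g, g) / (2.0 * mu))   # update lower bound
        if fx - phi <= eps:            # certificate: optimality gap <= eps
            return x, fx, phi, k
        x = x - g / L                  # standard 1/L gradient step
    return x, f(x), phi, max_iter

# Usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# where mu and L are the extreme eigenvalues of A.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
mu, L = np.linalg.eigvalsh(A)[[0, -1]]
x_hat, f_hat, lower, iters = gradient_method_with_certificate(grad, f, np.zeros(2), mu, L)
print(f"gap = {f_hat - lower:.2e} after {iters} iterations")
```

The stopping rule mirrors the abstract's point that tracking a lower bound yields a natural, user-checkable termination condition; the accelerated methods in the paper obtain the same kind of guarantee with the optimal linear rate.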



Updated: 2021-04-08