An Average Curvature Accelerated Composite Gradient Method for Nonconvex Smooth Composite Optimization Problems
SIAM Journal on Optimization (IF 2.6). Pub Date: 2021-01-13, DOI: 10.1137/19m1294277
Jiaming Liang , Renato D. C. Monteiro

SIAM Journal on Optimization, Volume 31, Issue 1, Page 217-243, January 2021.
This paper presents an accelerated composite gradient (ACG) variant, referred to as the AC-ACG method, for solving nonconvex smooth composite minimization problems. As opposed to well-known ACG variants that are based on either a known Lipschitz gradient constant or a sequence of maximum observed curvatures, the current one is based on the average of all past observed curvatures. More specifically, AC-ACG uses a positive multiple of the average of all curvatures observed up to the previous iteration as an estimate of the "function curvature" at the current point, and then performs two resolvent evaluations to compute the next iterate. In contrast to other variable Lipschitz estimation variants, e.g., those based on the maximum curvature, AC-ACG always accepts the aforementioned iterate, regardless of how poor the Lipschitz estimate turns out to be. Finally, computational results are presented to illustrate the efficiency of AC-ACG on both randomly generated and real-world problem instances.
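The average-curvature idea in the abstract can be illustrated with a short sketch. The following is NOT the authors' AC-ACG method (which is an accelerated scheme with two resolvent evaluations per iteration); it is a simplified, non-accelerated composite gradient loop that demonstrates the two key ingredients described above: the step constant is a positive multiple `alpha` of the average of all curvatures observed so far, and each new iterate is accepted unconditionally. The `prox_l1` regularizer, the initial curvature guess, and all parameter names are illustrative assumptions.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def avg_curvature_cg_sketch(f, grad_f, x0, lam=0.1, alpha=1.0, iters=100):
    """Composite gradient sketch for min f(x) + lam*||x||_1 with an
    average-of-observed-curvatures step constant (illustrative only)."""
    x = x0.copy()
    curvatures = [1.0]  # initial curvature guess (assumption)
    for _ in range(iters):
        # Step constant: positive multiple of the average observed curvature.
        M = alpha * np.mean(curvatures)
        g = grad_f(x)
        # Composite gradient step: prox of the regularizer at a gradient step.
        x_new = prox_l1(x - g / M, lam / M)
        d = x_new - x
        dn2 = d @ d
        if dn2 > 1e-16:
            # Observed local curvature of f along the step just taken.
            C = 2.0 * (f(x_new) - f(x) - g @ d) / dn2
            curvatures.append(max(C, 1e-8))  # keep the estimate positive
        # Always accept the iterate, however poor the estimate was.
        x = x_new
    return x
```

For a quadratic `f(x) = 0.5*||x - c||^2`, the observed curvature along every step is exactly 1, so with `alpha = 1.0` the sketch recovers the soft-thresholded minimizer of `f + lam*||.||_1` in a few iterations.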


Updated: 2021-03-21