Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster
SIAM Journal on Scientific Computing (IF 3.1) Pub Date: 2021-09-16, DOI: 10.1137/20m1335480
Yifei Wang, Zeyu Jia, Zaiwen Wen

SIAM Journal on Scientific Computing, Volume 43, Issue 5, Page A3184-A3211, January 2021.
The so-called fast inertial relaxation engine is a first-order method for unconstrained smooth optimization problems. It updates the search direction by a linear combination of the past search direction, the current gradient, and the normalized gradient direction. We explore more general combination rules and call this generalized technique the search direction correction (SDC). SDC is also extended to composite and stochastic optimization problems. Starting from a second-order ODE, we propose a fast inertial search direction correction (FISC) algorithm as an example of a method with SDC. We prove an $\mathcal{O}(k^{-2})$ convergence rate of FISC for convex optimization problems. Numerical results on sparse optimization, logistic regression, and deep learning demonstrate that our proposed methods are competitive with other state-of-the-art first-order algorithms.
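
For illustration, the following is a minimal sketch of the kind of search-direction-correction update described above: the new direction is a linear combination of the past search direction, the current gradient, and the normalized gradient. The function name sdc_descent, the coefficients alpha, beta, gamma, and the step size eta are hypothetical placeholders chosen for this sketch; they are not the paper's FISC algorithm or its parameter choices.

```python
# Sketch of an SDC-style first-order update (illustrative only, not the paper's FISC).
import numpy as np

def sdc_descent(grad, x0, alpha=0.9, beta=0.1, gamma=0.1, eta=1e-2, iters=500):
    x = np.asarray(x0, dtype=float)
    d = np.zeros_like(x)                          # past search direction
    for _ in range(iters):
        g = grad(x)                               # current gradient
        g_norm = np.linalg.norm(g)
        if g_norm < 1e-12:                        # stop near a stationary point
            break
        g_hat = g / g_norm                        # normalized gradient direction
        d = alpha * d + beta * g + gamma * g_hat  # linear combination of the three terms
        x = x - eta * d                           # move along the corrected direction
    return x

# Example: minimize the quadratic f(x) = 0.5 * ||x||^2, whose gradient is x.
x_star = sdc_descent(lambda x: x, np.array([3.0, -2.0]))
```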


Updated: 2021-09-16