Two sufficient descent three-term conjugate gradient methods for unconstrained optimization problems with applications in compressive sensing
Journal of Applied Mathematics and Computing (IF 2.2). Pub Date: 2021-07-13. DOI: 10.1007/s12190-021-01589-8
Yufeng Liu, Benxin Zhang, Zhibin Zhu

In this paper, we present two new three-term conjugate gradient methods that generate sufficient descent directions for large-scale unconstrained optimization problems. Note that this property is independent of the line search used. We prove that both three-term conjugate gradient methods are globally convergent under the Wolfe line search. Numerical experiments and comparisons demonstrate that the proposed algorithms are efficient on the test functions. Moreover, we use the proposed methods to solve the \(\ell _1-\alpha \ell _2\) regularization problem of sparse signal decoding in compressed sensing, and the results show that our methods have certain advantages over existing solvers on such problems.
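
The abstract does not reproduce the paper's update formulas, so the sketch below only illustrates the general structure such a method shares: a three-term search direction combined with a Wolfe line search. As a stand-in for the authors' formulas (an assumption, not their method), it uses the classical Zhang–Zhou–Li three-term PRP direction, which has the same sufficient descent property \(g_k^\top d_k = -\Vert g_k\Vert ^2\) independently of the line search; the SciPy helpers `line_search`, `rosen`, and `rosen_der` are standard library routines, not part of the paper.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Generic three-term CG loop with a Wolfe line search.

    The direction follows the Zhang-Zhou-Li three-term PRP formula,
    used here only as an illustration: it guarantees the sufficient
    descent identity g_k^T d_k = -||g_k||^2 for every k, regardless
    of the step length returned by the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search enforces the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:           # fall back if the line search fails
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg     # PRP-type coefficient
        theta = (g_new @ d) / gg
        # Three-term direction: g_new @ d == -||g_new||^2 by construction.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x, k


if __name__ == "__main__":
    x_star, iters = three_term_cg(rosen, rosen_der, np.full(100, -1.0))
    print(f"iterations: {iters}, ||grad||: {np.linalg.norm(rosen_der(x_star)):.2e}")
```

Because the sufficient descent identity holds by construction, the Wolfe line search is always handed a descent direction, which is typically the key ingredient in global convergence arguments of this type.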


