A three-term conjugate gradient algorithm using subspace for large-scale unconstrained optimization
Communications in Mathematical Sciences (IF 1.2), Pub Date: 2020-01-01, DOI: 10.4310/cms.2020.v18.n5.a1
Yuting Chen, Yueting Yang

It is well known that conjugate gradient methods are well suited to large-scale nonlinear optimization problems because of their simple computation and low storage requirements. In this paper, we present a three-term conjugate gradient method using a subspace technique for large-scale unconstrained optimization, in which the search direction is determined by minimizing a quadratic approximation of the objective function over a subspace, treated in two cases. We show that the search direction satisfies both the descent condition and the Dai–Liao conjugacy condition. Under proper assumptions, a global convergence result for the proposed method is established. Numerical experiments show that the proposed method is efficient and robust.
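As background, the following is a minimal LaTeX sketch of the general framework that subspace-minimization three-term conjugate gradient methods follow. The notation is standard, but the particular subspace Omega_k, model matrix B_k, and parameters beta_k, gamma_k, c, t shown here are illustrative assumptions, not necessarily the exact choices of this paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Generic subspace-minimization three-term CG step (illustrative; the paper's
% exact subspace, model matrix, and parameter formulas may differ).
\begin{align*}
  % Step and gradient difference from the previous iteration:
  & s_{k-1} = x_k - x_{k-1}, \qquad y_{k-1} = g_k - g_{k-1},\\
  % Quadratic model of the objective at x_k, with B_k approximating the Hessian:
  & m_k(d) = g_k^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d,\\
  % The search direction minimizes the model over a low-dimensional subspace:
  & d_k = \operatorname*{arg\,min}_{d \in \Omega_k} m_k(d), \qquad
    \Omega_k = \operatorname{span}\{g_k,\, s_{k-1},\, y_{k-1}\},\\
  % which yields a three-term direction of the form
  & d_k = -g_k + \beta_k\, s_{k-1} + \gamma_k\, y_{k-1},\\
  % required to satisfy the descent and Dai--Liao conjugacy conditions:
  & g_k^{\top} d_k \le -c\,\lVert g_k\rVert^2 \ (c>0), \qquad
    d_k^{\top} y_{k-1} = -t\, g_k^{\top} s_{k-1} \ (t \ge 0).
\end{align*}
\end{document}
```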
