An Efficient Modified AZPRP Conjugate Gradient Method for Large-Scale Unconstrained Optimization Problem
Journal of Mathematics (IF 1.4), Pub Date: 2021-04-26, DOI: 10.1155/2021/6692024
Ahmad Alhawarat, Thoi Trung Nguyen, Ramadan Sabra, Zabidin Salleh

To find a solution of an unconstrained optimization problem, we normally use a conjugate gradient (CG) method, since it does not require the memory or storage of second derivatives, unlike Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. Recently, a modification of the Polak–Ribière method with a new restart condition was proposed, giving the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method, based on a modified restart condition, to solve large-scale unconstrained optimization problems. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe–Powell line search. The numerical results show that the new CG method is strongly competitive with the CG_Descent method. The comparisons are made on a set of more than 140 standard test functions from the CUTEst library, in terms of the number of iterations and the CPU time.
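For readers unfamiliar with nonlinear CG methods, the following is a minimal sketch of the general structure such methods share: a Wolfe-type line search along a direction that combines the new negative gradient with the previous direction through a parameter beta, with a restart to steepest descent when the line search fails. It uses the classical Polak–Ribière–Polyak formula truncated at zero (PRP+) as a stand-in; the AZPRP parameter and the restart condition studied in the paper are not reproduced here, and the function name and defaults below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a PRP+-type nonlinear conjugate gradient loop with a Wolfe line
# search. Illustrative only; it does NOT implement the paper's AZPRP update.
import numpy as np
from scipy.optimize import line_search

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f from x0 with a PRP+ conjugate gradient iteration (sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search (SciPy enforces the Wolfe conditions via c1, c2)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            # line search failed: restart from the steepest-descent direction
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP parameter, truncated at zero (PRP+), so the method
        # effectively restarts whenever beta would be negative.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: the Rosenbrock function
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(prp_plus_cg(f, grad, np.array([-1.2, 1.0])))
```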
