A Modified Descent Spectral Conjugate Gradient Method for Unconstrained Optimization
Iranian Journal of Science and Technology, Transactions A: Science (IF 1.4), Pub Date: 2020-10-31, DOI: 10.1007/s40995-020-01012-0
Saeed Nezhadhosein

Recently, Peyghami et al. (Optim Meth Soft 43:1–28, 2015) proposed a modified secant equation and applied it to obtain a new update of Yabe and Takano's rule as an adaptive version of the conjugate gradient parameter. Here, using this modified secant equation, we propose a new modified descent spectral nonlinear conjugate gradient method. The proposed method has two major features: higher-order accuracy in approximating the second-order curvature information of the objective function, and the sufficient descent condition. Global convergence of the proposed method is proved for uniformly convex functions and for general functions. Numerical experiments are carried out on a set of test functions from the CUTEr collection, and the results are compared with some well-known methods.
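Although the abstract does not reproduce the paper's update formulas, the general structure of a spectral conjugate gradient iteration may help orient readers. The following is a minimal Python sketch of a generic spectral CG loop with an Armijo backtracking line search; the function name `spectral_cg` and the Barzilai–Borwein-type spectral parameter and Hestenes–Stiefel-type beta used here are illustrative assumptions only, not the rules the paper derives from the modified secant equation.

```python
import numpy as np

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic spectral conjugate gradient loop (illustrative, not the paper's method).

    The search direction has the spectral form
        d_k = -theta_k * g_k + beta_k * d_{k-1},
    where theta_k and beta_k below are simple placeholder choices.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search (a common, simple choice).
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(50):  # cap the number of backtracking steps
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Placeholder parameters: Barzilai-Borwein-type spectral scaling and
        # a Hestenes-Stiefel-type beta; the paper's own theta_k and beta_k
        # come from its modified secant equation instead.
        theta = s.dot(s) / max(s.dot(y), 1e-12)
        beta = g_new.dot(y) / max(d.dot(y), 1e-12)
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Smoke test on a simple convex quadratic.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(spectral_cg(f, grad, np.ones(3)))
```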




Updated: 2020-11-02