Two-step conjugate gradient method for unconstrained optimization
Computational and Applied Mathematics (IF 2.5), Pub Date: 2020-08-11, DOI: 10.1007/s40314-020-01297-2
R. Dehghani, N. Bidabadi

Using Taylor series, we propose a modified secant relation that yields a more accurate approximation of the second-order curvature of the objective function. Combining this relation with the approach introduced by Dai and Liao, we present a conjugate gradient algorithm for unconstrained optimization problems. The proposed method uses both gradient and function values and exploits information from the two most recent steps, whereas the usual secant relation uses only the latest step. Under appropriate conditions, we show that the proposed method is globally convergent without any convexity assumption on the objective function. Comparative results based on the Dolan–Moré performance profiles demonstrate the computational efficiency of the proposed method.
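For orientation, the classical Dai–Liao scheme that the paper extends can be sketched as follows. This is a minimal illustration of the standard iteration (direction update d_{k+1} = -g_{k+1} + β_k d_k with β_k = g_{k+1}^T(y_k - t s_k) / (d_k^T y_k)), not the authors' two-step modified-secant variant; the test function, parameter t, and the safeguards are illustrative assumptions.

```python
# Sketch of a Dai-Liao-type conjugate gradient iteration on a toy
# quadratic. The paper's method would replace y_k below with a
# corrected vector built from the two most recent steps and from
# function values; here we use the plain gradient difference.

def f(x):
    # illustrative test objective: f(x) = (x0 - 1)^2 + 10*(x1 - 2)^2
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

def grad_f(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def dai_liao_cg(x, t=0.1, tol=1e-8, max_iter=500):
    g = grad_f(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        if dot(g, d) >= 0.0:                   # restart if not a descent direction
            d = [-gi for gi in g]
        # backtracking (Armijo) line search
        alpha, fx, gd = 1.0, f(x), dot(g, d)
        while (f([xi + alpha * di for xi, di in zip(x, d)])
               > fx + 1e-4 * alpha * gd and alpha > 1e-12):
            alpha *= 0.5
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad_f(x_new)
        s = [alpha * di for di in d]                       # s_k = x_{k+1} - x_k
        y = [gn - gi for gn, gi in zip(g_new, g)]          # y_k = g_{k+1} - g_k
        denom = dot(d, y)
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)
        if abs(denom) > 1e-12:
            beta = dot(g_new, [yi - t * si for yi, si in zip(y, s)]) / denom
        else:
            beta = 0.0
        beta = max(beta, 0.0)                  # simple nonnegativity safeguard
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

print(dai_liao_cg([0.0, 0.0]))  # should land near the minimizer (1, 2)
```

The restart and nonnegativity safeguards are common practical choices for Dai–Liao-type methods; the paper's analysis concerns global convergence of its own two-step variant under weaker (nonconvex) assumptions.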
